Nvidia GeForce RTX 3070 Review: The New $500 King

Time and again it's being proven that the 2080 Ti's asking price was absolutely abysmal value. Makes me more and more satisfied that I never purchased that daylight-robbery card.
 
I took this from another site that uses an i7-10700K, so we can compare at 2K: AC Odyssey ultra - i7: 78 fps, your 3950X: 66 fps; F1 2020 ultra - i7: 170 fps, 3950X: 151 fps; Horizon Zero Dawn ultimate - i7: 105 fps, 3950X: 96 fps; Wolfenstein: Youngblood "Mein Leben!" setting - i7: 236 fps, and again your 3950X: 199 fps. It's true the differences shrink at 4K, but this is supposed to be a 2K card. I hope the i9 benchmarks come next, just to be fair.
It's hard to compare results from different websites because of how they run their tests: different settings, different portions of the game, etc. The CPU shouldn't play a big role in this situation, even at 1440p (aka 2.5K), with the exception of maybe one or two games that show more than a 2-3% difference.

You should be looking at the general percentage differences, with the absolute values being there for system comparisons and other stuff.

For example, TechPowerUp, which uses a 9900K at 5 GHz with 4000 MHz RAM, found that the 3080 is 23% faster than the 3070 at 1440p.
Quick math on TechSpot's results: 141 fps + 23% = 173 fps (the exact same percentage, even though TechSpot used slower 3200 MHz RAM and a 3950X).
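If it helps, here's a trivial sketch of that cross-site scaling in Python; the fps figures are the ones quoted above, and the rest is just arithmetic:

```python
# Cross-site sanity check: scale TechSpot's RTX 3070 result by TechPowerUp's
# reported 3080-vs-3070 gap at 1440p and see where the projected 3080 lands.
techspot_3070_fps = 141        # TechSpot 1440p average (quoted above)
tpu_3080_advantage = 0.23      # TechPowerUp: 3080 is 23% faster at 1440p

projected_3080_fps = techspot_3070_fps * (1 + tpu_3080_advantage)
print(f"Projected 3080 average: {projected_3080_fps:.0f} fps")  # ~173 fps
```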

If TechSpot had used overclocked memory (like 3600 MHz CL14) and maybe better cooling to get higher boost clocks with PBO enabled, you would have seen better absolute values. But that's not really realistic and not indicative of what people are doing in the real world (even having a clean install of Windows with no background tasks isn't realistic).
 
Is it me or is Nvidia REALLY REALLY trying to steal Big Navi's thunder? Hope they know something we don't and that's why they're launching cards with practically zero stock.
It couldn't possibly be related to a worldwide pandemic causing both increased demand and supply-chain shortages. :neutral:
 
It couldn't possibly be related to a worldwide pandemic causing both increased demand and supply-chain shortages. :neutral:
The supply shortages are most likely a result of low yields from Samsung's 8nm process node. The Nvidia cards are huge and very complicated to make, and they're also using a custom version of the process node (8N NVIDIA) for better GPU performance (rumoured to be around 10% better than standard 8nm).
 
You're limited by the memory controller addressing and the sizes of memory chips currently being manufactured.

You can't really do 12 GB on a 256-bit bus card unless you start splitting up the controllers in a way that will probably just slow the GPU down anyway, defeating the point. I don't think 8 GB is that bad for this card; it's really aimed at 1440p anyway, I feel.
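To make the bus-width point concrete, here's a rough Python sketch; the 32-bit channel width and the 1 GB / 2 GB package densities are assumptions based on typical GDDR6 parts, not anything confirmed for this card:

```python
# Rough illustration: GDDR6 chips attach over 32-bit channels, so the bus width
# fixes how many packages a card can use without splitting controllers.
BUS_WIDTH_BITS = 256
CHANNEL_BITS = 32                 # per GDDR6 package (assumed typical)
CHIP_DENSITIES_GB = (1, 2)        # common 8 Gb and 16 Gb packages (assumed)

channels = BUS_WIDTH_BITS // CHANNEL_BITS              # 8 packages on a 256-bit bus
capacities = sorted({channels * d for d in CHIP_DENSITIES_GB})
print(f"{channels} channels -> natural capacities: {capacities} GB")   # [8, 16] GB
# 12 GB would need mixed densities or uneven channel loading, i.e. the
# controller-splitting compromise described above.
```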

The real concern here for me is what Nvidia do for the 3060. It seems inconceivable to me that it could be anything less than this same 256-bit bus and 8 GB of memory. They can't go to a 192-bit bus again, can they? Surely only on sub-$300 parts.

However, if it is less than that, then it's Nvidia market segmentation at work. 6 GB of memory would not really be good on a new card over $300. I don't think they would do it.

I was wondering if that was the case, because I never really researched why the controllers work best in powers of two versus any other arrangement. That makes sense if that's how the memory subsystem hardware is set up.
 
The Nvidia cards are huge and very complicated to make, and they're also using a custom version of the process node (8N NVIDIA) for better GPU performance (rumoured to be around 10% better than standard 8nm).
They're using a tweaked version of the 8LPP node, not an entirely custom one, so it's not a significant factor in the shortage of 3080/3090 cards. That comes down to a collection of other aspects: (1) the dies from the wafers only go into 3 products (3080, 3090, A6000), and none of them use either a full die or a very cut-down one; (2) there's only one supplier of GDDR6X; (3) demand is exceptionally high because the pricing looks so much better compared to Turing.

The aspect of die size is less of a problem than you'd think: the GA102 is 17% smaller than the TU102, and while that might not sound like very much, it results in around 4 to 5% better area yield. Given that there's never been any significant shortage of TU102 chips, the size alone can't be considered a big problem.

The GA104 is 392 square mm: roughly half the size of the TU102, 28% smaller than the TU104 (2070 Super), and 12% smaller than the TU106 (2070). So yields for this chip are going to be significantly better all round.
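For anyone curious how die area feeds into yield, here's a back-of-the-envelope Python sketch using a simple Poisson defect model; the defect density is purely an assumed illustrative figure, since Samsung doesn't publish real numbers for 8N:

```python
# Simple Poisson yield model: yield ~= exp(-die_area * defect_density).
from math import exp

DEFECT_DENSITY = 0.05                                        # defects per cm^2 (assumed)
die_sizes_mm2 = {"TU102": 754, "GA102": 628, "GA104": 392}   # published die sizes

for name, area_mm2 in die_sizes_mm2.items():
    yield_est = exp(-(area_mm2 / 100) * DEFECT_DENSITY)
    print(f"{name}: {area_mm2} mm^2 -> ~{yield_est:.0%} estimated yield")
# With this assumed defect density, GA102 comes out roughly 4-5 percentage points
# ahead of TU102, in line with the figure above, and GA104 does better still.
```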
 
Even then... tbh, if AMD released a slightly respun and updated Navi GPU to cover the sub-$100 market, that would be nice. Entry-level cards like the GT 1030 and RX 550 are imho severely lacking. That would be something like the 5500 XT, though.

Anything above that looks like it's being obsoleted by next gen.
The problem is that people looking for a sub-$100 card (for the most part) aren't looking to game with it, so they often just use an IGP. If they are looking to game with it, they're much better off buying used. There are so many great cards out there that you can get for under $100 (performance-wise) that no new card could possibly compete with them.

There's a channel on YouTube from jolly ol' England that shows how to game well at 1080p for next to nothing. It's called "RandomGaminginHD".
And then from down-under, there's Tech YES City which does essentially the same thing.

If you want to game with a sub-$100 card, it would be insane to buy a new discrete card.
 
AMD needs to force a $50 MSRP drop on this and it'll be a bargain. If Big Navi can do that, then competition has returned. Great-looking card; I'll be interested in seeing what kind of sustained clocks it can manage when pushed.

Bodes well for the inevitable 3060 too. It'll surely be at least as fast as the 2070 Super, and I would hope it won't be more than $350.
The problem with that idea is that the only way AMD could get Nvidia to drop prices is by releasing a card that's a much better buy than the Nvidia card, and that would make the RTX 3070 not a bargain. It happens both ways: when AMD released the RX 5500 XT, it was pointless compared to the GTX 1660, but when AMD released the RX 5600 XT, it made the GTX 1660 Ti pointless.

It's like a phone price war between Samsung and Motorola. Sure, some (ignorant) people have a strong brand preference and will buy a brand just because they've never had the other one. People who have been around the block a few times know to buy according to spec, not brand. The experience from both phones is almost identical.

That's really the best analogy when comparing CPUs and GPUs.
 
They're using a tweaked version of the 8LPP node, not an entirely custom one, so it's not a significant factor in the shortage of 3080/3090 cards. That comes down to a collection of other aspects: (1) the dies from the wafers only go into 3 products (3080, 3090, A6000), and none of them use either a full die or a very cut-down one; (2) there's only one supplier of GDDR6X; (3) demand is exceptionally high because the pricing looks so much better compared to Turing.

The aspect of die size is less of a problem than you'd think: the GA102 is 17% smaller than the TU102, and while that might not sound like very much, it results in around 4 to 5% better area yield. Given that there's never been any significant shortage of TU102 chips, the size alone can't be considered a big problem.

The GA104 is 392 square mm: roughly half the size of the TU102, 28% smaller than the TU104 (2070 Super), and 12% smaller than the TU106 (2070). So yields for this chip are going to be significantly better all round.
Samsung is fairly well known to have yield issues and has had problems attracting customers because of this. I'm not saying that the yields are bad, just that they're not where both Samsung and Nvidia were hoping they'd be right now.
 
Samsung is fairly well known to have yield issues and has had problems attracting customers because of this. I'm not saying that the yields are bad, just that they're not where both Samsung and Nvidia were hoping they'd be right now.
Which is a fair point to make. After all, this is probably the most massive monolithic processor that Samsung have had to produce, let alone on their relatively new 10/8LPP node. TSMC have over two decades of experience of working with AMD and Nvidia.
 
I was wondering if that was the case, because I never really researched why the controllers work best in powers of two versus any other arrangement. That makes sense if that's how the memory subsystem hardware is set up.
There have been cards, mostly from Nvidia, that have split the controller, i.e. some controllers have one chip and some have two on the same card. Cards like the GTX 550 Ti and GTX 660 did this, with their 1 GB and 2 GB configurations on 192-bit buses. The problem is that to do this, some of the chips will only operate at half speed, resulting in the last chunk of VRAM running at half speed.
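As a rough illustration of why that hurts, here's a small Python sketch in the spirit of the GTX 660 example; the data rate and the 192-bit/64-bit split are simplified assumptions (the exact fraction depends on how the chips are spread across the controllers), not measured figures:

```python
# Peak bandwidth for the interleaved region vs. the leftover segment on an
# asymmetric 192-bit / 2 GB layout (all numbers assumed for illustration).
EFFECTIVE_DATA_RATE_GBPS = 6.0    # GDDR5 effective data rate per pin, Gb/s (assumed)

def peak_bandwidth_gbs(bus_bits: int) -> float:
    """Peak bandwidth in GB/s for a given bus width at the assumed data rate."""
    return bus_bits * EFFECTIVE_DATA_RATE_GBPS / 8

full = peak_bandwidth_gbs(192)    # first 1.5 GB interleaved across all channels
tail = peak_bandwidth_gbs(64)     # last 0.5 GB hanging off a single 64-bit controller
print(f"Interleaved 1.5 GB: ~{full:.0f} GB/s, last 0.5 GB: ~{tail:.0f} GB/s")
```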
 
Nvidia switching foundries has never paid off, be it Samsung or IBM... which one was the other? I recall one more during the FX series, IIRC.
 
I drove to Microcenter at 8:30.

Got on line with 4 people in front of me - waiting for the store to open at 9:00.

The manager gave me a voucher.

The manager handed out more vouchers - plenty of supply.

I got my 3070 within 5 minutes after store opening.

Now: why couldn't the 3090 be that easy? Microcenter only had 3 of them in stock on launch day.

Best Buy had 0 cards on the 3080 and 3090 launch day, but they sold them through their website and had them delivered direct to the home.
 
Looks like the RTX 2080 Ti "comparatively" just got trounced by a seemingly lower-end but newer GPU. As a blue-collar poor boy, I was hoping to upgrade in early 2021 as cheaply and smartly as possible! I now wonder what I'll be able to buy a used 2080 Ti for once the "3070" cards hit the streets en masse. Was hoping for about $200 or less? As the Bangladesh boys at the computer show said: "We have come a long way, baby, and we only take cash." Thoughts?
 
Looks like the RTX 2080 Ti "comparatively" just got trounced by a seemingly lower-end but newer GPU. As a blue-collar poor boy, I was hoping to upgrade in early 2021 as cheaply and smartly as possible! I now wonder what I'll be able to buy a used 2080 Ti for once the "3070" cards hit the streets en masse. Was hoping for about $200 or less? As the Bangladesh boys at the computer show said: "We have come a long way, baby, and we only take cash." Thoughts?

For $200 or less, the 980 Ti is the best option, or for something newer, the 1660 Ti/Super, but I'm not sure they will be that low in a few months.
 
Yes, at a cool $500 suggested retail price, the "3070" (nicely holding its own against the older 2080 Ti) looks like a real good buy! Wishful thinking, however, has me still hoping to buy a used 2080 Ti at the computer show from one of the stands for around $250 cash (no sales tax) sometime by March/April 2021. The Bangladesh boys, however, noted yesterday (at their strip-center PC shop) that if I were willing to fork over $250 cash for a used 2080 Ti, I'd be better off buying a brand-new 3070 and being future-proofed out of the box! They probably know something I don't about used GPUs coming out of the woodwork, especially those surfacing from the basements of cryptocurrency dwellers. Still mind-blowing is what the 2080 Ti once demanded versus what it can be had for now, only a few months later. Thoughts?
 