Nvidia GeForce RTX 3070 Review: The New $500 King

neeyik

Posts: 1,449   +1,604
Staff member
I took this from another site that uses an i7-10700K, so we can compare at 2K: AC Odyssey ultra - i7: 78 fps, your 3950X: 66 fps; F1 2020 ultra - i7: 170 fps, 3950X: 151 fps; Horizon Zero Dawn ultimate - i7: 105 fps, 3950X: 96 fps; Wolfenstein Youngblood "Mein Leben!" setting - i7: 236 fps, and again your 3950X: 199 fps. It's true the differences shrink at 4K, but this is supposed to be a 2K card. I hope the i9 benches come next, just to be fair.
Did this other site use exactly the same tests in those games that Steve did? The exact same locations, the exact same environments, action taking place, and so on. Without that information, any comparison between test setups and subsequent results is a fairly empty exercise.

Yes really, which is why I imagine we will see OEMs with more memory and higher boost clocks. You don't buy now just for today's games but for future games that push it harder. Gaming at 4K, everything helps, and I'm not talking about slower GPUs.
16 GB versions are definitely possible - Samsung and Micron both offer 14 Gbps, 2 GB GDDR6 modules (SK Hynix doesn't), and it wouldn't require any changes to the PCB layout (just the VRMs). They're not in very high demand, though, so there may not be sufficient volume of them yet, to warrant AIB vendors looking to offer a 16 GB model just yet.
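The 16 GB feasibility point follows from simple module arithmetic. A minimal sketch of my own (the function name and numbers are illustrative assumptions; the 32-bit GDDR6 module interface is standard):

```python
# GDDR6 modules each connect over a 32-bit interface, so the bus width
# fixes how many modules a card carries; total VRAM is then module count
# times per-module density (1 GB and 2 GB parts are what is manufactured).

def vram_options(bus_width_bits, module_sizes_gb=(1, 2)):
    """Possible total VRAM sizes (GB) when all modules share one density."""
    modules = bus_width_bits // 32
    return [modules * size for size in module_sizes_gb]

print(vram_options(256))  # [8, 16]: the 3070's 8 GB, or 16 GB with 2 GB modules
```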
 

Shadowboxer

Posts: 974   +576
I bought a 2080 in April for the same money this 3070 costs. Normally I'd be annoyed, but for 6 of the last 7 months I've been housebound and needed that GPU. Still, I definitely have a mild amount of remorse!

People keep mentioning the AMD competition. I do hope it’s going to be good but we would need to see an enormous jump from them to compete here, so much so that I don’t see it happening. Before the 30xx series launched the 2070 super pretty much outperformed anything AMD could sell without its killer features like DLSS and ray tracing.

And that’s the thing isn’t it, DLSS in particular seems to be getting quite a lot of adoption (I certainly use it quite often) and AMD have no answer for that so far. Even if RX6xxx can outperform a 3070 in normal rendering it might lose out when you turn DLSS on.

 
  • Like
Reactions: Reehahs

Evernessince

Posts: 5,461   +6,132
"Oh yea, did we mention this card will run you $500 instead of $1,200?"

Yes, if you ignore that the $1,200 card was horrid value. It's not really a plus to be compared to the worst. Recent reviews from TechSpot have not been holding companies to account like I'd expect. I'm not just talking about Nvidia here either; I expect the price increases AMD has made recently to factor into any review.

Overall a great card but this is an actual paper launch. You are naive if you think they will have sufficient volume two days from now.
 
Last edited:

Stoly

Posts: 51   +23
Did this other site use exactly the same tests in those games that Steve did? The exact same locations, the exact same environments, action taking place, and so on. Without that information, any comparison between test setups and subsequent results is a fairly empty exercise.


16 GB versions are definitely possible - Samsung and Micron both offer 14 Gbps, 2 GB GDDR6 modules (SK Hynix doesn't), and it wouldn't require any changes to the PCB layout (just the VRMs). They're not in very high demand, though, so there may not be sufficient volume of them yet, to warrant AIB vendors looking to offer a 16 GB model just yet.
While it's technically feasible and even cheap to release a 16 GB card (Samsung has 16 Gbps/16 Gb chips for both more memory and higher bandwidth), it wouldn't look good to have a 16 GB RTX 3070 while being stuck with a 10 GB RTX 3080.
 
  • Like
Reactions: Avro Arrow

Avro Arrow

Posts: 354   +375
This is a great value from nVidia (not something that I've been able to say for a loooong time) and it will sell like crazy, but there are a few caveats:

  1. As long as there are actual cards to sell
  2. As long as the sneaker-bot army is successfully repelled
  3. As long as RDNA2 doesn't cause the RTX 3070 to be stillborn

As long as none of those are issues, nVidia has a real winner on its hands.
 
Last edited:

Avro Arrow

Posts: 354   +375
Boy it would really be a winner with a little more vram for 4K.
You'll never see that happen. I'm sure that nVidia knows all too well that if they give it enough VRAM for 4K gaming, it will cannibalise RTX 3080 sales. This is how nVidia forces market segmentation with its higher-end offerings when pretty much all of the cards entering the market are insanely overpowered. The RTX 2080 Ti is a pretty damn good 4K gaming card, so nVidia has to truncate the RTX 3070's range of abilities to make sure that its 4K potential is limited and people only use it as a 1440p card.

It's a bit weird though, because the RTX 3070 is pretty OP as a 1440p card, given that cards like the RTX 2070 Super and RX 5700 XT are already exceptionally good at 1440p. I guess if you wanted to play something like CS:GO at 4K, the RTX 3070 would be a good choice for uber-high frame rates.
 
  • Like
Reactions: Charles Olson

neeyik

Posts: 1,449   +1,604
Staff member
While it's technically feasible and even cheap to release a 16 GB card (Samsung has 16 Gbps/16 Gb chips for both more memory and higher bandwidth), it wouldn't look good to have a 16 GB RTX 3070 while being stuck with a 10 GB RTX 3080.
Indeed it wouldn't, although it does offer scope for Nvidia to do a 'Super' update in six months, especially if Micron has 16 Gb GDDR6X by then, as that would allow for a 3070/3080/3090 Super line-up, all with doubled RAM.
 

Edster

Posts: 69   +51
As much as this looks like great value, between possible supply issues and the fact that RDNA2/Big Navi is coming out, there is very little reason to try to get this card at launch.

In some ways, I think this is made to be the 1440p king that can do 4K, whereas if you want a pure 4K experience, you are encouraged to step up to the 3080. Or, from another perspective, it's a versatile 4K card with low power draw (meaning I can keep my old PSU).

But this is why you should wait for AMD: they have a good reason to release best-value cards to eat into the market, and they might well release a card at this speed with more VRAM at the same price.
 
  • Like
Reactions: Charles Olson

Avro Arrow

Posts: 354   +375
I must say, while the RTX 2000 series had its naysayers and controversies, they are still excellent performers 2 years later.
The performance wasn't the issue, price was. I could say that the R9 Fury from 2015 is an excellent performer over 5 years later because it can still run everything maxed out at 1080p like it could when it was new. It's not like cards stop being good as soon as the next generation comes out. However, as we can see here, it IS like cards can be shown to be horrible values as soon as the next generation comes out.
 
  • Like
Reactions: Charles Olson

Rayneofpayne

Posts: 245   +232
AMD need to force a $50 MSRP drop of this and it'll be a bargain. If Big Navi can do that then competition has returned. Great looking card, will be interested in seeing what kind of sustained clocks it can manage when pushed.

Bodes well for the inevitable 3060 too. It'll surely be at least as fast as the 2070 Super, and I would hope it won't be more than $350.
$400, and it's as fast as a 2080 (not the Super), except in RT performance.
 

KaitouX

Posts: 7   +6
I must say, while the RTX 2000 series had its naysayers and controversies, they are still excellent performers 2 years later.
The issue with them was the price; most RTX 20x0 GPUs should have cost at least $100 less, and the 2080 Ti should have been at least $400 less (from the street price of $1,200).
 
  • Like
Reactions: Charles Olson

fps4ever

Posts: 555   +635
You'll never see that happen. I'm sure that nVidia knows all too well that if they give it enough VRAM for 4K gaming, it will cannibalise RTX 3080 sales. This is how nVidia forces market segmentation with its higher-end offerings when pretty much all of the cards entering the market are insanely overpowered. The RTX 2080 Ti is a pretty damn good 4K gaming card, so nVidia has to truncate the RTX 3070's range of abilities to make sure that its 4K potential is limited and people only use it as a 1440p card.

It's a bit weird though, because the RTX 3070 is pretty OP as a 1440p card, given that cards like the RTX 2070 Super and RX 5700 XT are already exceptionally good at 1440p. I guess if you wanted to play something like CS:GO at 4K, the RTX 3070 would be a good choice for uber-high frame rates.
I'd take that bet with OEMs involved. The 2080 Ti has 11 GB, and with a 256-bit memory bus, adding 4 GB to go to 12 GB for the 3070 is not unreasonable. The fact that Nvidia hampered the 3080 with only 10 GB (albeit faster memory on a 320-bit bus) compared to the 3070 suggests it was an odd release, especially with the cancellation rumors going around. There are also other differences than just the amount of memory between the two cards.
 
  • Like
Reactions: Lounds

brucek

Posts: 574   +683
TechSpot Elite
I've previously wondered aloud why companies pull juvenile stunts like scheduling releases one day before another company's.

Then I see a review like this one, with a slew of graphs containing only AMD's last-generation offerings, prematurely awarding the "$499 King" title the day before the competitor is set to enter the ring.

Now, I don't imagine many regular readers will be fooled, and I'm sure there'll be AMD reviews in due time. That said, is there really an audience of people who will view this review in isolation, either today or in the coming weeks, and act on it alone? Is that what Nvidia was going for, and does the presence of an article like this one, which will remain as-is, mean their strategy is actually at least partially effective?
 
  • Like
Reactions: Colonel Blimp

Vulcanproject

Posts: 1,271   +2,116
I'd take that bet with OEMs involved. The 2080 Ti has 11 GB, and with a 256-bit memory bus, adding 4 GB to go to 12 GB for the 3070 is not unreasonable. The fact that Nvidia hampered the 3080 with only 10 GB (albeit faster memory on a 320-bit bus) compared to the 3070 suggests it was an odd release, especially with the cancellation rumors going around. There are also other differences than just the amount of memory between the two cards.
You're limited by the memory controller width and the densities of memory currently being manufactured.

You can't really do 12 GB on a 256-bit bus card unless you start splitting up the controllers in a way that will probably just slow the GPU down anyway, defeating the point. I don't think 8 GB is that bad for this card; it's really aimed at 1440p anyway, I feel.

The real concern here for me is what Nvidia do for the 3060. It seems inconceivable to me that it could be anything less than this same 256-bit bus and 8 GB of memory. They can't go to a 192-bit bus again, can they? Surely only on sub-$300 parts.

However, if it is less than that, then it is Nvidia market segmentation. 6 GB of memory would not really be good on a new card over $300. I don't think they would do it.
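The 12 GB point above comes down to module density. A sketch of my own (assuming uniform density per module, which is what a straightforward memory controller layout wants):

```python
# On a 256-bit bus there are 8 GDDR6 modules (32 bits each). A uniform
# 12 GB build would need 1.5 GB modules, a density that isn't manufactured,
# so 12 GB would force mixed densities or split controllers.
modules = 256 // 32
for total_gb in (8, 12, 16):
    per_module = total_gb / modules
    print(total_gb, per_module, per_module in (1.0, 2.0))
# 8  -> 1.0 GB per module, True  (buildable)
# 12 -> 1.5 GB per module, False (no such GDDR6 density)
# 16 -> 2.0 GB per module, True  (buildable)
```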
 

KaitouX

Posts: 7   +6
The real concern here for me is what Nvidia do for the 3060. It seems inconceivable to me that it could be anything less than this same 256-bit bus and 8 GB of memory. They can't go to a 192-bit bus again, can they? Surely only on sub-$300 parts.

However, if it is less than that, then it is Nvidia market segmentation. 6 GB of memory would not really be good on a new card over $300. I don't think they would do it.
Most rumors/leaks about the GA106 imply a 192-bit bus. The 3060 Ti (or whatever name Nvidia ends up using) would use the GA104, making it a cut-down 3070 with likely the same memory configuration, but the 3060/3050 Ti are probably going to use the GA106, meaning most likely 6 GB on a 192-bit bus. They did it with the 2060; I wouldn't doubt they'd do it again.
That's why I personally hope that at the very least the card using the GA106 will be under $300.
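For what a 192-bit GA106 card would mean in practice, both capacity and bandwidth scale with the bus. Another sketch of my own (14 Gbps per pin and 1 GB modules are assumptions matching the 2060-style configuration):

```python
def memory_config(bus_width_bits, gbps_per_pin=14, module_gb=1):
    """(capacity GB, bandwidth GB/s) for a uniform GDDR6 setup."""
    modules = bus_width_bits // 32                  # 32-bit GDDR6 modules
    bandwidth = bus_width_bits * gbps_per_pin / 8   # total bits/s to bytes/s
    return modules * module_gb, bandwidth

print(memory_config(192))  # (6, 336.0): 6 GB, 336 GB/s, the feared 3060 config
print(memory_config(256))  # (8, 448.0): 8 GB, 448 GB/s, the 3070's config
```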
 

nnguy2

Posts: 160   +296
Is it me, or is Nvidia REALLY REALLY trying to steal Big Navi's thunder? I hope they know something we don't, and that's why they're launching cards with practically zero stock.