Nvidia GeForce RTX 4070 Super Review: Are We Finally Getting Good Value?

Sorry, we can't agree here. A standard 4070 does well at 4k, so the "Super" variant will also do well.

And when I say "does well", I mean 4k with reasonable settings, customized and tailored for personal taste, NOT defaulted to "ULTRA MEGA MAXIMUM!!" settings.
At these price points I don't care what the card's name is. The 3070 had a 256-bit bus and a larger die size. I'm also not saying you have to max out the game, but if you want high resolution textures at 4K then a 192-bit bus is going to be a bottleneck.

That's really one of the reasons people are so angry about this generation of cards: not as many people care about high-end gimmicks as Nvidia's marketing material would have us believe. There is now a large group of PC gamers who want an immersive experience playing games on their 4K TV, and something like a 192-bit bus becomes an issue there. It's one thing if you're paying $300 for a 7600 XT 16GB, but not $600 for a 12GB card. There are more households with 4K TVs than there are with high refresh rate monitors. While I have both, I find myself lying in bed playing single-player games on my 65" more than I do sitting at my desk using my 27" 1440p.

The 4070S (yes, I'm calling it the 4070S now), as I see it, is mostly just a 1080p card meant for competitive FPS games. The 4070 Ti Super will be a powerhouse.

That 192-bit bus would be fine, but once you start messing with shaders you see a real impact in frame stuttering and 1% lows, which is entirely unacceptable on a card marketed as a 70-class card. Which, we all know, it's not: they shrank the die size and the memory bus from the 3070. The 4070S is truer to a 70-class card, but I still can't help being angry about it.
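For reference on the bus-width comparison, peak bandwidth depends on both width and memory speed. Here is a minimal sketch from the published memory specs of the two cards being compared (raw bandwidth only; it says nothing about how textures actually stream once VRAM fills at 4K):

```python
# Peak bandwidth = bus width (bits) / 8 * effective memory data rate (Gbps).
# Published memory specs: RTX 3070 = 256-bit GDDR6 @ 14 Gbps,
# RTX 4070 = 192-bit GDDR6X @ 21 Gbps.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(256, 14))  # RTX 3070 -> 448.0 GB/s
print(bandwidth_gbs(192, 21))  # RTX 4070 -> 504.0 GB/s
```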
 
Sorry, we can't agree here. A standard 4070 does well at 4k, so the "Super" variant will also do well.

And when I say "does well", I mean 4k with reasonable settings, customized and tailored for personal taste, NOT defaulted to "ULTRA MEGA MAXIMUM!!" settings.

No one spends $600 to play games at medium settings. They max out the settings for their resolution, and even at 1440p the 4070 Super is mid at best.
 
Your RT game selection is very questionable. Resident Evil? Cyberpunk at medium settings? Spiderman at High (6 distance)?
 
I can appreciate the desire to see your favorite game appear in the list of tested titles. If it's of any help, this review makes it clear the 4070 Super performs right in between the 4070 and 4070 Ti. If you find any benchmarks that contain both of those older cards, you can then guesstimate the average frames for this card.
Also, benchmarking online games makes it more difficult to reproduce exactly the same conditions on each run, and when you're benchmarking 10 or more GPUs, that makes it tricky to use the data reliably to compare them.
 

Nvidia's GeForce RTX 4070 Super is the first refreshed 'Super' GPU to be released. Based on specs, it should be ~20% faster than the 4070, at the same price, which sounds like a solid deal.

Read the full article here.
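As a quick back-of-the-envelope check on that "~20% faster" spec claim, the sketch below uses the published CUDA core counts and assumes performance scales with shader count at similar boost clocks, which real games rarely follow exactly:

```python
# Rough sanity check of the "~20% faster on paper" claim, assuming performance
# scales with CUDA core count at similar clocks (memory bandwidth, cache and
# power limits mean actual game results will land lower in many titles).
CORES_RTX_4070 = 5888        # published spec
CORES_RTX_4070_SUPER = 7168  # published spec

uplift = CORES_RTX_4070_SUPER / CORES_RTX_4070 - 1
print(f"Theoretical shader uplift: {uplift:.1%}")  # -> 21.7%
```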

Good value? Nope!!! Not here in Australia, where the RTX 4070 is still going for around $1000, give or take, and the 4070 Ti is around $1300. Who knows what stupid price these Supers are going to go for.
I bought myself a second-hand 3070 just over 12 months ago for $500. It was only about 12 months old itself, and at the time the model I got was still going for around $900. GPU prices have pushed so many people out of the game due to the cost of buying new. Yes, they've come down since the crazy mining days, but still not enough.
It doesn't make sense to me when in the US they're half the Aussie pricing. The Aussie dollar isn't THAT bad. I very much doubt I'll ever be in the financial position to buy a new GPU again. The pricing has pushed me out of the new GPU market.
I could go buy the latest Xbox for the price of a 4060 Ti, and that'll game just as well at 1080p.
Rant over.
 
Poll? I've been following this site for years and have never seen such a poll. But allow me to answer: RT is a joke and would never be a criterion for choosing any GPU. There isn't a single game worth enabling it in that will run well on anything below a 4090.

Real performance and price are the only points that matter to me.
I don't get it either. RT completely tanks your FPS for a few new shiny bits. Some day it will work better but that day isn't today. I think basing GPU reviews entirely around running games with RT "on" is a mistake.
 
At these price points I don't care what the card's name is. The 3070 had a 256-bit bus and a larger die size. I'm also not saying you have to max out the game, but if you want high resolution textures at 4K then a 192-bit bus is going to be a bottleneck.

That's really one of the reasons people are so angry about this generation of cards: not as many people care about high-end gimmicks as Nvidia's marketing material would have us believe. There is now a large group of PC gamers who want an immersive experience playing games on their 4K TV, and something like a 192-bit bus becomes an issue there. It's one thing if you're paying $300 for a 7600 XT 16GB, but not $600 for a 12GB card. There are more households with 4K TVs than there are with high refresh rate monitors. While I have both, I find myself lying in bed playing single-player games on my 65" more than I do sitting at my desk using my 27" 1440p.

The 4070S (yes, I'm calling it the 4070S now), as I see it, is mostly just a 1080p card meant for competitive FPS games. The 4070 Ti Super will be a powerhouse.

That 192-bit bus would be fine, but once you start messing with shaders you see a real impact in frame stuttering and 1% lows, which is entirely unacceptable on a card marketed as a 70-class card. Which, we all know, it's not: they shrank the die size and the memory bus from the 3070. The 4070S is truer to a 70-class card, but I still can't help being angry about it.
This is the issue I have (the performance dropping off at 4K due to the memory bus). Of course it will drop off anyway, because the requirements for 4K @ 120Hz are very high. With new TVs, you would hope you could turn a game's settings down and get high fps without having to buy an £800 card in 2024. It's not just inflation; it's the cost of the new tech going into the tensor cores and the smaller die size to improve energy efficiency. That's fine - in two years' time you would "expect" the 5060 to deliver similar performance to a 4070 Super and to cost less than £600. This generation seems to be "if you want good performance without compromises, buy a 4090". Everything else has caveats.

Advertising cards as being for a specific resolution is bad too; I haven't seen that before. Now they can cap performance with narrower bus widths and the like by putting "designed for 1440p" on the box. You could always throw your card at new kit and maybe pair it with a better CPU, but now they're forcing you to buy a £1000 card (4080 Super) if you want to enjoy 4K, or at least pointing at the "1440p" label if anyone complains.
 
 
Another great review, Steve Walton!

I just wish you had one 3080 12GB model in your arsenal - either the RTX 3080 Ti or the rare RTX 3080 12GB model - for comparisons. I often wonder how much difference that extra 2GB of VRAM and moderate bump in cores would make over the stock 3080 vs 4070/4080 models, especially at higher resolutions?

Full disclosure: I own an MSI 3080 12GB OC model ;)
 
Another great review, Steve Walton!

I just wish you had one 3080 12GB model in your arsenal - either the RTX 3080 Ti or the rare RTX 3080 12GB model - for comparisons. I often wonder how much difference that extra 2GB of VRAM would make vs 4070/4080 models, especially at higher resolutions?

Full disclosure: I own an MSI 3080 12GB OC model ;)
I also have an MSI 12GB 3080. If you look at the deep dive on Hogwarts Legacy that Steve did last year, you’ll see the 4070 Ti vs all flavors of the 3080.
(https://www.techspot.com/review/2627-hogwarts-legacy-benchmark/)

It's only one game, but at 1440p ultra the 12GB 3080/3080 Ti significantly narrow the gap with the 4070 Ti. My theory is that the wider bus keeps the 3080 from getting starved as badly as the 4070 Ti when both start to hit the VRAM limit. At 4K ultra the 4070 Ti regains a larger lead, probably because all the cards are choking with "only" 12GB of VRAM and the wider bus is no longer enough to help the 3080 keep up.

If you extrapolate a bit, I think the 4070 Super is pretty comparable to the 12GB versions of the 3080 (probably a bit faster in some titles, but not by much).

I personally still think we’ll see cases where the wider bus makes the 3080 12GB match or beat the newer cards when close to the VRAM limit. We just haven’t quite gotten there yet, other than perhaps the 1440p ultra results in Hogwarts legacy.
 
I also have an MSI 12GB 3080. If you look at the deep dive on Hogwarts Legacy that Steve did last year, you’ll see the 4070 Ti vs all flavors of the 3080.
(https://www.techspot.com/review/2627-hogwarts-legacy-benchmark/)

It's only one game, but at 1440p ultra the 12GB 3080/3080 Ti significantly narrow the gap with the 4070 Ti. My theory is that the wider bus keeps the 3080 from getting starved as badly as the 4070 Ti when both start to hit the VRAM limit. At 4K ultra the 4070 Ti regains a larger lead, probably because all the cards are choking with "only" 12GB of VRAM and the wider bus is no longer enough to help the 3080 keep up.

If you extrapolate a bit, I think the 4070 Super is pretty comparable to the 12GB versions of the 3080 (probably a bit faster in some titles, but not by much).

I personally still think we’ll see cases where the wider bus makes the 3080 12GB match or beat the newer cards when close to the VRAM limit. We just haven’t quite gotten there yet, other than perhaps the 1440p ultra results in Hogwarts legacy.
"I think the 4070 Super is pretty comparable to the 12GB versions of the 3080 (probably a bit faster in some titles, but not by much)."

Agreed. The 3080 12GB version is still a worthy contender against the 4xxx series at 1440p.
I also squeezed a nice bit of extra performance out of my already factory-OC'd card - nearly 7%. Every bit helps, and my temps are still great!

EDIT: Funny how halfway through the Hogwarts Legacy review you linked, the 3080 12GB non-Ti disappeared from the benchmarks! The biggest improvement I saw between the 3080 10GB and 12GB models was at 4K RT Ultra: while both were lame, the 12GB model delivered a barely playable 25fps while the 10GB sat at 14fps.

I wonder if this occurred right after TS bad-mouthed the 12GB version as a cash-grab and Nvidia asked for their sample back!?
 
At these price points I don't care what the card's name is. The 3070 had a 256-bit bus and a larger die size. I'm also not saying you have to max out the game, but if you want high resolution textures at 4K then a 192-bit bus is going to be a bottleneck.

That's really one of the reasons people are so angry about this generation of cards: not as many people care about high-end gimmicks as Nvidia's marketing material would have us believe. There is now a large group of PC gamers who want an immersive experience playing games on their 4K TV, and something like a 192-bit bus becomes an issue there. It's one thing if you're paying $300 for a 7600 XT 16GB, but not $600 for a 12GB card. There are more households with 4K TVs than there are with high refresh rate monitors. While I have both, I find myself lying in bed playing single-player games on my 65" more than I do sitting at my desk using my 27" 1440p.

The 4070S (yes, I'm calling it the 4070S now), as I see it, is mostly just a 1080p card meant for competitive FPS games. The 4070 Ti Super will be a powerhouse.

That 192-bit bus would be fine, but once you start messing with shaders you see a real impact in frame stuttering and 1% lows, which is entirely unacceptable on a card marketed as a 70-class card. Which, we all know, it's not: they shrank the die size and the memory bus from the 3070. The 4070S is truer to a 70-class card, but I still can't help being angry about it.
Context is important. When I say that any version of the RTX series can run games at 4K at playable frame rates, I mean after customizing and tweaking. I have made an RTX 2060 run many games at 4K very well. Some (but not all) settings had to be turned down or off.

So to say any of the current cards are not worth the money is a bit of a stretch. $600 for a card that can run 4K with most of the pretty turned up or maxed out? I say yes please!
 
Context is important. When I say that any version of the RTX series can run games at 4K at playable frame rates, I mean after customizing and tweaking. I have made an RTX 2060 run many games at 4K very well. Some (but not all) settings had to be turned down or off.

So to say any of the current cards are not worth the money is a bit of a stretch. $600 for a card that can run 4K with most of the pretty turned up or maxed out? I say yes please!
Well if that's the case then you should be looking at AMD cards.
 
However, my point in stating this was that the 4070 should be cheaper to produce, between the die shrink and the cut-down bus.
But that isn't logical. Different models are mostly binned dies from the same wafer set. So the costs are mostly identical.
 
Well if that's the case then you should be looking at AMD cards.
No thanks. While AMD has been improving the raytracing performance, they're still lagging behind. Also, I'm not a fan of how little in the way of controls there are in the Radeon control panel.
 
No thanks. While AMD has been improving the raytracing performance, they're still lagging behind. Also, I'm not a fan of how little in the way of controls there are in the Radeon control panel.
I don't know if you're using the Radeon control panel from the ATi days, but it's no better or worse than Nvidia's. But what happened to not caring about all the settings? Neither the 4070 nor the 7800 XT is a proper RT card. Anyone who says they're buying a 4070 (or 4070S, for that matter) for RT performance is fooling themselves.
 
I don't know if you're using the Radeon control panel from the ATi days, but it's no better or worse than Nvidia's. But what happened to not caring about all the settings? Neither the 4070 nor the 7800 XT is a proper RT card. Anyone who says they're buying a 4070 (or 4070S, for that matter) for RT performance is fooling themselves.
Nope, talking about recent drivers. The current control panel is lacking in detailed feature controls.
 
But that isn't logical. Different models are mostly binned dies from the same wafer set. So the costs are mostly identical.
My comparison was between the last gen 3080 and the 4070. These are two completely different architectures and dies. They are not simply binned parts. Plus, they have different bus widths, which will affect PCB routing and the memory topology. For example:

4070 specs: https://www.techpowerup.com/gpu-specs/geforce-rtx-4070.c3924

3080 specs:

3080 12GB specs:

The 4070 has a 294 mm² die, while the 3080 and the 3080 12GB both use a 628 mm² die. That means Nvidia can fit a lot more 4070 dies on a single wafer, thus reducing costs.
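To put a rough number on that point, here is a minimal sketch using the standard dies-per-wafer approximation for a 300 mm wafer; it ignores scribe lines and yield, so the outputs are ballpark candidate counts only:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic die-per-wafer estimate: usable wafer area minus an edge-loss term.
    Ignores scribe lines and defect yield, so treat it as a rough upper bound."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

print(dies_per_wafer(294))  # AD104 (4070-class die)  -> ~201 candidates
print(dies_per_wafer(628))  # GA102 (3080-class die)  -> ~85 candidates
```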
 
My comparison was between the last gen 3080 and the 4070.
Ok, that's fair, but it doesn't really pan out any differently.
These are two completely different architectures and dies.
TSMC's wafer fabs are expensive to operate regardless of the process/die you build upon.
They are not simply binned parts.
Sure they are. Regardless of whether you are talking about two separate wafer lines or two products from the same wafer line, they're all binned.
Plus, they have different bus widths, which will affect PCB routing and the memory topology.
I'm not arguing that point because it's factual. I'm saying it doesn't matter. The bus bit width is less important than total data bandwidth. If the 4070's 192-bit bus can perform as well as or better than the 3080's 320-bit bus, then the bus bit width is irrelevant.

Now if the two of you are saying that the 4070 is gimped by the lesser memory bus, then yes I'll agree that is true.
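On the bandwidth point, the raw numbers are easy to check from the published memory specs; the sketch below shows peak figures only and ignores Ada's much larger L2 cache, which is how the 4070 is meant to make up for its narrower bus:

```python
# Peak memory bandwidth = bus width (bits) / 8 * effective data rate (Gbps).
# Published specs; raw bandwidth only, ignoring the 4070's far larger L2 cache
# (roughly 36 MB vs 5 MB on the 3080), which raises *effective* bandwidth.
CARDS = {
    "RTX 3080 10GB": (320, 19),  # GDDR6X
    "RTX 3080 12GB": (384, 19),  # GDDR6X
    "RTX 4070":      (192, 21),  # GDDR6X
}
for name, (bus_bits, gbps) in CARDS.items():
    print(f"{name}: {bus_bits / 8 * gbps:.0f} GB/s")
# -> 760, 912 and 504 GB/s respectively
```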
 
Was the decision to use narrower bus widths made because it was cheaper and would make more profit? Will it hurt those cards in 2-3 years' time, when textures and tech demand more memory bandwidth than these cards can supply despite their memory speed, and will those owners need to replace their cards earlier if that's the case? Who knows, in the theoretical fantasy world of GPU manufacturing and shareholder appeasement. I'm getting my next card from Ngiggidy.
 
Ok, that's fair, but it doesn't really pan out any differently.

TSMC's wafer fabs are expensive to operate regardless of the process/die you build upon.

Sure they are. Regardless of whether you are talking about two separate wafer lines or two products from the same wafer line, they're all binned.

I'm not arguing that point because it's factual. I'm saying it doesn't matter. The bus bit width is less important than total data bandwidth. If the 4070's 192-bit bus can perform as well as or better than the 3080's 320-bit bus, then the bus bit width is irrelevant.

Now if the two of you are saying that the 4070 is gimped by the lesser memory bus, then yes I'll agree that is true.
You’re confused by what binning is. Binning refers to taking identical silicon and segregating the manufactured dies based on the occurrence of defects or physical properties (e.g., maximum clock speed, disabling defective cores, etc.). That occurs within an identical architecture and process node. I am comparing the 4070 against the 3080. The 4070 is from the Ada Lovelace architecture and is distinctly different from the Ampere architecture upon which the 30 series was built. You don’t get a 4070 from a binned 3080 or vice-versa. You may get a 4070 from a 4070 Ti die through binning, however. Binning is what gives us the i9 vs i7 in the same generation; not across generations with different architectures.

To your comment about TSMC, it is true that you will pay a certain amount per wafer on a given process node. But a smaller die means you get more chips per wafer, which means the cost per chip goes down. Larger chips are more expensive.

As for the data bus, the reason any manufacturer uses a narrower data bus is because there is a cost savings involved as I mentioned in my last post. In the case of the 4070, they compensated for the narrower bus by running a faster memory clock, which in many scenarios is OK, though it’s likely to gimp those cards a bit at higher resolutions. The point, though, is that they did it as a cost reduction, which means the 4070 should be cheaper to produce.

Again, all of this is to say that Nvidia cost-reduced their lineup via architectural changes while increasing prices within the same tier (3070 vs 4070, 3080 vs 4080). That they reduced their input costs and basically changed what it means to be a xx60 or xx70 tier card while charging more is what has so many of us irritated.
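To put rough numbers on the cost-reduction argument in this exchange, here is a minimal sketch of cost per good die using a simple Poisson yield model. The wafer price and defect density are made-up illustration values (and Ampere and Ada are on different foundry nodes with very different wafer prices), so only the relative scaling between a ~294 mm² die and a ~628 mm² die is meaningful:

```python
import math

# Hypothetical inputs purely for illustration; real wafer prices and defect
# densities are not public, and partially defective dies can be salvaged into
# lower bins, which this simple model ignores. Dies-per-wafer counts come from
# the rough estimate earlier in the thread.
WAFER_COST_USD = 12000   # assumed cost of one 300 mm wafer
DEFECT_DENSITY = 0.1     # assumed defects per cm^2

def cost_per_good_die(die_area_mm2: float, dies_per_wafer: int) -> float:
    """Poisson yield model: yield = exp(-defect_density * die_area_cm2)."""
    yield_rate = math.exp(-DEFECT_DENSITY * die_area_mm2 / 100)
    return WAFER_COST_USD / (dies_per_wafer * yield_rate)

print(round(cost_per_good_die(294, 201)))  # ~80 USD per good 294 mm^2 die
print(round(cost_per_good_die(628, 85)))   # ~265 USD per good 628 mm^2 die
```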
 