Nvidia GeForce RTX 4070 Super Review: Are We Finally Getting Good Value?

$600 is about what I spent on my GTX 1080 ($650, actually, before taxes) toward the end of its low-supply rut in late 2016, and ideally I don't want to spend more than that on a graphics card. It still irks me that an x070-class card is still running for that price, but I understand that a dollar back then supposedly went nearly 25% further than it does now (per the US CPI calculator).
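To sanity-check that, here's a quick back-of-the-envelope sketch (Python; the 25% multiplier is just the rough CPI figure mentioned above, not an exact rate):

```python
# Rough inflation check -- the 1.25 multiplier is the approximate
# CPI-based figure cited above, not an exact conversion.
price_2016 = 650.00      # GTX 1080 purchase price, pre-tax, late 2016
cpi_multiplier = 1.25    # a 2016 dollar went ~25% further

equivalent_today = price_2016 * cpi_multiplier
print(f"${price_2016:.0f} in 2016 is roughly ${equivalent_today:.0f} today")
# -> $650 in 2016 is roughly $813 today, so a $600 card is actually
#    cheaper than the 1080 was in real terms.
```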

For me, based on review benchmarks, the Super hits the middle ground I was looking for: the Ti cost too much, while the stock 4070 didn't have the performance I wanted (and cost too much for what you got).

I would like to build a new computer this year after an eight-year drought (oh jeez, it's been eight freaking years already), and if I can get a decent deal on a 4070 Super, or at least one that isn't marked up, it might be my go-to card.
 
Well, it's nice to see that a $600 product is now solidly surpassing a $700 product from 3.5 years ago! Unless you game in 4K, that is.

Also, not to brag, but I called this card's result almost exactly: the gain at lower resolutions is higher than the gain at 4K, and I was even pretty close on the percentages!

Here's what I said on January 8th:
The 4070 Super makes a nice leap forward in performance, likely >22% at 1440p and below, but without additional VRAM and with the same 192-bit bus, 4K titles will still likely suffer more than lower resolutions. This was already the case with the 4070 compared to the 3080. My guess is that the 4K increase is smaller, maybe 15% better than the 4070. It will still be behind the 4070 Ti in performance, but at a better starting price.
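To make those percentages concrete, here's a minimal sketch (Python; the baseline FPS figures are made up for illustration, not taken from the review):

```python
# Hypothetical 4070 baseline frame rates -- illustrative only.
baseline_1440p = 100.0
baseline_4k = 55.0

# The predicted uplifts: >22% at 1440p and below, ~15% at 4K.
print(f"1440p: {baseline_1440p:.0f} -> {baseline_1440p * 1.22:.0f} fps (+22%)")
print(f"4K:    {baseline_4k:.0f} -> {baseline_4k * 1.15:.0f} fps (+15%)")
# The gain shrinks at 4K, where the 192-bit bus matters more.
```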
 
Unless, of course, AMD reduces prices.
A price cut is coming in February.
Thank you for the work and effort, but the review is so upsetting: ANY review of GPUs that does not include Call of Duty (the #1 competitive FPS) is absolutely worthless!

Sunday, in a chat with thousands of people talking about the new RTX cards coming, so many people wanted to know... and now TechSpot becomes worthless to gamers, once again catering to single-player games, where performance doesn't even matter whether you're at 58 fps or 82 fps. TechSpot should do an extensive in-game review to see if frames even matter in single-player games.

For the buying public, if you don't include Call of Duty/Battlefield, then TechSpot gets no shout-outs, links, mentions, or support. I'm sure all the competitive Cyberpunk clans are linking these reviews to their clansmen, chiding them to upgrade so they have enough fluidity to make jumps, track a foe, etc. What drives GPU sales are games that REQUIRE hardware to compete.

IMO, TechSpot is doing a disservice to PC gamers by being so stuck on ray tracing. What is the point of doing a whole section on ray tracing when the easiest, most monumental way to get performance out of your PC is to turn ray tracing off? All the RTX owners know that nobody actually plays with ray tracing on, and that turning on RT is something we TRY for a few hours and then go back to performance mode after the jelly wears off. (So why cater to jellyfish?)

It's really saddening that TechSpot seems more interested in pseudo-glitz than competitive performance, and bases its articles on the "performance hits" taken by turning on ray tracing rather than actual in-game use. TechSpot used to have a large section just for Call of Duty; those charts and graphs were relevant.
Oh no, we will have fewer 13-year-olds in the TechSpot community, whatever will we do...

Your comment embodies every toxic trait a COD player has.
 

I can appreciate the desire to see your favorite game appear in the list of tested titles. If it's of any help, this review makes it clear the 4070 Super performs right between the 4070 and 4070 Ti. If you find any benchmark that contains both of those cards, you can guesstimate the average frame rate for this card.
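A minimal sketch of that guesstimate (Python; the FPS numbers are placeholders, so substitute real figures from whatever benchmark you find):

```python
# Placeholder FPS figures for an untested game -- replace these with
# real numbers from a benchmark that includes both cards.
fps_4070 = 110.0
fps_4070_ti = 130.0

# The review places the 4070 Super roughly midway between the two,
# so a simple midpoint is a reasonable first estimate.
fps_super_estimate = (fps_4070 + fps_4070_ti) / 2
print(f"Estimated 4070 Super: ~{fps_super_estimate:.0f} fps")  # ~120 fps
```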
 
I'm still not ready to adopt that 12VHPWR connection.

I might also be in the minority here, but I'm seriously considering switching to AMD for better Linux compatibility down the road. Valve and Proton are making huge headway toward a more consumer-friendly experience for gamers. The only thing really holding it back is the lack of anti-cheat support. I'd love to see if I can move to Linux once that's in place, but I know I'd have a harder time doing so if I buy another Nvidia card.
 
Apart from the 12VHPWR connector and the price, which is still quite high, this new GPU from Nvidia seems quite promising and is worth considering when building a new PC.
 
I specifically bought a 6700 XT because of Linux compatibility. It takes about a year to become adequately acquainted with Linux. I'm done with Windows and I'm done with Nvidia. Linux Mint is a fantastic place to start. I'm of the opinion that Linux is at the Windows XP level of jankiness, but it also doesn't hide anything from you like Windows 10 and 11 do, so there is a trade-off. It's gotten to the point where if you can build a PC, you can daily-drive Linux.
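For anyone curious which GPU kernel driver their Linux install is actually running, here's a quick sketch (Python; it assumes the standard /proc/modules layout and checks for the usual driver module names):

```python
# List loaded GPU kernel modules on Linux. /proc/modules has one
# loaded module per line, with the module name as the first field.
GPU_MODULES = {"amdgpu", "radeon", "nouveau", "nvidia", "i915"}

with open("/proc/modules") as f:
    loaded = {line.split()[0] for line in f}

found = GPU_MODULES & loaded
print("GPU driver module(s):", ", ".join(sorted(found)) or "none found")
```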
 
I know these cards are better than my 3060 Ti, but until prices drop more, I won't be investing in one. It's a shame, as the performance is better than the initial releases, but that power connector and the high price for moderate gains over the previous versions just don't do it for me. Not with the value I got/have from my 3060 Ti, anyway. I'd say I'll wait for the 50 series to bring prices down, but I still find myself thinking of an AMD card for my next upgrade, and I haven't had one of those since an ATI Radeon HD 4050.
 
A poll? I've been following this site for years and have never seen such a poll. But allow me to answer: RT is a joke and would never be a criterion for choosing any GPU. There isn't a single game worth enabling it in that will run well on anything below the 4090.

Real performance and price are the only points that matter to me.
 
Nice review, lots of great info!

I'm still using a GTX 1060 6GB. I haven't played much lately (too much work), but I will definitely buy a GPU soon, probably next month, and I might buy my first AMD card; I'm just not sure what to get yet.
 
So many stunning games have been released since the 1060. Unreal Engine 5 and its rivals have taken a big step forward. I still can't get used to how amazing Forza Motorsport looks, for example. Between the new graphics tech, affordable 2K monitors, and somewhat-affordable OLED, it's not all great, but it's a good time to experience all or some of that.
 

Ray tracing does look good on the right hardware; it's just that the hardware is currently expensive. You can't argue it doesn't look better when it isn't tanking frame rates. Saying it's completely unnecessary is like 3dfx arguing that no one wanted 32-bit color.
 
LOL, it's still pathetic value, just less pathetic than the garbage-class 4070. If it were 16GB, I would accept it as semi-decent value; with just 12GB, it needed to be $529.

The 7800 XT is looking shaky too at over $500. New pricing should be: 7800 XT at $459, 7700 XT at $389.
You seem to think we will be returning to the days of sub-$500 top-tier cards and sub-$250 mid-tier. Sadly, inflation doesn't work that way. So instead of bemoaning the price, ask your boss for a raise and tell them inflation sucks. If they refuse, find a better-paying job. Barring either of those options, save your money for an extra month or two for a card that, in all reality, will last you several years at least.
 
I paid $730 for a good-quality 3080 12GB card in 2022. This card is only $130 cheaper (assuming MSRP-priced cards are actually available) and performs pretty similarly overall. I'm not exactly getting buyer's remorse when NVIDIA is basically selling the same class of performance for a pretty modest price cut almost two years later. Oh, and don't forget they shrunk the bus width and the die size to save manufacturing costs.

Skip the 40 series if you can wait. This isn’t as terrible as the other 40 series cards, but it’s still overpriced for the performance tier, IMO.
 
All I'm saying is, I only paid $650 for my 7900 XT: $50 more than this card for around 10-20% better performance and 20GB of VRAM.

Obviously, that was a limited-time offer well under its list price, but even at MSRP the 4070 Super still feels like a poor compromise to me. Now, if one had the opportunity to catch it on sale for, say, under $500 ($450 sweet spot!), then yeah, I'd say go get it.
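A minimal sketch of that value comparison (Python; the 15% delta is just the midpoint of the 10-20% range claimed above, not a measured result):

```python
# Hypothetical value comparison based on the prices in this comment.
cards = {
    # name: (price_usd, relative performance vs. the 4070 Super)
    "RTX 4070 Super": (600, 1.00),
    "RX 7900 XT":     (650, 1.15),  # midpoint of the claimed 10-20% lead
}

for name, (price, perf) in cards.items():
    # Relative performance per dollar, scaled up for readability.
    print(f"{name}: {perf / price * 1000:.2f} perf per $1000")
# The 7900 XT at $650 edges out the 4070 Super at $600 on this metric.
```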
 
Only in some cases. Not so much at 1440p, and bus width becomes a problem at 4K; 192-bit has no business being on anything that costs $600.
Sorry, we can't agree here. A standard 4070 does well at 4K, so the "Super" variant will also do well.

And when I say "does well," I mean 4K with reasonable settings, customized and tailored to personal taste, NOT defaulted to "ULTRA MEGA MAXIMUM!!" settings.
 
While true, the faster VRAM more than offsets that memory bus reduction.
It does offset it to an extent, depending on resolution, as yRaz pointed out. However, my point in stating this was that the 4070 should be cheaper to produce between the die shrink and the cut-down bus. Also, cooling solutions for the 4070 should be less expensive given the reduced power consumption.

Basically, we’re getting familiar performance at a mild discount which isn’t all that exciting. I’d love to see what NVIDIA’s actual margins are on this product.
 