GeForce RTX 4070 vs. RTX 2070: Worthy Upgrade or Not?

If 12GB of VRAM is going to be a problem in 1-2 years, then game devs really have to wake up, because 90-95% of PC gamers are on 12GB or less.

PS5 and XSX have 16GB of RAM total for the OS, game and graphics, yet PC gamers need 32GB of RAM and 16-24GB of VRAM? Something is not right.

Looking forward to seeing the PS5 Pro specs "soon".

I also find it funny that, in the majority of games I tested, textures on medium look identical to high/ultra. Low usually lowers the quality slightly, but medium, high and ultra look the same most of the time.

Let's see if Nvidia's Neural Texture Compression actually makes it into games, or we will need 64GB of VRAM by 2026.
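For a rough sense of the stakes - back-of-envelope numbers, assuming a 4096x4096 texture with a full mip chain and BC7 as the compressed format (both are just typical examples, not figures from the article):

$$
\underbrace{4096^2 \times 4\,\text{B} \times \tfrac{4}{3} \approx 85\ \text{MiB}}_{\text{uncompressed RGBA8 + mips}}
\qquad \text{vs.} \qquad
\underbrace{4096^2 \times 1\,\text{B} \times \tfrac{4}{3} \approx 21\ \text{MiB}}_{\text{BC7 (1 byte/texel) + mips}}
$$

That roughly 4x gap per texture is the kind of multiplier a neural format would be trying to beat.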

Yet both AMD and Nvidia still put 8GB on the 7600 and 4060 series. Well, I guess most PC gamers play eSports titles or older games anyway.
 
I must admit, I'm still limping along on a GTX 1060 6GB, waiting patiently for something that looks worth buying, but the way the graphics market is going, this 1060 will be around for a while yet. I know I won't get anywhere near the same value for the price on any of the current, or even last-gen, cards. It's a very sad state of affairs.

Let's see what comes in May/June.
 
The problem with this test is the CPU.
If someone bought an RTX 2070 several years ago, they probably didn't buy a top-notch CPU then and probably don't have one now.
And someone buying a 7800X3D probably isn't pairing it with an RTX 4070, but more likely a 4080 or 4090.

This would be more interesting if you tested with a mediocre CPU like a Ryzen 3800 or 5700, or an i5-10400 or -11400.
 
“We also fear that 3070 owners will go from being shafted by 8GB of VRAM to shafted by 12GB on an even more expensive product.”

Well said!

Except that the 3070 8GB beats the 6700 XT 12GB in 99% of games, even at 4K - a resolution neither GPU can really handle anyway.

Actually, the 3070 beats the 6700 XT by more at 4K than at 1440p - https://www.techpowerup.com/review/nvidia-geforce-rtx-4070-founders-edition/32.html

And they both launched at $480-500.

Plus, DLSS is superior to FSR for when these GPUs eventually become too weak.

Just because a few rushed console ports use more than 8GB at 1440p does not mean 8GB cards are doomed. They are here to stay, as more than 85% of PC gamers use 8GB or less (Steam HW Survey).

I would not be surprised if AMD actually paid devs to compress the textures less in these games. TLOU especially, because the textures don't even look that good in that game.

AMD was caught red-handed back in the Shadow of Mordor days with the ultra texture pack. It looked exactly the same, yet VRAM usage doubled when you installed it. AMD was a sponsor on the title, and the devs did not want to comment on it - LMAO...

 
Yeah, me too. I am still on a Palit GTX 1080 and it's perfect for my needs. I game at 3440x1440 on it just fine, as I play older games.

I seem to have read somewhere that something like 55% of the Steam user base is still using GTX 10-series cards, so yeah, we are in good company.

They make it sound like all these new cards are such compelling buys, but they forget that for someone who plays old games at around 100 FPS, these new GPUs are completely unnecessary.

I will probably only upgrade if my 1080 goes out of order for some reason, which is rather unlikely seeing as it's undervolted and cared for.
 
Brand new GPUs are not really meant for people who play old games. Of course old high-end GPUs will play old games just fine.

Also, a feature like DLDSR can transform your old games, as can RT mods, and this is easily achievable at high fps in older titles.

Modding older games can transform them, just like downsampling from a much higher resolution.
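As a rough illustration of the downsampling idea - assuming the 2.25x DLDSR factor Nvidia exposes in its driver and a standard 2560x1440 panel, both just examples:

$$
2560 \times 1440 \times 2.25 = 3840 \times 2160
$$

so an old game is effectively rendered at 4K and then scaled back down to the 1440p display, which is where much of the image-quality gain in older titles comes from.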
 
Also, don't forget that a game as large as Zelda: Tears of the Kingdom, with all its physics simulation and entities, manages to run fine on mobile hardware that was already outdated when the system was new.
 
The same could be said the other way around about the CPU pairing: if you're running an older CPU, you're likely not upgrading to such a high-end card without also swapping in the highest-powered CPU your board supports (without changing the motherboard).
 
I always wondered why there's a steadfast refusal to put Skyrim and TW3 in more benchmarks, mod them up, and check frametimes and stutter in different areas. Sometimes you can find YouTube videos or enthusiast forum posts really getting into it, but you kind of have to get lucky and also know about it. Those games are still immensely popular and scale up to and beyond current hardware. There's probably no more repeatable and in-depth testing than modding Skyrim.

People talk out of both sides of their mouths. They say PC has the greatest library dating back decades, but then throw hissy fits if someone actually enjoys playing older games, and seem outright pleased when driver or other issues come up. It's obnoxious, but at this point it's just what it is.
 
I made the mistake of buying a 2070 shortly before the 2070 Supers were announced.

Then I got a 3070 at MSRP right before the mining boom.

Then I got a vastly overpriced 3080 (that paid for itself mining).

I have really hit all of the not-future-proof cards of late, so I am sitting out the current offerings.
 
I have a 2070, and I really like the card. It has served me very well over the years.
Now, watching this good comparison, I don't really feel the 4070 is enough for the price.

I feel like the card is already dated. Of course it is faster than the 2070, but the numbers above 1080p are borderline or even unplayable.
So why would anyone make this upgrade if they plan to play above 1080p? It's been many years and the performance gain is not enough, considering they are benchmarking three-year-old games.
 
The extra RAM and VRAM PCs need compared to consoles? That's the PC tax: the abstraction layer and other inefficiencies, because developers have to "optimize" for an effectively infinite number of configurations instead of four at most. The console crowd always explains this, and the PC master race always shrugs it off.
 
For sure... I'm still running an i7-4770K, which is about 10 years old! I mostly play older games so I don't care, but some day I'd like to play the latest M$FT Flight Sim... and I won't be able to do that even with one of these cards. I'd be curious to see how an older CPU like this does with the same kind of GPU upgrade, as I, and I'm sure many others, haven't had many reasons to upgrade the CPU. I would have to get a new motherboard and RAM, and possibly even a PSU, just to justify the new GPU.
 
There are attractive non-Nvidia alternatives if you look at what's available objectively.
 
The trouble with this review is that you can pick up used 2070s for peanuts, and TechSpot is fond of quoting the used market in other new-card reviews. The 4070 is a $450 card (at best), all day long.
 
They actually fixed up TLOU 1 very nicely with the latest big patch. High textures are now fine on 8GB cards, and you can also turn up the texture streaming option a notch. Where 8GB cards suffer is in the rare title like Forspoken, as well as instability/crashing in some games like RE4 or Jedi Survivor with ray tracing turned on.
 
I switched to AMD back in the day and did the whole triple-monitor gaming thing, which was cool for a while. But power consumption (fan noise, really) brought me back to Nvidia.

Don’t get me wrong, I love me some AMD CPUs. But AMD isn’t pricing their GPU offerings to be attractive - just mildly competitive. They honestly don’t seem interested in gaining GPU market share. It’s unclear if that is motivated by GPU margins or CPU profits & market share.
 
Did I miss any acknowledgement of inflation? Here in the land of tea and biscuits, £500 in 2018 is worth just over £600 now (we usually see £-for-$ price translations), so the value of the 4070 vs. the 2070 is greater than suggested.
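Rough numbers behind that, assuming cumulative UK CPI of roughly 21% between late 2018 and early 2023 (the exact figure depends on the index and the months you pick):

$$
\pounds 500 \times 1.21 \approx \pounds 605
$$

so in real terms a roughly £600 4070 sits about where a £500 2070 sat at launch, which is why the value gap is a bit larger than the nominal prices imply.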


 
On the "PC tax": it does not take a very fast PC to beat the PS5 and XSX, even with bad optimization. Mediocre hardware with good optimization, sure - but many games still crash at times (just Google it), and consoles get plenty of rushed games as well (multi-platform titles).

A console might be cheaper in the beginning, but software/games are much more expensive, plus you pay extra for multiplayer, cloud saves and other things that are free on PC, plus free mods and easy tweaking of games to your liking. A console can't be compared to a PC at all, IMO.

Also, console versions of games are often limited: zero mod support, timed exclusives for PS and XB, and the developer chooses how the game will look and run - there are already many forced 30 fps games this generation.

30 fps, for me, means immersion is gone.
Forced motion blur is simply garbage and even worse than 30 fps without it.

This is why I built a custom mITX machine for the living room. It easily beats the PS5 and XSX while playing every single one of my 1000+ games exactly how I want, with mods and endless possibilities. There are tons of free and dirt-cheap games on PC, and even Sony has started bringing their games to PC pretty much every time. The PS4 Pro was probably my last console because of this. The PS4 had many great exclusives; many are on PC now, though not all.

I prefer a PC with no locked ecosystem and complete freedom, plus emulators for all the older consoles (y) No new "generations" making your old games stop working. Sony loves to resell you a remaster or remake instead of doing backwards compatibility... I am not a fan of buying the same game 2, 3 or even 4 times :joy:
 
Why wait? AMD already has the 4070 beaten hands down, with the 33%-higher-performing RX 6950 XT for the same price, or the same performance from an RX 6750 XT at 33% lower cost. Both options have plenty of stock available and are the primary reason AMD hasn't released any current-gen 800- or 700-tier GPUs. There is no value in any current-gen Nvidia GPU with prices as inflated as they are, yet everyone still shuns the price-to-performance champ, a title AMD has held for five generations.
 
I still don't get why everyone is so obsessed with VRAM these days. Games have been exceeding 8GB of textures for years now. Whatever isn't available in VRAM is made up for with system RAM, for a latency penalty so nominal that it's only noticeable if you're already CPU-bottlenecked. Even if your system RAM is limited, upgrading it is far cheaper than the arm and a leg needed for a higher-tier graphics card. If you're really that worried about VRAM, AMD offers 12GB for $400, 16GB starting at $550 and 20GB for $800. Even the A770 offers 16GB for $350, as opposed to Nvidia expecting $1,200.
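For a ballpark on what that spillover costs - assuming a PCIe 4.0 x16 link and a 256-bit GDDR6 card (a 3070-class board, purely as an example):

$$
\underbrace{\approx 31.5\ \text{GB/s}}_{\text{PCIe 4.0 x16, per direction}}
\qquad \text{vs.} \qquad
\underbrace{\approx 448\ \text{GB/s}}_{\text{on-card GDDR6, 256-bit @ 14 Gbps}}
$$

The link to system RAM is roughly an order of magnitude slower than on-card memory, so the penalty stays nominal only as long as the assets touched every frame still fit in VRAM.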

Rumor is that there's a 16GB variant of the 4060 Ti, but I highly doubt it, as these cards were fully manufactured and distributed before the VRAM panic began. Not to mention that Nvidia wouldn't produce it knowing it would kill sales of the 4070 and 4070 Ti by offering a 4060 Ti that out-specs both. This is the company that gave the 3060 12GB of VRAM but the 3060 Ti only 8GB, after all.

8GB of VRAM and 16GB of system RAM will suffice for a lot longer than people realize. The budget market is flooded with reclaimed office PCs running Haswell processors and 6GB 1660 Super GPUs that can still pass 60 fps at 1440p on medium settings. Consoles and mobile devices will have memory issues because they use iGPUs that share system memory. Even then, the only one to have issues as of yet is the Xbox Series S, which only has 10GB of total shared memory. Other systems with 16GB+ should at least last until they're replaced in three years.

Nvidia's neural texture compression looks interesting, but there isn't a single game engine coded to use it yet. They'll probably charge game devs the same arm-and-a-leg rates to use it, just like DLSS, which will only drive up the cost of AAA titles. Expecting Nvidia to solve the problem, when they've already established that they're far too cheap to solder on more memory modules, is only going to make PC gaming more expensive. That's why I haven't even considered an Nvidia GPU since the GTX 770.
 