GeForce RTX 4070 vs. RTX 2070: Worthy Upgrade or Not?

"We also fear that 3070 owners will go from being shafted by 8GB of VRAM to shafted by 12GB on an even more expensive product."

You'd only feel that way if you paid more than $280-300 for a 3070. I don't feel "shafted" by my 3070's 8GB of VRAM, since I paid $300 for it.
 
If 12GB of VRAM is going to be a problem in 1-2 years, then game devs really have to wake up, because 90-95% of PC gamers are on 12GB or less.

The PS5 and XSX have 16GB of RAM total for the OS, game and graphics, yet PC gamers need 32GB of RAM and 16-24GB of VRAM? Something is not right.
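To put that in perspective, here's a rough back-of-the-envelope comparison; the OS reservations and the 12GB VRAM figure are illustrative assumptions, not official numbers:

```python
# Back-of-the-envelope memory budgets (rough assumptions, not official figures).

console_total_gb = 16          # PS5 / Xbox Series X unified GDDR6 pool
console_os_reserve_gb = 2.5    # assumed OS/system reservation
console_game_budget_gb = console_total_gb - console_os_reserve_gb

pc_system_ram_gb = 16          # typical mid-range gaming PC
pc_vram_gb = 12                # e.g. an RTX 4070-class card
pc_os_and_apps_gb = 4          # assumed Windows + background apps

pc_game_cpu_side_gb = pc_system_ram_gb - pc_os_and_apps_gb
pc_game_gpu_side_gb = pc_vram_gb

print(f"Console budget for game code + assets: ~{console_game_budget_gb:.1f} GB (shared)")
print(f"PC budget: ~{pc_game_cpu_side_gb} GB system RAM + {pc_game_gpu_side_gb} GB VRAM")
# The console pool is shared between the 'RAM' and 'VRAM' roles, while a PC
# often keeps copies of assets on both sides, which is part of why PC totals
# end up larger for the same game.
```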

Looking forward to seeing the PS5 Pro specs "soon".

I also find it funny that, in the majority of games I tested, textures on medium look identical to high/ultra. Low usually lowers the quality slightly, but medium, high and ultra look the same most of the time.

Let's see if Nvidia's Neural Texture Compression makes it into games. Otherwise we will need 64GB of VRAM by 2026.

Yet both AMD and Nvidia still put 8GB on the 7600 and 4060 series. Well, I guess most PC gamers play esports titles or older games anyway.

PC gamers definitely don't need even 8GB of VRAM; my GTX 780 Ti and 290X cards can play current games at low to medium settings at 1920 and 3440, and even my old GTX 580 3GB can play many current DX11/12 games at medium/low settings at 720p/1080p. My 980 Ti and Fury cards, with 6GB and 4GB, are fine for medium to high at 1920/3440.

The average person doesn't care about 100 FPS at max settings; why do you think 80s and 90s consoles were so popular despite their low framerates?
 
All in all:

- most games play perfectly well on a mid-range 20- or 30-series card with 8GB of VRAM at mid-to-high graphics settings, unless people really have high demands and go looking for the differences

- manufacturers, with a push ($$$) toward game companies, try to artificially limit the useful life of a graphics card so that you feel like you own a 12-year-old card.
Do people notice a difference, on a 65" 4K TV, between a 4K 85%-quality JPEG photo and a 100-megapixel RAW image converted to 4K, changing 60 times a second? No (see the pixel-count sketch after this list).
The same effect is seen when people compare Dolby Vision vs HDR10, 1440p vs 4K, DLSS 1440p upscaled to 4K vs native 4K, a good JPEG vs RAW, etc. People playing games hardly notice any difference; it's not the same as studying still images.

- I will replace my RTX 3060 Ti 8GB when I find a 4070 Ti equivalent with 16GB of VRAM for €500 or less, and that may happen with the 50- or 60-series. I'll wait; I still have fun at 1440p or 4K
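Here's the pixel-count sketch mentioned above; the display and camera resolutions are just illustrative assumptions:

```python
# Pixel-count sketch: how much source detail survives a downscale to 4K?
# All numbers are illustrative assumptions.

display_w, display_h = 3840, 2160            # 4K TV
source_megapixels = 100                      # 100 MP RAW image

display_pixels = display_w * display_h       # ~8.3 million
source_pixels = source_megapixels * 1_000_000

pixels_per_output_pixel = source_pixels / display_pixels
print(f"~{pixels_per_output_pixel:.0f} source pixels are averaged into each displayed pixel")
# Roughly a dozen source pixels collapse into every output pixel, so most of
# the extra resolution is thrown away before it reaches the screen -- which is
# why a well-encoded 4K JPEG and the downscaled RAW look so similar.
```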
 
"PC gamers definitely don't need even 8GB of VRAM; my GTX 780 Ti and 290X cards can play current games at low to medium settings at 1920 and 3440, and even my old GTX 580 3GB can play many current DX11/12 games at medium/low settings at 720p/1080p. My 980 Ti and Fury cards, with 6GB and 4GB, are fine for medium to high at 1920/3440.

The average person doesn't care about 100 FPS at max settings; why do you think 80s and 90s consoles were so popular despite their low framerates?"
Because people had low demands back then? I played games at a crappy framerate for years, and I drove a shitty car for years as well. Now I don't.

People also watched movies in black and white in the 50s and 60s. That doesn't mean it was great.

The average person would definitely care if they actually saw 30 fps @ 60Hz side by side with 100-200 fps @ 240Hz. It's a night-and-day difference, and immersion goes from low to high.
 
I still don't get why everyone is so obsessed with VRAM these days. Games have been exceeding 8GB of textures for years now. Whatever doesn't fit in VRAM is made up for with system RAM, at a latency penalty so nominal that it's only noticeable if you're already CPU bottlenecked. Even if your system RAM is limited, upgrading it is far cheaper than the arm and a leg needed to buy a higher-tier graphics card. If you're really that worried about VRAM, AMD offers 12GB for $400, 16GB starting at $550 and 20GB for $800. Even the A770 offers 16GB for $350, as opposed to Nvidia expecting $1,200.
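For a rough sense of how that spill-over behaves, here's a simple bandwidth sketch; the GDDR6 and PCIe figures are ballpark assumptions, and real streaming behaviour depends on how much spills and how often:

```python
# Rough bandwidth comparison: local VRAM vs. spilling over PCIe to system RAM.
# Figures are ballpark assumptions, not measurements.

gddr6_bandwidth_gbs = 448      # e.g. an RTX 3070-class card (~448 GB/s)
pcie4_x16_bandwidth_gbs = 32   # theoretical PCIe 4.0 x16 (~32 GB/s each way)

spilled_assets_gb = 2          # assume 2 GB of textures don't fit in VRAM

time_local_ms = spilled_assets_gb / gddr6_bandwidth_gbs * 1000
time_over_pcie_ms = spilled_assets_gb / pcie4_x16_bandwidth_gbs * 1000

print(f"Reading 2 GB from VRAM:      ~{time_local_ms:.1f} ms")
print(f"Reading 2 GB over PCIe 4.0:  ~{time_over_pcie_ms:.1f} ms")
# Small, infrequent spills are barely noticeable; streaming hundreds of MB
# per frame over PCIe is where the penalty starts to show up as stutter.
```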

Rumor is that there's a 16GB variant of the 4060 Ti, but I highly doubt it, as these cards were fully manufactured and distributed before the VRAM panic began. And that's not to mention that Nvidia wouldn't produce a 4060 Ti that out-specs the 4070 and 4070 Ti, knowing it would kill sales of both. This is the company that dropped the 3060's 12GB of VRAM to give the 3060 Ti only 8GB, after all.

8GB of VRAM and 16GB of system RAM will suffice for a lot longer than people realize. The budget market is flooded with reclaimed office PCs running Haswell processors and 6GB 1660 Super GPUs that can still pass 60 fps at 1440p on medium settings. Consoles and mobiles will have memory issues because they use iGPUs that share system memory. Even then, the only one to have issues so far is the Xbox Series S, which leaves games only around 8GB of shared memory. Other systems using 16GB+ should at least last until they're replaced in 3 years.

Nvidia's neural texture compression looks interesting, but there isn't a single game engine coded to use it. They'll probably charge game devs the same arm-and-a-leg rates to use it, just like DLSS, which will only drive up the cost of AAA titles. Expecting Nvidia to solve the problem when they've already established that they're far too cheap to solder on more memory modules is only going to make PC gaming more expensive. That's why I haven't even considered an Nvidia GPU since the GTX 770.

Well, a 4060 Ti 16GB is pointless because the 4070 will beat it anyway, and I bet they will be priced close together. 12GB of VRAM is plenty for 1440p gaming. 8GB is plenty as well for 99.9% of games.

A weak GPU with more VRAM is never preferred over a faster GPU with slightly less VRAM. The GPU is the limiting factor in most cases, and when the GPU is weak, you are NOT going to push settings up anyway - lower settings = less VRAM needed.

Just look at the 3060 12GB vs the 3060 Ti 8GB. You can max out 8GB in a handful of games with ultra settings (maybe with RT as well), and the 3060 12GB will be able to fill its VRAM, YET the GPU is too slow to run those games on MAX SETTINGS anyway = pointless.
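As a rough illustration of why "lower settings = less VRAM needed", here's a sketch of texture memory at different quality tiers; the 1-byte-per-texel block compression, the mip-chain overhead and the count of resident textures are all assumptions, and real engines vary:

```python
# Rough texture-memory estimate at different texture resolutions.
# Assumes BC7-style block compression (1 byte/texel) and a full mip chain
# (~1.33x the base level) -- typical, but engine-specific in practice.

BYTES_PER_TEXEL = 1.0
MIP_CHAIN_FACTOR = 4 / 3

def texture_mb(size_px: int) -> float:
    """Approximate VRAM cost of one square texture, in MB."""
    return size_px * size_px * BYTES_PER_TEXEL * MIP_CHAIN_FACTOR / (1024 ** 2)

resident_textures = 300  # assumed unique textures resident for a scene
for label, size in [("Ultra (4K textures)", 4096), ("High (2K)", 2048), ("Medium (1K)", 1024)]:
    total_gb = texture_mb(size) * resident_textures / 1024
    print(f"{label:20s} ~{total_gb:.1f} GB of VRAM for {resident_textures} textures")
# Each step down halves the texture resolution and quarters the memory,
# which is why dropping one texture tier frees so much VRAM.
```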

Nvidia simply gives people an option for 16GB on the 4060 series, for more money.
16GB on a lower mid-range GPU is really just a waste.

People should test the MEDIUM vs ULTRA preset in most newer games. The difference is VERY SMALL at times. MEDIUM even looks better sometimes (less crap like DOF, MOTION BLUR etc). The difference between medium and ultra is laughable in most newer games, really.

Example - https://www.techpowerup.com/review/dead-space-benchmark-test-performance-analysis/4.html
 
"The RTX 4070, on the other hand, arrived 2.5 years after the RTX 3070, and while it has upgraded the VRAM capacity to 12GB – still a minor improvement in our opinion – it costs $100 more, and for that, you're only getting a ~30% performance boost."

But you didn't consider inflation. $500 in October 2021 is about $582 in May 2023. So you're getting 30% more performance, 50% more VRAM and lower power consumption for about the same real price. I would bet that if you up the power limit on the 4070 you could eke out another 5-10% more performance.
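A quick sketch of that math, taking the comment's inflation figure and the review's ~30% uplift and $600 launch price as given:

```python
# Inflation-adjusted price/performance comparison between the RTX 3070 and
# RTX 4070, using the figures from the comment and review above (the
# inflation factor and the ~30% uplift are taken as given, not re-derived).

rtx3070_msrp_2021 = 500
inflation_adjusted_3070 = 582        # ~$500 in 10/2021 expressed in 5/2023 dollars
rtx4070_msrp_2023 = 600

relative_performance_4070 = 1.30     # ~30% faster than the 3070

perf_per_dollar_3070 = 1.0 / inflation_adjusted_3070
perf_per_dollar_4070 = relative_performance_4070 / rtx4070_msrp_2023

price_change = rtx4070_msrp_2023 / inflation_adjusted_3070 - 1
value_change = perf_per_dollar_4070 / perf_per_dollar_3070 - 1

print(f"Real (inflation-adjusted) price increase:  {price_change:+.0%}")
print(f"Performance per inflation-adjusted dollar: {value_change:+.0%}")
# In real terms the price is roughly flat, so almost all of the ~30% uplift
# shows up as better performance per dollar.
```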
 
"I'm still limping along on a GTX 1060 6GB, waiting patiently for something that looks worth buying."
I was also hanging on with a 1060 6GB but recently upgraded to an RX 6650 XT as they'd recently dropped in price. I'd like to say I noticed a huge difference but, in all honesty, I didn't. It might be because I had to undervolt the GPU to stay within the 150W my system could feed it, it might be because the rest of my system is 10 years old, but most likely it's because my eyes are 60 years old and nothing these days looks high resolution :(

It scratched an itch anyway.
 
I must admit, I'm still limping along on a GTX 1060 6GB, waiting patiently for something that looks worth buying, but the way the graphics market is going, this 1060 will be around for a while yet. I know I won't get anywhere near the same value for the price on any of the current, or even last-gen, cards. It's a very sad state of affairs.

Let's see what comes in May/June.
That is a wise choice. All the mainstream news benchmarks really only serve an advertising purpose, urging end users to buy the latest **** and more irrelevant updates. They don't really make sense and maybe never will; the way these mainstream benchmarks are framed is biased and neglects reality. Nobody needs to read any of these online benchmarks to already know that the new product is always "noticeably better" in the way it is marked up and marketed. Only end users with enough awareness, who understand what they are doing, know where the genuinely high-end choices are.
With that in mind, a 1060 is still pretty decent and probably holds up against the newest gear in most cases, given the right driver version and an HD resolution.
 