We dust off the old GeForce RTX 2070 to see how it holds up against the newer RTX 4070. The idea, of course, is to measure how much faster the new GPU is, and whether it's a viable upgrade after five years.
“We also fear that 3070 owners will go from being shafted by 8GB of VRAM to shafted by 12GB on an even more expensive product.”
Well said!
I must admit, I'm still limping along on a GTX 1060 6GB, waiting patiently for something that looks worth buying, but the way the graphics market is going, this 1060 will be around for a while yet. I know I won't get anywhere near the same value for the price on any of the current, or even last-gen, cards. It's a very sad state of affairs.
Let's see what comes in May/June.
Yeah, me too. I'm still on a Palit GTX 1080 and it's perfect for my needs. I game at 3440x1440 on it just fine, as I play older games.
I seem to have read somewhere that something like 55% of the Steam user base is still using GTX 10-series cards, so yeah, we're in good company.
They make it sound like all these new cards are such compelling buys, but they forget that for someone who plays old games at around 100 FPS, these new GPUs are completely unnecessary.
I'll probably only upgrade if my 1080 stops working for some reason, which is rather unlikely seeing as it's undervolted and cared for.
Also, don't forget that a game as large as Zelda: Tears of the Kingdom, with all its physics simulation and entities, manages to run fine on mobile hardware that was outdated when the system was still new.

If 12GB of VRAM is going to be a problem in 1-2 years, then game devs really have to wake up, because 90-95% of PC gamers are on 12GB or less.
The PS5 and XSX have 16GB of RAM total for the OS, game and graphics, yet PC gamers need 32GB of RAM and 16-24GB of VRAM? Something is not right.
Looking forward to seeing the PS5 Pro specs "soon".
I also find it funny that in the majority of games I've tested, textures on medium look identical to high/ultra. Low usually lowers the quality slightly, but medium, high and ultra look the same most of the time.
Let's see if Nvidia's Neural Texture Compression actually makes it into games. Otherwise we'll need 64GB of VRAM by 2026.
Yet both AMD and Nvidia still put 8GB on the 7600 and 4060 series. Well, I guess most PC gamers play esports titles or older games anyway.
The problem with this test is the CPU.
Someone who bought an RTX 2070 several years ago probably didn't buy a top-notch CPU then, and probably doesn't have one now.
And someone buying a 7800X3D probably wouldn't pair it with an RTX 4070, but more likely with a 4080 or 4090.
This would be more interesting if you tested a mediocre CPU like a Ryzen 3800 or 5700, or an i5-10400 or -11400.
That's the PC tax: the abstraction layer and other inefficiencies. Because they have to "optimize" for an infinite number of systems instead of just four, tops. Which the console crowd always explains, and the PC master race always shrugs off.
For sure... I'm still running an i7-4770K, which is about 10 years old! I mostly play older games so I don't care, but some day I'd like to play the latest M$FT Flight Sim... I won't be able to do that even with one of these cards. I'd be curious to see how an older CPU like this does with the same type of GPU upgrade, as I, and I'm sure many others, haven't had many reasons to upgrade our CPUs. I'd have to get a new motherboard and RAM, and possibly even a PSU, just to justify the new GPU.
I made the mistake of buying a 2070 shortly before the 2070 Supers were announced.
Then I got a 3070 at MSRP right before the mining boom.
Then I got a vastly overpriced 3080 (that paid for itself mining).
I have really hit all of the not-future-proof cards of late so I am sitting out the current offerings.
I switched to AMD back in the day. Did the whole triple-monitor gaming thing, which was cool... for a while. But power consumption (fan noise, really) brought me back to Nvidia.

There are attractive non-Nvidia alternatives if you look at what's available objectively.
It doesn't take a very fast PC to beat the PS5 and XSX, even with bad optimization. Mediocre hardware with good optimization, yeah, but many games still crash at times (just google it). Consoles get plenty of rushed games as well (multi-platform titles).