Nvidia GeForce RTX 4070 vs. RTX 4060 Ti 16GB: Is It Worth Spending the Extra $100?

As Steve mentioned in one of the HUB videos, they will focus a bit less on productivity benchmarking... Puget found the 4070 generally 30-50% faster than the 4060 Ti 8GB in NLE video editors, and 33% faster in Blender. The really odd thing they found was that the 3060 Ti actually outperformed the 4060 Ti in several productivity benchmarks, which one could assume is due to the narrower memory bus, and possibly wouldn't be addressed by the higher-VRAM version.

The Radeon 7xxx series also appears to have made dramatic inroads in productivity (video encode/decode, Topaz AI, etc.) toward parity with Nvidia performance, so IMHO it's turning into a two-horse race between the RX 7800 XT and RTX 4070 for jack-of-all-trades system builds. Just need someone to do formal 7800 XT PugetBench runs.
 
If you're in that $500 ballpark range, spend $500 and get a 7800 XT and play games at the same level (or slightly better) as a 4070 that costs $550.

At least that's what I would do. At that price point I want a GPU that plays games at 1440p, but it also means I don't have the money for a $1000+ GPU from Nvidia that handles ray tracing better than what AMD can offer while still giving proper framerate performance. Unless you're dropping money on a 7900 XTX or a 4080/4090, RT is kind of pointless. In the $500 range, you can get a good GPU that plays games well at 1440p.

I also don't give a rip about DLSS or FSR; those features can eat a fat one. So, for me, I'd much rather save a few bucks and have a card with more VRAM that plays games just as well (or better) at 1440p by getting a 7800 XT over the 4070.
 
As Steve mentioned in one of the HUB videos, they will focus a bit less on productivity benchmarking... Puget found the 4070 generally 30-50% faster than the 4060 Ti 8GB in NLE video editors, and 33% faster in Blender. The really odd thing they found was that the 3060 Ti actually outperformed the 4060 Ti in several productivity benchmarks, which one could assume is due to the narrower memory bus, and possibly wouldn't be addressed by the higher-VRAM version.

The Radeon 7xxx series also appears to have made dramatic inroads in productivity (video encode/decode, Topaz AI, etc.) toward parity with Nvidia performance, so IMHO it's turning into a two-horse race between the RX 7800 XT and RTX 4070 for jack-of-all-trades system builds. Just need someone to do formal 7800 XT PugetBench runs.
It's not odd; the 4060 is a rebranded mobile chip. Nvidia are scumbags. The 3060 is a better productivity chip because it has a wider memory bus. The idea that the 4060 has a 128-bit bus is a joke, and I'm not laughing.
 
It's not odd; the 4060 is a rebranded mobile chip. Nvidia are scumbags. The 3060 is a better productivity chip because it has a wider memory bus. The idea that the 4060 has a 128-bit bus is a joke, and I'm not laughing.
This is even worse than I thought... Based on specs and performance, my impression was that the current 4060 is really a 4050. If it's a mobile part, then it's even lower than a desktop 4050.
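For what it's worth, the bus-width point is easy to quantify: peak memory bandwidth is just bus width times effective data rate. A minimal sketch in Python, using the commonly published specs for both cards (worth double-checking against a spec sheet):

```python
# Peak memory bandwidth in GB/s: (bus width in bits / 8 bits per byte) * data rate in Gbps
def peak_bandwidth_gbs(bus_bits: int, data_rate_gbps: float) -> float:
    return bus_bits / 8 * data_rate_gbps

cards = {
    "RTX 3060 Ti (256-bit, 14 Gbps GDDR6)": (256, 14.0),
    "RTX 4060 Ti (128-bit, 18 Gbps GDDR6)": (128, 18.0),
}
for name, (bus_bits, rate) in cards.items():
    print(f"{name}: {peak_bandwidth_gbs(bus_bits, rate):.0f} GB/s")
```

The faster GDDR6 doesn't come close to making up for halving the bus, which lines up with the 3060 Ti winning bandwidth-bound productivity workloads.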
 
Is It Worth Spending the Extra $100?

NO... because the 7800 XT is $50 less than the 4070 and has equal or greater performance, depending on the games you play.
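Framed as cost per frame, the math is simple; the FPS figures below are placeholders purely for illustration, so plug in the averages from whichever review you trust:

```python
# Dollars spent per average frame per second: lower is better value
def cost_per_frame(price_usd: float, avg_fps: float) -> float:
    return price_usd / avg_fps

# Hypothetical 1440p averages, for illustration only
print(f"7800 XT:  ${cost_per_frame(500, 100):.2f} per fps")
print(f"RTX 4070: ${cost_per_frame(550, 100):.2f} per fps")
```

At equal performance, the $50 discount is the whole story.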

[1440p benchmark chart]
 
It's not odd, the 4060 is a rebranded mobile chip. Nvidia are scumbags. The 3060 is a better productivity chip because it has a wider memory bus. The idea that the 4060 has a 128bit bus is a joke and I'm not laughing
Yes, one has to tread carefully when entertaining a 4060 Ti purchase, because the performance gains and losses are very application-specific. In this other Puget test they found even the non-Ti 3060 consistently outperformed the 4060 Ti in DaVinci Resolve and in some tasks in After Effects. In short, I can't think of any use case that justifies buying a 4060 Ti.
 
Don't you TechSpot guys get a little bored trying to stretch out reviews on this crapfest of cards? Believe me, I read them and appreciate it!

Would love to see more articles on AR glasses like the Xreal going forward, with tips, tricks, benchmarks, and spec tests on them.
 
Both cards are actually bad value no matter which wins. The RTX 4060 Ti is probably the worst product in the entire Ada stack. AMD started this high-cache, narrow-memory-bus approach, and it was heavily criticized at launch due to the high price, so it's not just an Nvidia issue in my opinion. Nvidia sticks out more this time because AMD's RX 7600 is quite a bit cheaper. $500 for a card built mainly for 1080p is too much, even with double the VRAM.
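The big-cache, narrow-bus design only pays off when the cache hits. A rough model, assuming hits generate no DRAM traffic, is effective bandwidth = DRAM bandwidth / (1 - hit rate); the hit rates below are illustrative guesses, not measured figures:

```python
# Rough effective-bandwidth model for a large last-level cache:
# if only a (1 - hit_rate) fraction of requests reach DRAM, the cache
# amplifies usable bandwidth by a factor of 1 / (1 - hit_rate).
def effective_bandwidth_gbs(dram_gbs: float, hit_rate: float) -> float:
    return dram_gbs / (1.0 - hit_rate)

# 4060 Ti-class card: 288 GB/s of raw DRAM bandwidth
for hit_rate in (0.0, 0.4, 0.6):
    print(f"hit rate {hit_rate:.0%}: ~{effective_bandwidth_gbs(288, hit_rate):.0f} GB/s effective")
```

The catch is that hit rates tend to fall as resolution rises, which is part of why these cards are pitched at 1080p.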
 
I, for one, considered one of these cards to replace my old card, but when I saw many "not so good" reviews about them, I started to look elsewhere for a better card.
 
I am still playing everything on my good old Vega 64. So far it has managed to run every game I've played on ultra settings (1024).
I think it still has a few years in it…
 
I love the dual-fan ones, but at $499 it's close; I believe the 4070 Ti should be the $499 card. And nobody can say it's because the cost of making the cards has gone up. Read up on Nvidia's profit margin.
Hence why I try to shed light on price drops for potential customers; I'd argue it's more effective than childish complaints about the prices, as if we have any control over pricing disturbances from the likes of the crypto-mining craze and now AI compute. I'm not a shareholder, so I don't really bother with the profit-margin spreads and target revenue figures, although I hear it's 10x ROI on AI-dedicated hardware for team green currently. Now that AMD has learned the smoke-and-mirrors trick that is frame gen, Nvidia's bedazzling effect is abating, and hence the price corrections in some markets are coming to fruition.
 
Every single time I see ‘4060 Ti 16GB’ I imagine Chris Farley doing “Fat RAM, Little GPU” a la Tommy Boy.
 
As Steve mentioned in one of the HUB videos, they will focus a bit less on productivity benchmarking... Puget found the 4070 generally 30-50% faster than the 4060 Ti 8GB in NLE video editors, and 33% faster in Blender. The really odd thing they found was that the 3060 Ti actually outperformed the 4060 Ti in several productivity benchmarks, which one could assume is due to the narrower memory bus, and possibly wouldn't be addressed by the higher-VRAM version.

The Radeon 7xxx series also appears to have made dramatic inroads in productivity (video encode/decode, Topaz AI, etc.) toward parity with Nvidia performance, so IMHO it's turning into a two-horse race between the RX 7800 XT and RTX 4070 for jack-of-all-trades system builds. Just need someone to do formal 7800 XT PugetBench runs.
Not many people care about Puget when it comes to graphics cards; it's all about gaming. Be happy you have that.
 
If you're in that $500 ballpark range, spend $500 and get a 7800 XT and play games at the same level (or slightly better) as a 4070 that costs $550.

At least that's what I would do. At that price point I want a GPU that plays games at 1440p, but it also means I don't have the money for a $1000+ GPU from Nvidia that handles ray tracing better than what AMD can offer while still giving proper framerate performance. Unless you're dropping money on a 7900 XTX or a 4080/4090, RT is kind of pointless. In the $500 range, you can get a good GPU that plays games well at 1440p.

I also don't give a rip about DLSS or FSR; those features can eat a fat one. So, for me, I'd much rather save a few bucks and have a card with more VRAM that plays games just as well (or better) at 1440p by getting a 7800 XT over the 4070.
And be stuck with AMD's crappy and always late upscaling, frame interpolation and ray tracing? Nah, thanks.
 
And be stuck with AMD's crappy and always late upscaling, frame interpolation and ray tracing? Nah, thanks.
To each their own, but.....

If you're worried about those features and need Nvidia or AMD to save your FPS because you can't pull enough frames with normal rasterization, then you're already doing it wrong: you're not in the right price range for a GPU that's good enough for your needs. You should be looking at the $900+ range and getting a 7900 XTX, 4080, or 4090.

If you're planning on getting a $500-range GPU and expecting it to handle everything for a few years, even with the tricks and gimmicks they use to boost FPS, it isn't going to last you very long. That is, unless you're gaming at 1080p; in that case, the $500 range should be good enough, but at 1440p or 4K you'll need better.
 
And be stuck with AMD's crappy and always late upscaling, frame interpolation and ray tracing? Nah, thanks.
I was a big fan of ray tracing, but it just isn't worth the frame rate hit on every game I play. With that said I'm not going to buy a GPU when they've gotten this expensive. I'm going to let both AMD and NVIDIA know they are too expensive by not buying.
 