GeForce RTX 4070 vs. 4070 Ti: $600 or $800 GPU Upgrade?

Good find, however their wattage numbers say GPU only? "Card Only"

A 13900K can use a lot of watts, but in gaming it's probably "only" 100-150 watts tops.
TechPowerUp records power consumption directly, rather than using any onboard or integrated sensors. In the case of their 13900K review, the figures are taken from the two ATX 12V sockets on the motherboard -- these supply the bulk of the power for the CPU.

Why did AMD do this? Is 3D cache expensive?
The process of adding the extra layer of cache is probably the biggest additional cost, but the extra silicon also limits the thermal properties of the CPU. For example, the maximum Tjunction for the 7950X is 95°C, whereas it's 89°C for the 7950X3D. Applying the cache to both CCDs would lower this, and the clock speeds, even further -- AMD probably found that those decreases negated much of the benefit the cache brings.
 
I'm so fed up with your AMD sponsorship. At the end of every comparison you suggest the 6800 XT or 6950 XT as a viable option. No, they are not! No DLSS 3, and the prices are low because people don't buy them! A 6950 XT consumes significantly more power than a 4070 Ti.
You keep suggesting that 12 GB of VRAM is "not enough", yet your own results say otherwise (at the target resolution of 1440p the 4070s are perfectly fine). And now I'm curious to see whether you are going to retest The Last of Us after the 49 GB patch released today, where one of the main points is REDUCED VRAM IMPACT that should allow most players (their words) to raise texture quality. So they admit they poorly optimized the game, but you used it as an example of how bad Nvidia is.
I HATE Nvidia, it is the worst company on the planet in this field, but you must stop worshipping AMD. This channel is great, but you are losing credibility.
Calm down.

Not everyone cares about DLSS (or FSR), let alone version 3 that's been pigeonholed to Ada only and not even widely supported. Not everyone likes Nvidia, either.

Snatching up a 6950XT will surely cost you a little more in electricity, but nothing you'd notice unless you run the system 24/7 at max power. Even if it were a 200W difference, card power consumption is mostly a concern when you're comparing cards and making sure you have a proper PSU to support everything. In terms of actual running cost, it's not that much for most people to absorb.

Average gaming wattage for a 4070 Ti (according to TPU) is 280W. Average wattage for an ASUS 6950XT TUF (according to TPU) is 340W. So, a 60W difference on average.
A 4070 Ti takes 3.57 hours of average gaming to use 1 kWh.
A 6950XT takes 2.94 hours of average gaming to use 1 kWh.

We'll say you game 5 hours a day (very dedicated gamer), 5 days out of the week.
With a 4070 Ti that's 1.4 kWh a day, so 7 kWh a week.
With a 6950XT that's 1.7 kWh a day, so 8.5 kWh a week.
Over 52 weeks a year:
The 4070 Ti uses 364 kWh.
The 6950XT uses 442 kWh.

Depending on your cost for electricity, the price will vary. For me, the local rate is 11.2 cents per kWh, but we'll round it up to 12.
364 kWh x $0.12 = $43.68 a year.
442 kWh x $0.12 = $53.04 a year.
A $9.36 difference.
If you feel that difference is significant, then do something simple like skipping one or two coffees over the whole year, not buying a single pack of cigarettes, or passing on a few 12-packs of pop. There's always some way to save up those few extra dollars. Granted, this is just the cost difference in my area; it could be more or less for others.
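
If you want to plug in your own electricity rate or gaming hours, here's a minimal Python sketch of the same arithmetic. The wattages are the TPU averages quoted above; the 5 hours/day, 5 days/week schedule and the $0.12/kWh rate are just this post's assumptions, so swap in your own numbers.

[CODE=python]
# Rough yearly electricity use and cost for a GPU, based on average gaming power draw.
# Assumed inputs: TPU's average gaming wattages and the 5 h/day, 5 days/week,
# $0.12/kWh figures from the post above -- change them to match your situation.

def yearly_kwh_and_cost(avg_watts, hours_per_day=5, days_per_week=5,
                        weeks_per_year=52, price_per_kwh=0.12):
    """Return (kWh per year, cost per year in dollars) for a given average draw."""
    kwh = (avg_watts / 1000) * hours_per_day * days_per_week * weeks_per_year
    return kwh, kwh * price_per_kwh

for name, watts in (("RTX 4070 Ti", 280), ("RX 6950 XT", 340)):
    kwh, cost = yearly_kwh_and_cost(watts)
    print(f"{name}: {kwh:.0f} kWh/year, ${cost:.2f}/year")

# Output:
# RTX 4070 Ti: 364 kWh/year, $43.68/year
# RX 6950 XT: 442 kWh/year, $53.04/year  -> about $9.36 apart
[/CODE]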
 
Good find, however their wattage numbers say GPU only? "Card Only"

A 13900K can use a lot of watts, but in gaming it's probably "only" 100-150 watts tops.
It is GPU only, measured at the PCIe slot. It's just that the motherboard is different, and motherboard AIBs quite often take shortcuts to lower costs. My hunch is that a power-hungry processor will have some influence too, small as it may be, but enough to show different power-spike results.
 
TechPowerUp records power consumption directly, rather than using any onboard or integrated sensors. In the case of their 13900K review, the figures are taken from the two ATX 12V sockets on the motherboard -- these supply the bulk of the power for the CPU.


The process of adding the extra layer of cache is probably the biggest additional cost, but the extra silicon also limits the thermal properties of the CPU. For example, the maximum Tjunction for the 7950X is 95°C, whereas it's 89°C for the 7950X3D. Applying the cache to both CCDs would lower this, and the clock speeds, even further -- AMD probably found that those decreases negated much of the benefit the cache brings.
I read the 13900K review and it is interesting that, for processors, TPU did not measure power spikes as they do with video cards. Maybe measuring power spikes for both CPU and GPU would be something for them to look at and test next.
 
[HEADING=2]"GeForce RTX 4070 vs. 4070 Ti: $600 or $800 GPU Upgrade?"[/HEADING]

Neither.

The RTX 4070 should have been around $400, the 4070 Ti around $500 and the 4080 around $650, and even that is me being too generous towards NVIDIA.
Well, the GTX 780 launched at $650 back in 2013, with its glorious 3GB of VRAM and a whopping ~2,300 cores. So a current model with over 5X the VRAM and an almost incomprehensible leap in performance (8X?) is not going to cost the same as something from 10 years ago.
 
Here I was thinking the cards being talked about were the 4070 vs the 6950XT, not the 4070 Ti, which competes in a totally different price tier at the moment. The power differences will be larger there, as will the +200W of heat you're putting into your room with the 6950XT.

And for the "where's FSR mentioned?" people: why would that be a pro for the 6950XT? You can use it on anything, so it's a completely moot point for this purchase decision. The 4070s have a demonstrably superior option that the 6950XT does not, which is why it gets mentioned; it's only worth talking about advantages unique to either option.
 
The Inno3D or? I can find zero reviews on it
There's no review of the Inno3D 4070 Ti X3 OC model I have. But you can find reviews of the 4070 Ti iChill model with a different cooler, or of 30xx X3 OC models, which look like they have the same cooler. The card works without any problems. Temps are OK and the fan noise blends in with the other fans in my setup.
 
Yeah, it's a fine card. However, 4070s are way shorter than the 4070 Ti, so if length is a problem, a 4070 can be the solution. Some 4070s are seriously compact, and only 200 watts. I'd buy the 4070 Ti myself for sure though; it's a great GPU.
 
I agree, I almost didn't fit the 4070 Ti into my Node 304 case.
 
Why are they still insisting on that "Ti" in the name of these cards when there is no GTX/RTX Titan higher in the product stack? To me this is false advertising, meant to trick the people who still remember what a GTX Titan was.
 
Ti has nothing to do with Titan. Ti = Titanium; the moniker has existed for many years, long before the Titan series.

The Titan line was probably stopped because it was pointless and sales were terrible. I clearly remember Titan cards... the worst-value cards you could buy. With the x90 moniker, it's not needed anyway.

I like the Ti branding.
I hated the SUPER branding, on the other hand.
 
Is that the explanation, because the metal Titanium exists in the periodic table of elements? :)
I recommend Pb, from plumbum; it's right there next to Titanium and it is way more appropriate, both for Nvidia and for hateful, toxic fans.
 
Maybe. My first GeForce "Ti" was a GeForce3 Ti 500 :laughing: 22 years ago.
 
You didn't mention which card models you did this testing with, which is a bit odd since there is no 4070 Ti FE but a 4070 FE exists... comparing an AIB model with an FE model wouldn't make sense, right?
 
Interesting article. Makes me a bit sad I went "Ti".

I've always bought midrange and changed frequently, as that way you can maximise the return on resale value. This worked well from the GTX 400 series through the GTX 900 series.
But then the 10 series happened and I went to an AMD 5700 XT.
I skipped the 20 series.

To get £300+ for my 3070 I had to sell it the other month, before the non-Ti was available.
If I'd sold before The Last of Us came out I would have got £400.

Nvidia have been a bit cheeky with planned obsolescence before, with the 3.5GB of VRAM on the 970 (AAA titles at max settings at 1080p often use 3.6-3.8GB).

TBF the 4070 Ti is good. I like 60fps with as much eye candy as I can get in immersive games. The extra 15% headroom at 4K does keep things in sync with my monitor.
 
You didn't mention which card models you did this testing with, which is a bit odd since there is no 4070 Ti FE but a 4070 FE exists... comparing an AIB model with an FE model wouldn't make sense, right?
There's barely any difference between the cards. The worst and cheapest one might perform 2% less than the most expensive one, as long as the chip is identical.
 

The 4070 Ti is a great card: 3090 Ti performance at pretty much half the watts, with DLSS 3 support.

You won't be running games at ultra settings at 4K for years anyway (without DLSS), and when you lower settings, VRAM usage drops very fast. Anyone who runs 4K knows how to tweak settings, or they will eventually, because even the most high-end GPU will struggle there in a few years.

Stuff like lowering shadows from Ultra to Medium can bump up the framerate like crazy and lower VRAM usage a lot.

Trying to future-proof with VRAM is not the way to go. Just look at 1080 Ti performance today: the GPU is very weak now, and the limiting factor, even though it has 11GB of VRAM, which is "enough" even by today's standards.

Also, take a look at the RTX Titan 24GB, released back in late 2018. 24GB of VRAM, yet it performs like a 3070 8GB in 2023. It has the VRAM to stay relevant, but the GPU is dated. This is why a lot of VRAM doesn't add much to longevity in the end. You are better off buying a cheaper card and upgrading more often. Stay on recent architectures; when you drop 3+ generations behind, it's game over in new AAA games regardless of how much VRAM you have. The GPU is simply too weak at that point.
 
I am getting 35-40 fps @ 1080p in The Witcher 3 on a GTX 670/Intel i7-860/DDR3 1600 system with most settings on high or ultra (no hairworks). The GTX 1080 Ti would output at least 80 fps.
 
Yeah, maybe, but I am running 1440p and demand 100+ fps on high settings.
My 1080 Ti was replaced years ago because it was too slow to deliver that in most new games.
 
I sincerely don't understand how, out of the blue, Steven has decided to push DLSS and RT so hard without even warning that the sole purpose of DLSS is to keep you locked into Nvidia hardware.

It's a disservice to games, gamers and the industry, and it simply pushes the industry toward a horrible monopoly.

And RT is simply not there yet, especially now with path tracing (PT).
When AMD, Nvidia, Intel and whoever else is left can sell a GPU that runs proper PT games at 120 FPS at 4K without any gimmicks or lock-in tech like DLSS (FSR is another gimmick, but at least it is not locked to one GPU vendor), AND it costs less than US$500, then we should be spending this much energy pushing it.

I don't even look at the DLSS benchmarks since I don't want to use fake frames in the first place. That has made the hardware of this generation look even less appealing.
 
Fact 1) The current gen is more optimized (better design, smaller process node) and offers better performance per watt. DLSS is also the better upscaling method, is more widely supported and is very useful; RT is the current and better-supported approach, will be for many years to come, and Nvidia chips perform much better at it than AMD's.

Fact 2) They make the current gen better on performance but design it to age fast (= crippled VRAM amount and bus width).

Fact 3) They increased prices substantially, thinking we are idiots; not in line with their costs, but much higher, because the last generation sold very well at inflated prices (crypto miners and AI).

Fact 4) Due to facts 2 and 3, I refuse to buy this generation, or any future one, as long as this kind of attitude remains.
 