Nvidia's RTX 4070 consumes just 186W while gaming, leaked slide reveals

Shawn Knight

What just happened? Nvidia's fourth graphics card based on the Ada Lovelace architecture is almost here. According to chatter around the water cooler, the card is currently being put through its paces by hardware reviewers ahead of a planned April 13 launch at $599. A recently leaked presentation slide shared by VideoCardz adds even more pieces to the puzzle.

According to the slide, the RTX 4070 non-Ti will ship with 12 GB of GDDR6X memory across a 192-bit interface and 36 MB of L2 cache. That's 4 GB more memory and nine times the cache of the RTX 3070 and RTX 3070 Ti, which make do with 8 GB of RAM and a paltry 4 MB of L2.

Cards will also be advertised with a 200W TGP (Total Graphics Power) but should, on average, consume closer to 186 watts in real-world gaming scenarios, which is lower than the average consumption of both the RTX 3070 and 3070 Ti. For comparison, the standard RTX 3070 has a TGP of 220W while the Ti variant is rated at 290W. The RTX 3080 and its Ti variant have TGPs of up to 350W.

The slide, assuming it is legitimate, likely came from Nvidia's reviewer's guide, which accompanies early samples sent out to tech sites in preparation for their coverage. It's not uncommon for slides like these to find their way online early, especially if the source can remain anonymous.

Related reading: GeForce RTX 4070 Ti vs. GeForce RTX 3080 - 50 Game Benchmark

Pricing is expected to play a major role in the popularity of the RTX 4070. It would be an interesting option at $599 considering the 4070 Ti starts $200 higher and the RTX 3080 carries a $699 MSRP.

The 4070 Ti was originally set to arrive as the RTX 4080 12 GB at $899 but Nvidia "unlaunched" it after backlash from the gaming community. The hardware maker rebranded the card and cut the price by $100, resulting in the RTX 4070 Ti.


 
I don't see how Nvidia can sell this card for $600 without seriously gutting the sales of the 4070 Ti.
 
I don't see how Nvidia can sell this card for $600 without seriously gutting the sales of the 4070 Ti.
There should be quite a noticeable difference in performance, based on the specs -- the 4070's FP32 throughput and texel fill rate are 27% lower (pixel fill rate is 24% lower) than the Ti's, and it has 25% less L2 cache. The 4070's cache will have higher latencies too, as its clocks are lower. And since the card is 25% cheaper, there's unlikely to be any cost-per-frame benefit.

It will cannibalize some sales, of course, but there should be a big enough performance gap to not affect them too much.
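For anyone who wants to check those percentages, here's a minimal sketch. The 4070 figures (46 SMs, so 5888 shaders, 184 TMUs, 64 ROPs at a ~2.48 GHz boost) are from the rumored specs, while the 4070 Ti numbers are its published specs:

```python
# Each rate scales as units x clock, so the deficit is
# 1 - (units_a * clk_a) / (units_b * clk_b).
def deficit(units_a, clk_a, units_b, clk_b):
    return 1 - (units_a * clk_a) / (units_b * clk_b)

# 4070 (rumored) vs. 4070 Ti (7680 shaders, 240 TMUs, 80 ROPs @ 2.61 GHz)
print(f"FP32 throughput: {deficit(5888, 2.475, 7680, 2.61):.0%} lower")  # ~27%
print(f"Texel fill rate: {deficit(184, 2.475, 240, 2.61):.0%} lower")    # ~27%
print(f"Pixel fill rate: {deficit(64, 2.475, 80, 2.61):.0%} lower")      # ~24%
```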
 
There should be quite a noticeable difference in performance, based on the specs -- the 4070's FP32 throughput and texel fill rate are 27% lower (pixel fill rate is 24% lower) than the Ti's, and it has 25% less L2 cache. The 4070's cache will have higher latencies too, as its clocks are lower. And since the card is 25% cheaper, there's unlikely to be any cost-per-frame benefit.

It will cannibalize some sales, of course, but there should be a big enough performance gap to not affect them too much.
I'm thinking a 20 FPS difference at 1440p ... the resolution this card is geared for. That would put this card on par with the 6900 XT at 1440p.

[Attached chart: average FPS at 2560x1440]
 
I'm thinking a 20 FPS difference at 1440p ... the resolution this card is geared for. That would put this card on par with the 6900 XT at 1440p.

[Attached chart: average FPS at 2560x1440]

With 25% fewer resources, as detailed by @neeyik, it's likely to be around 20% slower. So more like 35 FPS slower compared to the 170 FPS of the 4070 Ti, or about 135 FPS. Near or below the 3080 or 6800 XT.
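A quick sanity check on that estimate; the 170 FPS baseline comes from the chart above, and the ~20% deficit is the assumption just described:

```python
# Scale the 4070 Ti's 1440p average by an assumed ~20% performance deficit.
ti_avg_fps = 170                    # 4070 Ti average at 2560x1440 (chart above)
estimate = ti_avg_fps * (1 - 0.20)  # assumed deficit
print(f"Estimated 4070 average: {estimate:.0f} FPS")  # 136 FPS
```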
 
I have a 3060 and can enable at least some RTX in any game at 1080p, so you're wrong. And I despise the practices of Nvidia and corporate greed in general, but let's be fair in our assessments and judgements.
Per Tom's Hardware, the 3060 barely ekes out 60 fps in a 14-game average with RT turned on. Most benchmarks I've seen show that with RT set to medium you lose a lot of frames, maybe 20-50%. RT on high is even worse. If the GPU cannot deliver 60 fps consistently with RT on, then I'd say RT isn't particularly useful for that GPU.
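To put that frame cost in perspective, here's a rough sketch of the raster-only FPS you'd need to stay at 60 with RT enabled; the 20-50% loss range is from the benchmarks mentioned above, the rest is just arithmetic:

```python
# base_fps * (1 - rt_cost) >= 60  =>  base_fps >= 60 / (1 - rt_cost)
for rt_cost in (0.20, 0.35, 0.50):  # frame-loss range cited above
    needed = 60 / (1 - rt_cost)
    print(f"RT cost {rt_cost:.0%}: need {needed:.0f} FPS without RT")
# 20% -> 75 FPS, 35% -> 92 FPS, 50% -> 120 FPS
```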
 
Isn't $600 for the FE version only? I remember reading something about a day-one embargo on the AIB cards and a limited run of FEs, with the AIB cards arriving later at what will definitely be $650+. $700 is probably the more realistic price you can actually grab a GPU for, but who even knows these days.
 
Isn't $600 for the FE version only? I remember reading something about a day-one embargo on the AIB cards and a limited run of FEs, with the AIB cards arriving later at what will definitely be $650+. $700 is probably the more realistic price you can actually grab a GPU for, but who even knows these days.
The 4070 Ti sells at MSRP in the US, and the 4080 sells below MSRP.
 
Per Tom's Hardware, the 3060 barely ekes out 60 fps in a 14-game average with RT turned on. Most benchmarks I've seen show that with RT set to medium you lose a lot of frames, maybe 20-50%. RT on high is even worse. If the GPU cannot deliver 60 fps consistently with RT on, then I'd say RT isn't particularly useful for that GPU.
Then adjust your standards. Any semblance of a sense of entitlement is evidence of unrealistic expectations in the context of life's frivolities.
 
Then adjust your standards. Any semblance of a sense of entitlement is evidence of unrealistic expectations in the context of life's frivolities.
I did adjust my standards: I won't use RT if the GPU can't produce at least 60 FPS. No one is acting entitled; I'm only stating that I don't expect the 4070 to have good RT performance. For the AMD crowd, that's not a big deal. As always, you have to compare against other cards at the same or a similar price point.
 
I wonder what the performance will look like. I'm guessing RT will be mostly unusable.
Not necessarily the case. Given that one can run games with RT on with an RTX 3070, it is highly possible the RTX 4070 will be able to deliver playable frame rates with DLSS/FSR on. Just don't expect to game with RT on at high resolutions.
 
So far, the 40 series has lined up very close to where you would expect given its FP32 performance compared to the 30 series. There is no reason to think the 4070 will be any different, so with the rumored 5,888 cores at 2.5 GHz or thereabouts, we can expect it to land right around 29 TFLOPS, which is nearly identical to the 3080. The 4070 Ti comes in just under 40 TFLOPS, leaving the 4070 with roughly 25% fewer, and in testing the Ti is 18-22% faster than the 3080 in gaming. My guess is that the 4070 loses to the 3080 on average at 4K but has a slight edge at 1440p and 1080p. It'll be $100 cheaper, draw about 100 fewer watts, and give you access to DLSS 3.0. It is definitely better than any of Nvidia's offerings so far, but it won't be an amazing value; I think we can write off amazing value this generation.
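Those TFLOPS figures are easy to reproduce. In this sketch, the 4070 numbers assume the rumored 5888 cores at ~2.5 GHz, while the 3080 and 4070 Ti use their published shader counts and boost clocks:

```python
# FP32 throughput = shaders * 2 FLOPs/clock (one FMA) * boost clock (GHz)
def tflops(shaders, boost_ghz):
    return shaders * 2 * boost_ghz / 1000

print(f"RTX 4070   : {tflops(5888, 2.50):.1f} TFLOPS")  # ~29.4 (rumored specs)
print(f"RTX 3080   : {tflops(8704, 1.71):.1f} TFLOPS")  # ~29.8
print(f"RTX 4070 Ti: {tflops(7680, 2.61):.1f} TFLOPS")  # ~40.1
```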

The 7800 XT will be interesting. Honestly, I would buy a 7900 XT at $800 right now before I would get the 4070 Ti, for the simple fact of its 20 GB of VRAM versus 12 GB. If the 7800 XT comes with 16 GB of VRAM at $600 and similar performance, it would probably be a much better value.
 
I have a 3060 and can enable at least some RTX in any game at 1080p, so you're wrong. And I despise the practices of Nvidia and corporate greed in general, but let's be fair in our assessments and judgements.

I've been playing RT games at 1080p with an RTX A2000, roughly 3050-level performance, and even then I can get over 60 FPS in everything but CP2077. People get obsessed with Ultra everything, but with optimised settings and DLSS Quality (which since 2.5.1 looks fantastic at 1080p) it's really quite easy. Heck, Doom Eternal was 120-160 FPS, Metro EE was 65-90 FPS, Control 70-80, and so on. CP2077 was a solid 50 and felt super playable. Some people just draw arbitrary lines in their mind where on one side something must or must not be playable anymore; I say the proof is in the pudding.

 
Some folks here have problems with long-term memory when it comes to generational leaps in GPUs. Born this way, afflicted by COVID, or by Post Nvidia Price Trauma (PNPT).
This sounds like more customers for therapists.
 
I've been playing RT games at 1080p with an RTX A2000, roughly 3050-level performance, and even then I can get over 60 FPS in everything but CP2077. People get obsessed with Ultra everything, but with optimised settings and DLSS Quality (which since 2.5.1 looks fantastic at 1080p) it's really quite easy. Heck, Doom Eternal was 120-160 FPS, Metro EE was 65-90 FPS, Control 70-80, and so on. CP2077 was a solid 50 and felt super playable. Some people just draw arbitrary lines in their mind where on one side something must or must not be playable anymore; I say the proof is in the pudding.
Is this bait? Playing at 1080p with DLSS and also lowering settings just for ray tracing is comical.
 
Is this bait? Playing at 1080p with DLSS and also lowering settings just for ray tracing is comical.
Is this bait? Using DLSS is comical, how original.

Did I say I lowered settings just for ray tracing? No. I religiously run optimised settings unless my hardware is massively overpowered relative to the game and I can literally max everything and have headroom to spare.

I am always looking for that balance that suits my preference for the visuals-to-performance ratio. It turns out that in a lot of games, things like volumetrics, shadows, etc. can be turned down with little to no visual impact and hand you back disproportionately more performance than they're 'worth'. Then RT, in certain titles where it's done well, can utterly transform the look and atmosphere of the game, so in those I'd rather have it on and turn down other settings that make virtually no difference to the visuals, which I'd likely do irrespective of RT. This is not a hard and fast rule; every game is different and settings are picked at my discretion.
 
Is this bait? Using DLSS is comical, how original.

Did I say I lowered settings just for ray tracing? No. I religiously run optimised settings unless my hardware is massively overpowered relative to the game and I can literally max everything and have headroom to spare.

I am always looking for that balance that suits my preference for the visuals-to-performance ratio. It turns out that in a lot of games, things like volumetrics, shadows, etc. can be turned down with little to no visual impact and hand you back disproportionately more performance than they're 'worth'. Then RT, in certain titles where it's done well, can utterly transform the look and atmosphere of the game, so in those I'd rather have it on and turn down other settings that make virtually no difference to the visuals, which I'd likely do irrespective of RT. This is not a hard and fast rule; every game is different and settings are picked at my discretion.
At 1080p DLSS isn't good; it's too low a resolution for it to do its magic, which is rendering the game at an even lower res. And then you lower the settings on top of that, all to have ray tracing enabled. That is not "optimizing".

Of course, you can play however you want. Enjoy ray tracing at the medium preset rendered at 720p.
 