The RTX 4080 and RTX 4070 could be less power hungry than first thought

midian182

Rumor mill: One of the most repeated claims about Nvidia's upcoming RTX 4000 series (aka Ada Lovelace) is that it will be incredibly power-hungry, which isn't something consumers want at a time when energy prices and the cost of living keep rising. But if a new rumor proves accurate, the cards' power requirements might not be as bad as expected.

The news comes from regular leaker Kopite7kimi, who tweeted that the power consumption for the RTX 4070 and RTX 4080 is lower than previously reported. We'd heard that the RTX 4070 would have a total board power limit of 320W. Now, that figure is said to be 285W. The RTX 4080, meanwhile, was expected to consume a beastly 450W. That's said to have been revised down to a much more manageable 320W, the same TBP as the RTX 3080.

Rumored power consumption for Lovelace has been a concern among consumers as reports keep claiming the cards will run very hot and hungry. We've heard of a possible Titan card with a TDP of 800W and high-end consumer models hitting 600W, which is the maximum the new PCIe 5.0 12VHPWR power connector can deliver.

Assuming this rumor is true, most people will likely appreciate the reduction in power consumption, given that it will mean less electricity usage, heat, and noise, but the resulting drop in performance won't be welcome. Kopite7kimi recently said that revised RTX 4070 specs mean the card will be almost as powerful as the mighty RTX 3090 Ti. We'll have to wait and see how much the lower power requirements affect this claim.

A separate Lovelace rumor comes from @KittyYukko. They tweeted that Lovelace will include a model based on a PG139-SKU340 board that uses the AD102-250 GPU. It's said to feature 14,848 CUDA cores and 20GB of GDDR6X memory, which suggests it could be an RTX 4080 Ti.

There was more Lovelace news earlier this week when Micron confirmed it was producing 21 Gbps and 24 Gbps GDDR6X modules likely heading for the RTX 4000 series, the latter of which could be part of the new Titan or another workstation card.

h/t: VideoCardz

Masthead: Caspar Camille Rubin


 
I suspect Nvidia will make this another Turing generation as they still have 3000 series cards to flog.
 
Sounds like what Nvidia 'wanted' to go with has met the cold, hard reality of what it would cost in terms of power and thermals. They're now in the labs, fine-tuning a reduction in performance across the range until they hit the power/thermal sweet spot that would still allow for a minor speed-bump refresh of the range in late 2023.
 
I assumed this would be the case. The rumored TBP on these cards was just too high to be taken seriously as a consumer. I understand that if you are an enthusiast who wants the best of the best, you are willing to buy that 1kW PSU. However, if you're primarily a gamer who just wants a high-end GPU, having to upgrade your 750-watt PSU to 1,000 watts is a bridge too far for many consumers. It also increases the cost of the upgrade, which simply means fewer sales for Nvidia. With the GPU market slumping at the moment, Nvidia cannot risk additional roadblocks to consumers upgrading their GPUs.

What will it mean for performance? Probably not all that much. High-end AIB cards maxing out the best silicon delivered less than a 15% improvement for the RTX 30 series, often under 10% over stock specs. In fact, undervolting RTX 30 cards often paid off in heat and power-consumption dividends for very small 1-3% performance reductions from stock. Performance does not scale linearly with power consumption; it falls off quite a bit. I'm sure the 320-watt versus 450-watt performance difference just wasn't worth it to Nvidia, but AIBs will be able to utilize that headroom for premium cards.
 
Can't say that I would be surprised by this; the whole "IT CONSUMES 900 WATTS" thing sounded like a hot load of garbage and/or clickbait from the get-go. This overestimated power draw just trickled down the whole 4000 series lineup. Glad to see things are finally coming back down to earth and to what can actually be expected.
 
I've been waiting to buy since the mining craze started, so I see no point in getting the 3000 series now, which is still overpriced imo. I know the era of cheap graphics cards is over, but I think that, after their major quarterly losses, Nvidia can't afford to price the 4000s much higher, and the boost in performance will probably be close to double. Also, the new AMD cards could be a nice surprise, so I'd advise everyone to just wait. I believe the crazy TDP rumors and the supposed delays into next year were fueled by Nvidia themselves in a desperate attempt to sell the huge stock of 3000-series cards they're sitting on right now.
 
Can't say that I would be surprised by this; the whole "IT CONSUMES 900 WATTS" thing sounded like a hot load of garbage and/or clickbait from the get-go. This overestimated power draw just trickled down the whole 4000 series lineup. Glad to see things are finally coming back down to earth and to what can actually be expected.
My thoughts exactly; the insane speculation by the internet got smothered by reality. The 4080 likely was never a 500-600W GPU to begin with. Anyone in the GPU space likely knew this already.
 
This just makes the previous reactions that much more enjoyable to look back on...

Some of my favourites were:
ATX 3.0 requirements with no announced ATX 3.0 PSUs
12-pin connectors becoming standard based on the FE card using a PCIe to 12-pin adapter cable
200W+ of headroom just to account for power spikes from the card
Saving the planet, I mean power, being more important to gamers

Rumours just prove people will believe anything.
 
Nvidia might have gotten away with pushing higher TDPs if the GPU demand of the last couple of years had been sustained, but not now, no way. It would effectively be forcing many consumers to fork out for a new PSU along with their new RTX card. That would make it harder to compete with their competitors, not to mention with stocks of last-gen cards.

I think it's fine on their bonkers $2,000 products, though; I mean, if you're mad enough to spend that much, then you're not going to be too bothered about power and thermals. And it serves the rest of us as a proving ground for new power-management schemes.

As always, it will be interesting to see how these land performance-wise. Has this been intended all along, or has there been some late change of strategy?
 
Nvidia might have gotten away with pushing higher TDPs if the GPU demand of the last couple of years had been sustained, but not now, no way. It would effectively be forcing many consumers to fork out for a new PSU along with their new RTX card. That would make it harder to compete with their competitors, not to mention with stocks of last-gen cards.

I think it's fine on their bonkers $2,000 products, though; I mean, if you're mad enough to spend that much, then you're not going to be too bothered about power and thermals. And it serves the rest of us as a proving ground for new power-management schemes.

As always, it will be interesting to see how these land performance-wise. Has this been intended all along, or has there been some late change of strategy?
It also makes it hard to develop laptop versions, since you don't have as much leeway to increase the power supply, not to mention the heat dissipation.
 
It also makes it hard to develop laptop versions, since you don't have as much leeway to increase the power supply, not to mention the heat dissipation.

Or it just makes the laptop versions lag further and further behind the performance of their similarly named (and therefore deceptively marketed) desktop parts.
 
As a resident of San Diego, where we pay more per kWh than anywhere else in the country, I'd really like my next GPU to be efficient. Also, I have no AC, just open windows, so less heat would be nice too.
 
"The RTX 4080 and RTX 4070 could be less power hungry than first thought"
I really don't think that you could set the bar any lower than you have. :laughing:
 
Sounds like what Nvidia 'wanted' to go with has met the cold, hard reality of what it would cost in terms of power and thermals. They're now in the labs, fine-tuning a reduction in performance across the range until they hit the power/thermal sweet spot that would still allow for a minor speed-bump refresh of the range in late 2023.
There's also the matter of having to match what ATi puts out. It's pretty clear that RDNA3 is going to walk all over Lovelace when it comes to efficiency.
As a resident of San Diego, where we pay more per kWh than anywhere else in the country, I'd really like my next GPU to be efficient. Also, I have no AC, just open windows, so less heat would be nice too.
Holy hell, you pay 34¢/kWh in USD! Here in Ontario, we pay an average of 10¢/kWh in CAD! That's 7.8¢/kWh in USD! I don't know how you survive that heat without A/C but I understand why you don't have it. I guess you spend a lot of time in the ocean, eh? I know I would! :laughing:
 
I'd prefer it if Techspot did articles based on facts rather than endless rumour-mill articles that each seem to contradict the others and are based on what some guy called Kermit thinks might happen.
 
My thoughts exactly; the insane speculation by the internet got smothered by reality. The 4080 likely was never a 500-600W GPU to begin with. Anyone in the GPU space likely knew this already.

I think it was market prep for the inevitable future of 500-600-watt, $2,000 mid-range graphics cards.
The article says: "The RTX 4080, meanwhile, was expected to consume a beastly 450W. That's said to have been revised down to a much more manageable 320W." Two generations of GPUs ago, those 320 watts would have been the "beastly" consumption.
 
Any way you look at it, nVidia tuning down the performance to get thermals and wattage to an acceptable level is a good thing. More importantly, it allows them to bin parts that would have otherwise failed, which may help with pricing.
 
My thoughts exactly; the insane speculation by the internet got smothered by reality. The 4080 likely was never a 500-600W GPU to begin with. Anyone in the GPU space likely knew this already.

I don't ever recall seeing 500-600W for the 4080 as a rumour. The insane figures always pertained to the 4090 and more so 4090 Ti.

I honestly won't be believing anything at all until these are released and benchmarked. 320W is certainly acceptable for a 4080-class card, if true, and I doubt the 7800 XT will draw any less either.
 
I've been waiting to buy since the mining craze started, so I see no point in getting the 3000 series now, which is still overpriced imo. I know the era of cheap graphics cards is over, but I think that, after their major quarterly losses, Nvidia can't afford to price the 4000s much higher, and the boost in performance will probably be close to double. Also, the new AMD cards could be a nice surprise, so I'd advise everyone to just wait. I believe the crazy TDP rumors and the supposed delays into next year were fueled by Nvidia themselves in a desperate attempt to sell the huge stock of 3000-series cards they're sitting on right now.
I honestly think you should consider a 3080. They are priced well at this point, in my opinion, for the performance. It's an absolutely incredible card that will last for years.
 
I honestly think you should consider a 3080. They are priced well at this point, in my opinion, for the performance. It's an absolutely incredible card that will last for years.
I would have considered that, but unfortunately, in my country, Romania, the cheapest price for a 3080 is still around 100 euros over MSRP, like 890-900 euros. Prices in the US are great, the cheapest is around 750 bucks, so I would have definitely bought that. I skipped the 2000 series, so I've basically been waiting five years to upgrade, and five or six more months won't kill me. Unless US prices come to Europe, why buy old tech when something better price/performance-wise is around the corner?
 
I honestly think you should consider a 3080. They are priced well at this point, in my opinion, for the performance. It's an absolutely incredible card that will last for years.
I don't think the 4000 series will be as expensive (overall) as the 3000 series, yet it will perform better. Why get a 3080 that will be outperformed by a 4070 that will most likely be cheaper?
 
Prices in the US are great,
I beg to differ :confused: The prices are still too high. A 3080 should be in the $599 range at this point. Two years ago I got a 2080 Super for $499. I would guess the 3080 shouldn't be much higher, but here in the USA it is still in the upper $700s to $800. Ridiculously high.
 
I beg to differ :confused: The prices are still too high. A 3080 should be in the $599 range at this point. Two years ago I got a 2080 Super for $499. I would guess the 3080 shouldn't be much higher, but here in the USA it is still in the upper $700s to $800. Ridiculously high.
I meant great compared to Europe. And I agree, they are still kinda high; 600-650 bucks/euros would have been OK for a 3080 imo. As I said, I'd have probably paid even 750, but 900 is just barbaric at this time.
 