Nvidia GeForce RTX 4090 could have a power limit of 800W, but a lower default TDP

mongeese

Why it matters: Nvidia's Ampere successor is nearly here, according to leaks, and it's going to be hot and hungry. GeForce RTX 4000 series cards, codenamed Ada Lovelace, could consume 50 to 100W more than their predecessors across the board, with flagship models breaking new ground.

Veteran leaker kopite7kimi, who nailed the specifications of Ampere weeks before anyone else, has updated his forecasts of Ada Lovelace's power limits. It's important to note that power limits are not the same as TDPs or TBPs; they're an upper bound that only comes into play when overclocking is involved. However, they're still a good guideline for power consumption, just an overestimate.

Coming to torture our power supplies is the flagship AD102 with a limit of 800W. Going down a level, the desktop AD103 has a limit of 450W, the AD104 of 400W, and the AD106 of 260W. In the past, the xx106 has been used by entry-level GPUs and the xx104 by mid-range GPUs, but Nvidia could change that up.

As you'd hope, the leaked laptop power budgets are tamer. Kopite reports that the mobile versions of the AD103 and AD104 have limits of 175W, and the AD106 has a limit of 140W. However, because they're laptop parts, manufacturers will have a lot of leeway to tweak them anyway.

Back to the number you're probably still reeling from: 800W. It's achievable with a pair of the new PCIe 5.0 power connectors, which can handle 600W each (if the card used 8-pin power connectors, it would need six!). But there aren't very many power supplies on the market with two of those ports and that much capacity.
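If you want to sanity-check that connector math, here's a minimal back-of-the-envelope sketch, assuming the rumoured 800W limit, a nominal 600W per 16-pin PCIe 5.0 connector, and 150W per classic 8-pin connector (illustrative figures, not confirmed specs):

```python
import math

# Rough connector-count check, using rumoured/nominal figures rather than official specs.
power_limit_w = 800      # rumoured AD102 power limit
pcie5_16pin_w = 600      # nominal capacity of one 16-pin PCIe 5.0 (12VHPWR) connector
classic_8pin_w = 150     # nominal capacity of one classic 8-pin PCIe connector

print(math.ceil(power_limit_w / pcie5_16pin_w))   # 2 x 16-pin connectors
print(math.ceil(power_limit_w / classic_8pin_w))  # 6 x classic 8-pin connectors
```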

Cooling will also be an interesting challenge. But, setting those practical concerns aside for a moment, is an 800W ceiling that unreasonable for a GPU with 18,432 CUDA cores (according to the information from February's hack)?

In its fully unlocked state, the AD102 would consume 0.0434W per CUDA core, only marginally more than the 0.0419W per core consumed by the RTX 3090 Ti. As Steve pointed out in our RTX 3090 Ti review, that GPU is a power-hungry monster, but on paper the AD102 is only about as bad.
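For anyone who wants to reproduce those per-core figures, here's a quick sketch using the leaked core counts and power figures (rumoured numbers, not official specifications):

```python
# Per-core power comparison based on leaked/rumoured figures, not official specs.
ad102_limit_w, ad102_cores = 800, 18_432          # rumoured AD102 power limit and CUDA core count
rtx3090ti_tbp_w, rtx3090ti_cores = 450, 10_752    # RTX 3090 Ti board power and CUDA core count

print(f"AD102:       {ad102_limit_w / ad102_cores:.4f} W per core")        # ~0.0434
print(f"RTX 3090 Ti: {rtx3090ti_tbp_w / rtx3090ti_cores:.4f} W per core")  # ~0.0419
```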

And remember, 800W is the (probable) hard limit, not the TDP. A past report said that Nvidia was testing a 600W TDP for the 4090 that would improve the GPU's efficiency and leave 200W as overclocking headroom.


 
:)

To a certain owner of a 3090:

Time to change your PSU and buy a 2,000-watt one, because, as Igor Wallossek is gonna reveal, the "transient spikes" of the 4090 will reach the vicinity of 1,600 watts or so.

 
Given all the focus on needing to be energy efficient and with the cost of energy increasing, it would be insane for them to release a GPU with this kind of peak TDP. I think it is a fake rumour but that's my uneducated guess!
 
I am gonna pass on the 4000 line altogether. I got my RTX 3060 Ti XC overclocked and running well, so I am set for the next 3 years. And no, I do not game at 4K... I have zero interest in it, and I don't stream or try to keep up with Quantum P (no disrespect intended), so I am fine. As a 51-year-old gamer, I was not happy with the whole 200%-over-MSRP S@#T that we have seen; the card companies showed their true colors, which is fine to an extent... but they kinda spit in the face of gamers. My 2 cents.
 
I'm gonna call BS, honestly. The 3090 Ti from the factory needs a 3-slot cooler to manage its 450-watt limit, and it hits 42 dB and 82°C while doing so. Those who push 500-550W power limits quickly find the card smacking into thermal throttling. Third-party coolers manage a bit better, but get LOUD. And heavy. 600W limits demand a water cooler.

Just going from the 3090 to the Ti showed how much more power was needed for a tiny performance bump. Going up another 300 watts is pure lunacy. The stock cooler for such a GPU would need to be 5 slots with 3 fans, or a standard full-coverage water block. I just don't see that happening.
 
I'm thinking that if 3xxx and 4xxx RTX cards end up inside laptops, you'll need a bigger power supply than you're already using today, so PCIe 5.0 will have to come in mini wattage and voltage, plus enormous cooling. Desktops are easy: you just put a Boeing airplane cooling fan over it to cool it down. Case solved.
 
On that news, getting a puny 3090 (even at a significant price premium) wasn't such a bad idea. I'll stick to my paltry 350W until things calm down a bit in a generation or two (assuming civilization survives that long).

800W? I don't know what they are smoking. Unless it comes with a portable cold-fusion reactor, I'm not interested. Oh, and Niagara Falls for cooling. You can't cool 800W peak on air unless you build a soundproof bunker in your house - or preferably two streets away from your house. Fermi looks like a very poor relative compared to this. Honestly, if this persists, CPUs for rendering won't be such a bad idea. Incoming EPYCs with ~350W and ridiculous core counts will do the same job in a timely fashion, and you don't have to worry about GPU(s) at all.

Now that I think of it: in servers, the most commonly used redundant PSUs are around 2kW, so a rendering rig would basically need one PSU for every two GPUs. Nuts!!! And most rack enclosures can hold only one PSU pair. :facepalm: Anyone remember rendering farms with seven 1080 Tis/Quadros on air in one 3U mount? We're moving backwards big time.
 
While nothing reached as high as 800W, AMD had monstrous space heaters in the past. The Radeon HD 7970 X2 and Radeon R9 295 X2 both had actual TDPs of 500W, and the Radeon R9 390 X2 was 580W. Yes, these were all dual-GPU cards, but in the case of the 295 X2, they were TSMC 28LP chips, fielding a total of 5,632 shaders across 12.4b transistors. While the AD102 is going to be on the N5 node, even if the rumours about its specs are only vaguely true, it's still going to be in the region of 40b transistors and operating at twice the clock of the 295 X2.

Realistically, the RTX 4090's hard limit is more likely to be 600W, given that PCIe 5.0 specifies a current limit of 9.2A per pin, so the 16-pin power connector will be rated to 662.4W. And with no manufacturer designing a product that runs, on average, at the rated limit, I'd say it's more likely to have a TDP of 480 to 500W, giving plenty of headroom for stability and overclocking.
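For reference, that 662.4W figure falls out of a simple multiplication, assuming the 16-pin connector carries power over six +12V pins at the 9.2A-per-pin limit (a rough sketch, not an official spec calculation):

```python
# Quick check of the 16-pin connector rating: per-pin current x number of +12V pins x 12V rail.
pin_current_a = 9.2      # specified current limit per pin
power_pins = 6           # assumed number of +12V pins in the 16-pin connector
rail_voltage_v = 12.0

print(f"{pin_current_a * power_pins * rail_voltage_v:.1f} W")  # 662.4 W
```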

That's still a lot of heat to shift out of the PC, of course, but if you want ray tracing at 4K with all the bells and whistles, you're going to have to accept that your hobby is going to destroy the homes of a few polar bears along the way.
 
Insane. Fermi 2 on the horizon!!!! With my 6900 XT I am at around 280W most of the time while gaming hard, and I can feel the heat wave on my legs coming from my case. I cannot imagine 400W or more. Nvidia's team is nuts. Buying Radeon again, it seems.
 
I am gonna pass on the 4000 line altogether. I got my RTX 3060 Ti XC overclocked and running well, so I am set for the next 3 years...
Just to complement this point: I have a 3060 now, but I'm the opposite - I actually game in 4K, and this card is still more than enough if you use DLSS 2.0 and FSR. That's mostly for the very heavy titles like Cyberpunk 2077, but I still get what I consider good enough performance (55 FPS average with adaptive sync enabled, thanks to DLSS performance mode).

So it's not like people who are actually into slower-paced games like RPGs and simulators really need much more than a 3060 or 3060 Ti either; it's just very pointless.

And to clarify, I'm not saying that I'm opposed to moving forward in GPU tech, but I'd take far more modest increments in performance (10% or so generational upgrades) while scaling back power requirements. As in, just give us a 3060 Ti level of performance that can run on a 75-watt TDP, and keep improving efficiency instead of just dumping insane amounts of power into every new generation, because this isn't sustainable at all. I simply do not trust requiring that much power and cooling, and I feel like I would need all new parts: a much more expensive and physically bigger PSU, a much bigger case, and, because of all the heat inside the same case, probably better CPU cooling as well.
 
...of course, but if you want ray tracing at 4K with all the bells and whistles, you're going to have to accept that your hobby is going to destroy the homes of a few polar bears along the way.

No worries on that front. My computer doesn't run on Big Oil but on Electricity, so it's environmentally friendly.
 
you're going to have to accept that your hobby is going to destroy the homes of a few polar bears along the way...
Dec 2021: “New evidence shows that polar bears in regions with profound summer ice loss are doing well....

“The US Geological Survey estimated the global population of polar bears at 24,500 in 2005,” according to the State of the Polar Bear Report by Dr. Crockford. “In 2015, the IUCN Polar Bear Specialist Group estimated the population at 26,000, but additional surveys published since then have brought the total to near 30,000 and may arguably be as high as 39,000.”

“This is only a slight-to-moderate increase, but it is far from the precipitous decline polar bear experts expected given a drop of almost 50% in sea-ice levels ...

there were no reports from anywhere around the Arctic that would suggest polar bears were suffering as a result: no starving bears, no drowning bears, and no marked increases in bear conflicts with humans. Indeed, contrary to expectations, several studies have shown that polar bears in many regions have been doing better with less summer ice...
 
My guess is that Nvidia separated the 4080 and 4090 dies because it does not anticipate selling a lot of AD102s. The AD102 will be good for enthusiasts without budget constraints and for benchmarking, but I don't see it ending up in many gaming computers. Nvidia will most likely focus on AD103 and AD104 production. Good to get that AD102 out there first for the benchmarks, though.
 