Next-gen GPUs look big and hungry, and that's bad news

Soupreme

Posts: 33   +2
Staff
A hot potato: Gamers who thought the current crop of RDNA2 and RTX 30-series GPUs was already demanding might be in for a surprise, if the most recent batch of leaks is to be believed.

Both kopite7kimi and Greymon55, established leakers in their own right, seem to agree that some of Nvidia's upcoming Lovelace GPUs will land north of 400W. These figures would presumably apply to the top models of the RTX 4000 series, built on AD102 silicon, the successors to the RTX 3080 / Ti and RTX 3090.

Besides pushing the GPU core as hard as possible, a great deal of this power budget will come from the continued use of the hot and power-hungry GDDR6X memory on Nvidia's top models.

Separately, Bondrewd of the Beyond3D forums left hints on Navi 31, the top SKU from AMD's RDNA3 line-up, suggesting that the multi-chiplet GPU would sit below 500W in total board power draw and below 350 mm² per graphics core die.

With an estimated size of 600-650 mm² for the two GCDs alone, and perhaps 800 mm² for the whole GPU (when including the Infinity Cache-bearing MCDs), 3Dcenter believes that Navi 31 will end up in the region of 450-480W for total board power.

Although the GPUs are nowhere near release, these numbers are already concerning. In addition to being a bigger strain on PSUs and cooling in smaller cases, the growing power draw of GPUs is increasingly leaving gaming laptops behind.

The size and weight constraints of gaming laptops limit their cooling capacity, so mobile TDPs have remained stubbornly fixed: laptop GPUs are rarely given more than 150W to work with, and even that is reserved for bulkier models with substantial cooling.

That was good enough when the 180W GTX 1080 was the bar to beat, but the RTX 2080 called for 215W and the RTX 3080 asked for 320W, taking gaming laptops from near-parity to only half the power of a desktop card in just a few generations.
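That trend is easy to check with a quick back-of-envelope calculation, using the desktop TDPs just quoted and the ~150W mobile ceiling mentioned above:

```python
# Back-of-envelope version of the desktop-vs-laptop power gap described
# above. TDP figures are the ones quoted in the article; 150W is the
# article's stated ceiling for mobile GPUs.
LAPTOP_CAP_W = 150

desktop_tdp_w = {
    "GTX 1080": 180,
    "RTX 2080": 215,
    "RTX 3080": 320,
}

for gpu, tdp_w in desktop_tdp_w.items():
    share = LAPTOP_CAP_W / tdp_w
    print(f"{gpu}: desktop {tdp_w}W, laptop budget is {share:.0%} of that")
```

The shares come out to roughly 83%, 70%, and 47%, matching the slide from near-parity to about half.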

And that's before counting how growing die sizes have locked the top-end GPUs out of laptops entirely, with the laptop RTX 3080 and RX 6800M actually using lower-tier silicon than their desktop counterparts. Or how partner cards routinely draw even more power than the reference numbers, like EVGA's FTW3 pushing the RTX 3080 to almost 400W.

To be sure, 400W+ GPUs are unlikely to trouble top-end battlestations with immaculately designed cooling and kilowatt power supplies. But for regular gamers -- those using smaller cases, or on laptops, or holding on to an ol' reliable 500W PSU -- they're becoming a major problem.


 

VariableSpike

Posts: 62   +79
It also doesn't help that it will push your electricity bill up by a fair bit if you game regularly, and it means spending money on a larger power supply, where good 80 Plus efficiency matters since your load at idle (I assume) would be well below max capacity.
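The electricity-bill point can be put in rough numbers. Every figure below is an assumption for illustration: 2 hours of gaming per day, a $0.15/kWh rate, and an extra ~150W of board power on a next-gen card.

```python
# Rough estimate of the extra yearly electricity cost of a hungrier GPU.
# All inputs are illustrative assumptions, not measured figures.
hours_per_day = 2
rate_usd_per_kwh = 0.15
extra_watts = 150  # e.g. stepping from a ~250W card to a ~400W card

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost_usd = extra_kwh_per_year * rate_usd_per_kwh
print(f"~{extra_kwh_per_year:.0f} kWh/year extra, about ${extra_cost_usd:.2f}/year")
```

Under those assumptions it's on the order of 110 kWh and $16 a year; heavier gaming schedules or higher local rates scale the figure up proportionally.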
 

scavengerspc

Posts: 1,574   +1,587
TechSpot Elite
I don't see anything close to this in laptops, even with efficiency getting better every generation. My gaming laptop uses two 240-watt power bricks, and I can't see how adding any more would be possible. At least for now.
 

Dimitriid

Posts: 924   +1,700
My guess is that whichever company this rumor is mainly about doesn't have confidence in getting sufficient allocation on future nodes, so it makes sense: next generation they know they probably won't have miners to pick up the slack (though it remains to be seen).

So they know gamers *traditionally* want more performance without caring much about power requirements. So how do you sell them on new products without actually improving the fabrication process at the foundries? Well, just slap a couple pounds of copper on them, literally: up the power limits to about as much as they can handle.

 

VitalyT

Posts: 5,827   +5,869
Trying to figure out Nvidia's way of thinking...

Banned in a few places over power consumption, hah-hah, now take that!

Looks like this guy was right on the money:


[attached image: darth2.png]
 

yRaz

Posts: 3,788   +3,907
It also doesn't help that it will push your electricity bill up by a fair bit if you game regularly, and it means spending money on a larger power supply, where good 80 Plus efficiency matters since your load at idle (I assume) would be well below max capacity.
I always buy the best power supply I can afford. I like to keep 20% above what I need and always go for at least 80+ Gold. My first real gaming computer was fried the moment I hit the power button because I bought a cheap PSU. Luckily I got an RMA on all the parts, but never again.
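The 20% headroom rule above can be sketched in a few lines. The component wattages are illustrative assumptions, not measured figures:

```python
# Sketch of the "20% headroom" PSU sizing rule of thumb: size the PSU so
# the estimated peak system load sits comfortably below its rating.
def recommended_psu_watts(load_w: float, headroom: float = 0.20) -> float:
    """Return a PSU capacity with the given fractional headroom over load."""
    return load_w * (1 + headroom)

# e.g. a hypothetical ~450W next-gen GPU + ~150W CPU + ~50W for the rest
peak_load_w = 450 + 150 + 50
print(f"Estimated peak load {peak_load_w}W -> shop for at least "
      f"{recommended_psu_watts(peak_load_w):.0f}W")
```

For a hypothetical 450W card, that pushes the recommendation to roughly an 800W unit, which is exactly why these leaked figures worry people with existing 500-650W supplies.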

Same mindset as buying tires for my car. Never buy cheap tires; everything your car does, it does through the tires. It doesn't matter how much horsepower you have or how good your brakes are if your tires can't handle it.
 

Austinturner

Posts: 268   +316
“Although the GPUs are nowhere near release, these numbers are already concerning. In addition to being a bigger strain on PSUs and cooling in smaller cases, the growing power draw of GPUs is increasingly leaving gaming laptops behind.”

I mean, if you are properly optimizing your models for top performance in both a portable form factor and a full desktop, the laptop is never going to keep up. Efficiency is great, but that efficiency should be used in desktops to push performance even further.
 

hahahanoobs

Posts: 3,596   +1,710
I'm constantly seeing RTX 3080s and custom water cooling in SFF cases, and more 650W PSUs than the 750-850W units you used to see.

I'll wait for the final product before I start worrying about this, as I usually do, without regret.
 

Mr Majestyk

Posts: 813   +727
The biggest Navi 31 is in no way comparable to the 6900 XT; at 160 CUs it's basically a dual GPU, so of course it will use a cr@p load of power and will cost $2k. Power efficiency is still going to be 50% higher, so like-for-like, a 7800 XT at the same clocks as the 6800 XT, for example, would use less power, or deliver ~50% more fps at the same power. However, it has already been leaked that Navi 31 will be ~80-100% faster, so yes, power will rise, but the 7800 XT won't use 400W.
 

Plutoisaplanet

Posts: 519   +771
This isn't even out and it's already banned in the great state of California 🤣🤣🤣
And ironically, California paid me $5,500 to buy my electric car, which uses up to 270kW when flooring it, over 500x as much power as this prospective GPU could use. Nothing wrong with having power, but California likes to decide what the right and wrong kind of power is, even if it's connected to their "clean" power grid lol.
 

Cycloid Torus

Posts: 4,798   +1,601
RDNA2 and RTX 30-series at 500W?!?? Nuts. All I want is 1/3 of that performance for about 1/5 of the price, using less than 130W.