Next-gen GPUs look big and hungry, and that's bad news

I think as competition heats up, we'll start to see companies trying to stretch the performance of their chips. This issue is also prevalent in the mobile SoC space, where we're starting to see very high power draw, even if only for short durations. With no fab advantage, since competitors' products are produced on nodes with similar efficiency, the only thing chip companies can do is increase frequency at the expense of power consumption. Even on 10nm, Intel is rumoured to be pushing Alder Lake out with a PL2 of 228W. I can imagine that at some point a high-end gaming rig will need to be completely water cooled, or almost open air, to keep temps in check, considering the CPU can go as high as 228W (which may go even higher in real life) and GPUs will require some 450 to 500W.
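To put rough numbers on that (the per-component figures below are just assumptions for illustration, not measurements of any real build), a quick sketch of how such a system adds up against a PSU:

```python
# Rough PSU sizing sketch - all wattages are assumed ballpark figures.
components = {
    "CPU (rumoured Alder Lake PL2)": 228,
    "GPU (rumoured next-gen flagship)": 500,
    "Motherboard, RAM, fans, drives": 100,  # assumed catch-all figure
}

sustained = sum(components.values())      # ~830 W
headroom = 1.4                            # assumed margin for transient spikes

print(f"Estimated sustained draw: {sustained} W")
print(f"Suggested PSU rating:     {sustained * headroom:.0f} W")
```

Which is how you end up at 1000W+ PSU recommendations for these rumoured parts.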
 
OK. Let's put things into perspective.

A 350W 3090 vs a 365W GTX 590. Sure, the 590 had two GPUs on it, but you wouldn't play a single new game on it today. 24GB of VRAM vs 3GB. A 3090 does the same amount of work as four 1080 Tis using ONLY 350W, not 1kW+, and it doesn't choke on very high-res renders. I have no issue with one card drawing 350W if it can do the work of multiple cards from the previous generation.

A 500W 3090, though, is ludicrous. Anyone who overclocks a 3090 is insane.

As a curiosity: I have one 3D garden module which chokes the 3090 to death. I thought the 1080 Tis were weak, but it seems the amount of reflections, surfaces, leaves and all the other organic matter is so enormous that it's enough to bring even a 3090 to its knees when doing a live render in the viewport.
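Just to make the perf-per-watt point concrete - treating one 3090 as equal to four 1080 Tis for that kind of render work (the equivalence is taken from the workload above; the 250W per 1080 Ti is the reference TDP):

```python
# Perf-per-watt comparison based on the render workload described above.
power_3090 = 350      # W, as discussed above
power_1080ti = 250    # W, reference TDP per card
cards_replaced = 4    # assumed equivalence for this workload

old_power = power_1080ti * cards_replaced   # 1000 W for the 4-card setup
gain = old_power / power_3090

print(f"4x 1080 Ti: {old_power} W vs 1x 3090: {power_3090} W")
print(f"Roughly {gain:.1f}x the work per watt on this workload")
```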
 
At stock clocks, maybe.
Overclock the thing and kiss that goodbye; you're looking at 250 watts +.
I'm not sure what kind of top-model extreme GTX 1080 OC you have, or are referring to, but my mid-range GTX 1080 with a 2025MHz core OC and +800MHz on the memory has never gone past 225W. And that's peak; it doesn't stay there all the time, it just sometimes goes up to that.

Mine does not have RGB, so maybe that's why. /s
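If anyone wants to check their own peak draw instead of eyeballing a monitoring overlay, here's a minimal sketch that polls nvidia-smi once a second (assumes an Nvidia card with nvidia-smi on the PATH; the interval and duration are arbitrary):

```python
# Minimal GPU power logger: polls nvidia-smi and reports the peak board power seen.
import subprocess
import time

peak = 0.0
for _ in range(300):  # ~5 minutes; run your game or benchmark in the meantime
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    peak = max(peak, float(out.strip().splitlines()[0]))  # first GPU only
    time.sleep(1)

print(f"Peak board power over the run: {peak:.0f} W")
```

Bear in mind this only sees roughly one-second averages, so it will miss the millisecond transient spikes reviewers measure with scopes.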
 
I'm not sure what kind of top-model extreme GTX 1080 OC you have, or are referring to, but my mid-range GTX 1080 with a 2025MHz core OC and +800MHz on the memory has never gone past 225W. And that's peak; it doesn't stay there all the time, it just sometimes goes up to that.

Mine does not have RGB, so maybe that's why. /s

I never even overclocked my Palit GTX 1080 Gamerock. Didn't need to.

Since I have the RM650i PSU from Corsair, I can see power draw in real time. Total system power has never exceeded 290W; it's usually around 200W when gaming and 45W when idle.

4770K, Maximus VI Hero, assorted HDDs & SSDs.
 
As long as performance per watt improves, I don't see a problem. You can always underclock or buy a video card that draws less power. Also, a top configuration used to include an SLI or CF setup, which would draw the same amount of power.
 
I never even overclocked my Palit GTX 1080 Gamerock. Didn't need to.

Since I have the RM650i PSU from Corsair, I can see power draw in real time. Total system power has never exceeded 290W; it's usually around 200W when gaming and 45W when idle.

4770K, Maximus VI Hero, assorted HDDs & SSDs.
I get at least +10% performance out of that OC, and considering today's games, that's the difference between being under 60 fps (no OC) or above 60 fps (with OC) - in quite a few games, actually.

I'd rather use it overclocked and play on High settings than run it stock on Medium settings.

But it's also game dependent, and not all games are so demanding. For example, I play RE:V maxed out and it doesn't need the OC at all. Nor do I need FSR on. I'm very impressed with that game's optimization (with the latest patch only, after the DRM issue fix, LOL). It looks great, runs great, and it's a new game.
 
All the leakers are saying the next cards are going to be beasts - more than twice as fast.
If so, will game developers tap this power in the next year or two? RT will probably take a big jump - maybe just the ability to game at 4K 120Hz on ultra settings, with FSR/DLSS upping the rate to 160 in ultra mode.
Unless lots of them hit the market at a reasonable cost, how big a share of the market will those buyers be?
The prediction is that the next midrange cards from AMD/Nvidia will equal the 3080 - that's great - but what will they cost? There are still a lot of folks wanting $200 to $400 cards, and $500 gets you a console.
That said, maybe the top-card buyers are the ones who buy AAA games at full price.
 
Personally, I don't really care about the top-of-the-line models' power consumption, as they're not for me.

As long as lower-tier models with good power consumption (perf/watt) exist and prices are acceptable (i.e. no €300+ 1080p cards), I'm fine.
 
Look at the 3090 pulling over 500 watts - pure madness.

The 6900 XT can easily spike to 500 watts too. I have seen 650-watt spikes on some custom 6900 XT AIO boards... and 550-600 on some 3090s... This is on 3x8-pin cards, obviously; hell, I think there are even 4x8-pin custom cards...

Undervolting is a thing, though... The 3090 and 6900 XT can easily be kept below 400-watt spikes while keeping performance on par with a factory OC.

Top cards are pointless for most people anyway; both the 3090 and 6900 XT have TERRIBLE value for the performance. Why pay something like 100% more for 5-10% more performance and higher power draw?
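Proper undervolting needs a voltage/frequency curve editor (MSI Afterburner or similar), but if you just want to tame the spikes, a rough alternative on the Nvidia side is lowering the board power limit with nvidia-smi - a sketch, assuming admin rights and a target that sits within the board's allowed range:

```python
# Sketch: cap the GPU board power limit via nvidia-smi.
# Not the same as undervolting, but a quick way to rein in spikes.
import subprocess

TARGET_WATTS = 300  # assumed target; must be within the board's min/max limits

# Show the current and maximum allowed limits first.
subprocess.run(
    ["nvidia-smi", "--query-gpu=power.limit,power.max_limit", "--format=csv"],
    check=True,
)

# Apply the new limit (does not persist across reboots on its own).
subprocess.run(["nvidia-smi", "-pl", str(TARGET_WATTS)], check=True)
```

Radeon cards have their own tooling for the same thing, so this only covers the green half of the comparison.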
 
At this point, I'm more concerned about whether I'll ever be able to buy one. Scalping/mining is a thing now, so I don't expect any "normalcy" any time soon. Sure, the supply may improve, but prices could hover around 2x MSRP for a long time.
 
The 6900 XT can easily spike to 500 watts too. I have seen 650-watt spikes on some custom 6900 XT AIO boards... and 550-600 on some 3090s
Those are defective boards. I can't even see how a GPU with a 300-350 watt TDP could survive that kind of power draw. And the 3090s are only about 45 watts higher, max.


I'm not posting that as AMD vs Nvidia. This link is an excellent write-up on power draw.
 
"But for regular gamers -- those using smaller cases, or on laptops, or keeping hold of an ol' reliable 500W PSU -- they're becoming a major problem."

The people who aren't even thinking of buying a 4090 Ti for their small case and good ol' 500W power supply. I drive a car to work every day, not an 18-wheeler.
 
I think as competition heats up, we'll start to see companies trying to stretch the performance of their chips. This issue is also prevalent in the mobile SoC space, where we're starting to see very high power draw, even if only for short durations. With no fab advantage, since competitors' products are produced on nodes with similar efficiency, the only thing chip companies can do is increase frequency at the expense of power consumption. Even on 10nm, Intel is rumoured to be pushing Alder Lake out with a PL2 of 228W. I can imagine that at some point a high-end gaming rig will need to be completely water cooled, or almost open air, to keep temps in check, considering the CPU can go as high as 228W (which may go even higher in real life) and GPUs will require some 450 to 500W.

That's the key. When Pascal was released, AMD had nothing close. 180W on the 1080, then 250W on the 1080 Ti, were enough to dominate. To beat the 1080 Ti ($700) with Turing, Nvidia had to resort to a $1,300 card.

Competition is making Nvidia and AMD push their chips to the limit at the high end. That's why the vBIOS has power limits now, and not just temp limits - to keep the chips from fizzling out.
 
It’s actually absurd to have a desktop tower PC that holds components that could go in a laptop. At least these companies are making use of the extra power and space that desktops have to push more performance.

But higher power consumption in 2021, when much of our energy comes from carbon-producing sources, is bad news. Maybe Nvidia or AMD could ship their GPUs with an extra-ferocious hamster and a wheel to help offset the carbon cost of running them?

Or, you know, we could just tax carbon emissions. I really have no idea why we don't at the moment; I can't think of a better way to motivate companies and people to make the necessary big changes to their lifestyles.
 
"But for regular gamers -- those using smaller cases, or on laptops, or keeping hold of an ol' reliable 500W PSU -- they're becoming a major problem."

The people who aren't even thinking of buying a 4090 Ti for their small case and good ol' 500W power supply. I drive a car to work every day, not an 18-wheeler.
This. It seems like many pretend the average user rocks a 5950X plus a 3090.

This can be irritating, e.g. when low/mid-range GPUs are exclusively tested with high-end CPUs.
 
Titan-class cards (the 3090 and future equivalents) and flagship cards (the 3080 Ti and future equivalents) should never be constrained or limited with power caps. They are supposed to be the biggest and best possible. Laptops and consoles could never come even close to proper desktop PC performance. They were never meant to. They are not even in the same class of product, and comparing them is *****ic. The PC is for the best performance no matter the cost (power, cooling, size). Laptops are meant for portability, sacrificing performance, power and cooling for size. Consoles are meant to be just good enough for most people, sacrificing everything for cost.
 
Such power draw is only possible for the absolute top models. These days there are more and more customers eager to pay well over €1,000 (likely over €1,500) for a GPU, since gaming is reaching a larger and larger audience, which has resulted in the well-known, graphically extra-beautiful AAA games that are nice to look at but really only please casual gamers. The people paying for the overpriced GPUs right now support this claim.
I believe these extra-expensive GPUs are not only for the enthusiasts who tweak hardware or play a lot; they are also for those rich casual gamers who want to marvel at their games on a big OLED screen paired with the best GPU. The more movie-like the game, the better (nay!).

I think the figures in these rumours are exaggerated, but I totally believe both GPU brands will release even more expensive flagship models. This is just the nature of the business; there is no practical reason why products like these are needed, since they offer just a few percent more performance compared to the more value-oriented second-best model, which of course is too expensive as well, but a somewhat reasonable purchase for some.

We need to remember that AMD claims significantly better efficiency, which should result in lower-consumption models if we consider the usual €200-400 price range and think, from a business perspective, about how much performance will be handed out compared to the current models. The current high-end models already exceed the performance of the consoles quite drastically, which is an additional point of view.
Lower consumption is likely for AMD, but I'm not sure Nvidia can lower their consumption as significantly with their design, if the next architecture is anything like the current one, which is basically just more CUDA cores than the previous architecture. Even if they manage to lower power consumption, that alone is not a way to create a competitive and financially successful product line. A more innovative solution is required against RDNA 3, just like AMD innovated and created Infinity Cache.
 
At stock clocks, maybe.
Overclock the thing and kiss that goodbye; you're looking at 250 watts +.

Sure, if you're intentionally trying to waste power, but that's stupid. Have you actually tested a 1080?

Stock is 1733 MHz. My crap PNY 1080 runs at 160-175W out of the box, boosting to 1860 MHz once it's up to 74C since the cooler is trash. An actually good cooler would hold 1886 MHz at 67C or similar.

Underclocked to 1733 MHz, it uses the same 160-175W, since these cards overvolt (1.043V) on their own.

Undervolted (0.80V) at 1733 MHz, it uses 100-110W. Not 180W. I use this for lighter/older gaming.

Undervolted (0.95V) and overclocked to 1923 MHz (with the memory OC'd to 11,600MHz), it uses 130-145W. Notably below 250W. I use this for AAA gaming.

Undervolted (0.90V) and overclocked to 1822 MHz (with 11,000MHz memory), it uses 115-130W. I use this for AAA gaming when the room is too hot to keep temps down.

More brainpower usage = less GPU power usage
 
It also doesn't help that it will push your electricity bill up by a fair bit if you game regularly, and it will mean spending money on a larger power supply, where getting good 80 Plus efficiency matters since your load at idle (I assume) would be well below max capacity.
If your power bill is a concern, you shouldn't be buying $300 GPUs for vidya games, let alone $1,000+ models. This has been true for quite some time.
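For what it's worth, the bill impact is easy to ballpark - the extra wattage, gaming hours and price per kWh below are all assumptions, so plug in your own:

```python
# Back-of-the-envelope electricity cost of a higher-power GPU.
extra_watts = 200        # e.g. a 500 W card versus a 300 W card
hours_per_day = 3        # assumed gaming time
price_per_kwh = 0.15     # assumed local electricity price

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365   # ~219 kWh
extra_cost_per_year = extra_kwh_per_year * price_per_kwh        # ~33 in local currency

print(f"Extra energy: {extra_kwh_per_year:.0f} kWh/year")
print(f"Extra cost:   {extra_cost_per_year:.2f} per year")
```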
 
OK. Let's put things into perspective.

A 350W 3090 vs a 365W GTX 590. Sure, the 590 had two GPUs on it, but you wouldn't play a single new game on it today. 24GB of VRAM vs 3GB. A 3090 does the same amount of work as four 1080 Tis using ONLY 350W, not 1kW+, and it doesn't choke on very high-res renders. I have no issue with one card drawing 350W if it can do the work of multiple cards from the previous generation.

A 500W 3090, though, is ludicrous. Anyone who overclocks a 3090 is insane.

As a curiosity: I have one 3D garden module which chokes the 3090 to death. I thought the 1080 Tis were weak, but it seems the amount of reflections, surfaces, leaves and all the other organic matter is so enormous that it's enough to bring even a 3090 to its knees when doing a live render in the viewport.
The GTX 590 was a single-GPU, stepped-up, fuller-fat Fermi; I had the 590 Classified replacing my 480 SCs in SLI. My 690 Classified was a dual-GPU card, though, and was using around 375 watts of power.
Fermi was a power hog too, though.
Update: the 590 was a dual Fermi, my bad. Mixed it up with the 580.
 
My 2060 is power-limited to 180W.

The 3090 I got to play around with seemed capped at 430W, or at least that's the most I (and my brother, who currently uses it) saw it use.




 