AMD predicts GPUs will reach 600-700W consumption by 2025

Daniel Sims

Staff
The big picture: Recent rumors about upcoming graphics cards have stoked fears of rising energy consumption. Nvidia and AMD haven't fully taken the shroud off the GPUs they expect to launch later this year, but AMD did reveal some worrying numbers in its future roadmap this week.

This week, VentureBeat's interview with AMD senior vice president Sam Naffziger included visuals showing the company's predictions for near-future hardware. While Naffziger expresses confidence that AMD can hit its targets for energy efficiency gains over the next few years, AMD's numbers paint a concerning picture of rising energy demands.

As engineers everywhere bump into the limits of Moore's Law, consumers worry that GPU TDPs could grow to alarming levels. Rumors have suggested Nvidia's next-generation flagship, the RTX 4090, which might launch late this year, could need as much as 600W. The company's current top card — the 3090 Ti — draws 450W.

If we don't see 600W cards in the upcoming generation, AMD's chart suggests we'll see them in the following generation or soon after. It shows power consumption climbing sharply starting around 2018, with GPUs hitting 600W and then 700W before 2025. However, the visual doesn't indicate whether it only accounts for top-end hardware. Currently, the most popular cards on the market consume a fraction of that energy, but even future mainstream GPUs will need more power.

Naffziger said AMD can deliver the performance gains consumers expect from new graphics cards while tackling the power consumption problem. The company has been pushing Infinity Cache as a unique advantage, but Naffziger also highlighted chiplet design as an area where it leads Nvidia.


 
Nvidia leaks suggest 900W GPUs. Now AMD is saying we'll get 700W GPUs.
This is really bad news, especially when energy prices are increasing so much.


 
Is that much power really needed though? At least for gaming? I have a 980 Ti, and for some games it'll get sky-high frames up to 4K, but those titles are optimized insanely well.

Maybe instead of just releasing more power-hungry cards to brute-force their way through titles, Nvidia, AMD, and game devs could work together to actually make their games run well.

We know it's possible because the Switch exists, and companies make magic happen to get games running on that potato of a console.
 
So, AMD, go sell your hot potatoes to Google and I will subscribe to Stadia. Let the heat be with Google and not inside my case.
 
I'll be content with my 3080 for a long, long while, it seems. I don't want a GPU that will draw more power than my entire current PC does right now.

While it was a bit more upfront than I really wanted to spend on a GPU, I opted for a 3080 for my upgrade with the consideration that upscaling tech should allow a bit more longevity out of it.

We'll see how that goes, but it's not like lowering resolution and playing in windowed mode isn't an option either.

 
Nuclear Architecture...
needs a nuclear power plant
I don't know that I will decide on one of these power-hungry monsters in any of my future builds. However, I have a 1000 W and a 1200 W Seasonic power supply in two of my builds, which should suffice. I initially bought those because I was contemplating SLI or some such, but never went that way.
 
The interview itself can be seen on YouTube.
Naffziger doesn't differentiate between consumer and server/datacentre GPUs when he shows the power consumption slide, but the last data point on the graph (700W) almost certainly refers to their CDNA 2.0 products.
 
With that much power, you need what - a 1000 watt+ power supply? A traditional household bedroom is hooked to a 15 amp circuit which is safe for 1440 watts of continuous load. People need to start considering other appliances in the same room not to overload the circuit. No portable A/C, space heater, mini fridge, etc.
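For anyone wanting to sanity-check their own room, here's a minimal back-of-the-envelope sketch, assuming a 120V North American circuit and the usual 80% rule for continuous loads; the breaker rating and appliance wattages are placeholder examples, not measurements.

```python
# Rough headroom check for a single household circuit.
# Assumes a 120V circuit and the 80% continuous-load rule;
# breaker size and wattages below are placeholder examples.
BREAKER_AMPS = 15
VOLTS = 120
CONTINUOUS_FACTOR = 0.8  # continuous loads should stay under 80% of rating

safe_watts = BREAKER_AMPS * VOLTS * CONTINUOUS_FACTOR  # 15 * 120 * 0.8 = 1440W

loads = {
    "gaming PC (700W GPU + rest of system)": 1000,
    "monitor": 50,
    "space heater": 1500,  # drop this and the circuit has headroom again
}

total = sum(loads.values())
print(f"Safe continuous load: {safe_watts:.0f}W")
print(f"Planned load: {total}W -> {'OK' if total <= safe_watts else 'OVERLOADED'}")
```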
 
Is that much power really needed though? At least for gaming? I have a 980 Ti, and for some games it'll get sky-high frames up to 4K, but those titles are optimized insanely well.

Maybe instead of just releasing more power-hungry cards to brute-force their way through titles, Nvidia, AMD, and game devs could work together to actually make their games run well.

We know it's possible because the Switch exists, and companies make magic happen to get games running on that potato of a console.


Yeah, these top-end GPUs continue to be the outliers, while the rest of the party keeps on going at 225W (3070) or less (3060, 3050). These are about the same power levels as their Turing predecessors, and will always be necessary in a gaming-laptop world.
 
When you can't get performance increases from architecture improvements, just throw more juice at it, eh? This will make 1kW PSUs the bare minimum going forward.
 
With that much power, you need what - a 1000 watt+ power supply? A traditional household bedroom is hooked to a 15 amp circuit which is safe for 1440 watts of continuous load. People need to start considering other appliances in the same room not to overload the circuit. No portable A/C, space heater, mini fridge, etc.
I put in a separate circuit for a 1500W space heater in my main computer room years ago. If I should decide to get one of these power-hungry beasts, I will probably be able to heat the room with the GPU only. ;)
 
So many governments around the world are passing climate change policies and going after energy use, climate change is in the news every damn day, and yet a company thinks it is okay to just increase power draw. And in Europe they are talking about rolling blackouts in the fall because of the energy crisis.

How wonderful capitalism is when you only have one or two companies facing no incentive, yes no incentive, of bankruptcy or a major drop in market share.

Why should I get the 4090? I should keep the 3090, or buy a cheaper used 3090, just increase the clock, and call it a 4090.

The lack of engineering going on here is just beyond horrible: no new architecture, 3D stacking, or nanotechnology, just the sloppy, lazy, first-year college student approach of increasing the power draw and clock rate.

The way GPUs and CPUs are going nowadays, with only 10% to 15% gains every year, more power draw, and turning into space heaters, I'm done, yes done, buying or building computers.

A computer in 2030 will not be two or three times faster than a computer from 2020.

Just look at the GPUs and CPUs these days; it is beyond horrible what these companies are doing today.

I need a damn refrigerator and a power plant for the engineering these companies are bringing out, and they think it is okay.

If the 4000 series GPUs are not ready, then hold off on them for another two or three years until they are ready, when we get down to 1nm and can use the same power draw instead of drawing more, rather than bringing out a placebo GPU just because of greed.

And Intel is doing the same thing.
 
This applies to high-end stuff mostly. You're gonna have a wider range of GPUs to choose from, probably.

I don't like the idea of a toy consuming that much power.

Thank Nvidia for pushing ray tracing which adds nothing meaningful while requiring expensive hardware to run.
 
I suppose we need an informed article.

Can we get YoY GPU performance increases without additional power increases?

I'm interested in what the 7700 and 4060 will draw, because they are meant to be the bang-for-buck GPUs.

I still think there are a lot of energy savings to be made over the coming 5 to 10 years:
smaller nodes, smarter chips, smarter algorithms, etc.
The problem is gamers ultimately want hyper-realistic games in 4K, with long draw distances and full ray tracing, and the only way to do that at the moment is to go big.

Plus I don't want game developers to get lazy: just more power, just more memory to make up for a lack of creativity.
 
Even with a 50% improvement in performance per watt, which is impressive, doubling performance gen-on-gen still means about 33% more power usage at peak fps. If you start at 400W now, in two generations you are already at 400*(1.33)**2 = 707W for RDNA 5 (the arithmetic is worked through in the sketch after this post). Fundamentally, the entire architecture would need to be changed in ways probably not even imaginable currently for a GPU.

The good thing, though, is that we won't need the flagships going forward. How many fps do you need? Rasterisation isn't an issue anymore, and with DLSS, FSR, and XeSS it becomes even less of one. The 4060 and 7600 will be 4K cards, and the 4070/7700 will probably be viable at 4K 120fps even with RT. How many people need 4K 240fps?

I know I won't be looking beyond 4070 Ti or 7700 for my next card and more than likely it will be RDNA3. 7900XT and 4090 will be total overkill.
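To make the arithmetic in the post above explicit, here's a tiny sketch using the same hypothetical figures the poster assumes (2x performance per generation, a 1.5x performance-per-watt gain, and a 400W starting point); none of these are confirmed AMD numbers.

```python
# Power needed when performance doubles each generation but
# performance per watt only improves by 50%. All figures are the
# hypothetical ones from the post above, not confirmed specs.
PERF_GAIN_PER_GEN = 2.0    # 2x performance each generation
PERF_PER_WATT_GAIN = 1.5   # 50% better performance per watt
START_WATTS = 400          # assumed starting board power

power = START_WATTS
for gen in range(1, 3):
    # Power scales by (performance gain) / (efficiency gain), roughly 1.33x per generation.
    power *= PERF_GAIN_PER_GEN / PERF_PER_WATT_GAIN
    print(f"Generation +{gen}: ~{power:.0f}W")
# -> ~533W, then ~711W (the post's 707W comes from rounding 1.333 down to 1.33)
```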
 
Is that much power really needed though? At least for gaming? I have a 980 Ti, and for some games it'll get sky-high frames up to 4K, but those titles are optimized insanely well.

Maybe instead of just releasing more power-hungry cards to brute-force their way through titles, Nvidia, AMD, and game devs could work together to actually make their games run well.

We know it's possible because the Switch exists, and companies make magic happen to get games running on that potato of a console.

Just look at Doom and Doom Eternal: great visuals, insane speed, extreme optimization, and well thought out. At least they keep up the heritage of Carmack and Abrash. The other developers sometimes seem lazy.
 