The Rise of Power: Are CPUs and GPUs Becoming Too Energy Hungry?

Meh, interesting article, but my (or your) "gaming PC" needs to quit being the target of energy hit pieces like this one.

I work for a big tech company, and the amount of power these companies hoover up makes a home PC or game console less than a drop in the bucket; honestly, they don't even register compared to what my job absorbs.

Write an article that dives into what's needed to actually manufacture these processors; that would floor some people.
 
What a great, unbiased, and in-depth article. Thanks for this.

Given all of the comments on this article already, it is clear that this is a very divisive topic. It is very complicated, with numerous parameters and trade-offs, which I think the author did a great job covering.

For me, there are a few key items in this article that seem extra valuable. In retrospect, my comment comes out super preachy; sorry about that in advance. I will go yell at some clouds now.

1) Performance per watt is really what matters (performance = FPS, score, etc.). Just looking at power usage does not really tell you the benefit that you get from the latest and greatest CPU or GPU. Even though power levels are slightly, or maybe even modestly, increasing generation over generation (5-20%), you are typically getting a performance improvement that is significantly higher (30-50%). That is pretty impressive if you ask me (see the perf-per-watt sketch after this list). Physics is the limiting factor here; it's not that these companies want to push power through the roof. The "30% more power for 5% more performance" is a marketing thing for the top-of-the-line components. It is a very competitive market, and bragging rights matter when it comes to sales. You as a consumer need to make the personal choice on how much performance you want and what price you are willing to pay ($$$, power, heat).

2) All of the fuss about how much power these systems use is really not that big of a deal. Even for high-end systems, the yearly cost is not that much compared to the cost of the computer itself (see the running-cost sketch after this list). Especially if you are willing to drop $1,500-$2,000 on just a GPU, then the cost to run that computer at a worst case of $0.50 a day is insignificant. That is like buying a Ferrari and complaining that it only gets 10 mpg and needs premium fuel. You can't have your cake and eat it too. For reference, this time of year my office is pretty cold in the morning. I use a run-of-the-mill 1500 W portable heater to warm it up. It usually takes about an hour and I really don't think anything of it. Did you know that adjusting your home thermostat by one degree can change your heating/cooling bill by 10%? How many times have you wanted it just a bit cooler or warmer in your house, nudged it 1 or 2 degrees, and never once batted an eye at the cost difference?

3) You, the consumer, get to choose what is important to you. You don't have to buy these more powerful and power-hungry components.
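
To make point 1 concrete, here is a tiny back-of-the-envelope sketch; the generation-over-generation numbers are illustrative assumptions, not figures from the article:

```python
# Illustrative perf-per-watt comparison between two hypothetical GPU generations.
# All numbers are assumptions chosen to match the rough +15% power / +40% perf trend.

old_fps, old_watts = 100, 300   # previous-gen card: 100 fps at 300 W
new_fps, new_watts = 140, 345   # new card: +40% performance for +15% power

old_eff = old_fps / old_watts
new_eff = new_fps / new_watts

gain = (new_eff / old_eff - 1) * 100
print(f"old: {old_eff:.3f} fps/W, new: {new_eff:.3f} fps/W")
print(f"perf/W improvement: {gain:.0f}%")   # ~22% better despite the higher power draw
```

So even though the sticker wattage went up, each frame costs noticeably less energy than it did a generation ago.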
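
And to sanity-check the "worst case of $0.50 a day" from point 2, a quick sketch; the wattage, gaming hours, and electricity rate are assumptions you would swap for your own:

```python
# Rough annual running-cost estimate for a gaming PC.
# Wattage, hours per day, and electricity rate are all assumed; plug in your own.

avg_draw_watts = 500        # whole system under gaming load
hours_per_day = 3           # time spent gaming
rate_per_kwh = 0.30         # $/kWh, varies a lot by region

kwh_per_day = avg_draw_watts / 1000 * hours_per_day
cost_per_day = kwh_per_day * rate_per_kwh
print(f"{kwh_per_day:.1f} kWh/day, ${cost_per_day:.2f}/day, ${cost_per_day * 365:.0f}/year")
# 1.5 kWh/day at $0.30/kWh is about $0.45/day, or roughly $165 a year
```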
 
Current devices power hungry? Nah... just look at a GeForce 265 compared to a 4090.
My old Palit 8500 GT looks a lot like that (except that the PCB is green and it's not AGP). :laughing:

I do have a card that I keep for diagnostic purposes, an XFX One HD 5450. It's rated at only 19 W and is passively cooled!

🤣
 
A few months ago I realized that when I game, I can limit the game to 60 fps and get lower CPU and GPU usage, which means less heat and less energy. My wattmeter confirms it, and now I'm putting all my games on a 60 fps lock. I didn't realize a single in-game option could save that much energy.
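
If you want to put a number on what a frame cap like that saves, here is a quick sketch using before/after wattmeter readings; the readings, hours, and rate below are assumed placeholders, so substitute your own measurements:

```python
# Estimate the energy and money saved by capping a game at 60 fps.
# All readings and habits below are assumptions; use your own wattmeter numbers.

uncapped_watts = 420      # wall draw with an uncapped framerate
capped_watts = 290        # wall draw with the 60 fps limit
hours_per_day = 2         # daily gaming time
rate_per_kwh = 0.25       # electricity price in $/kWh

saved_kwh_per_year = (uncapped_watts - capped_watts) / 1000 * hours_per_day * 365
print(f"~{saved_kwh_per_year:.0f} kWh/year saved, "
      f"about ${saved_kwh_per_year * rate_per_kwh:.0f}/year at ${rate_per_kwh}/kWh")
# Roughly 95 kWh/year, or about $24, plus that much less heat dumped into the room.
```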
 
I *just* ran into this issue. Linux has seen a remarkable improvement in the Intel graphics drivers over the last year or two (I have a laptop with an 11th/12th-gen Tiger Lake GPU and it's great even for gaming). But drivers can't work miracles; I want to game on my desktop, and its Ivy Bridge GPU is simply too old. I have no spare power connectors for a graphics card, and I *thought* that would be no problem.

Problem. I picked up a GTX 1650 4 GB GDDR5 (used, for $100, thanks gamers!), but that's literally the only vaguely gaming-capable card on the market within a 75 W power budget, AND it's being phased out in favor of a GDDR6 model (which requires a 6-pin connector). I kind of doubt the Ivy Bridge can keep the GTX 1650 fully fed anyway (in real games*; I assume Furmark would max it out), so I was going to look at older cards. Nope! The somewhat older AMD cards (the ones reviewers recommended over the GTX 1650) are well over 75 W. Below that it goes straight to the $50 cards, which are only good for adding additional heads to your system; they're still using GeForce2s and such!

I assume this will come to a head eventually: not for environmental reasons or whatever, but simply because, OK, you've got cards that can do photorealistic rendering at 1920x1080. So then people needed 4K rendering. Then 120 Hz and 240 Hz monitors came out, so you had to have 240 fps instead of 60 fps. Now you have cards that do all that. I've read that gamers are not going for 8K, but ray tracing is now extending this out a bit further. Basically, I think we're very close to the point where both gamers and game programmers run out of new features to consume GPU resources, at which point the technology will inevitably be applied to reducing the power consumption of a similar-sized GPU rather than adding even more transistors to it.

*So far I've only gotten it up to 30% load and 20 W power consumption (versus 8 W idle), according to nvidia-smi; that's not even enough to get the fans above the "26%" base speed. But I only put it in two days ago, so I've only tried a couple of games so far. Maybe I'll find some CUDA (compute) workloads to give it some exercise.
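
For anyone who wants to log those numbers over time instead of eyeballing nvidia-smi, here is a minimal polling sketch; it assumes a single GPU and that nvidia-smi is on your PATH (it ships with the NVIDIA driver):

```python
# Minimal GPU power/utilization/fan logger that shells out to nvidia-smi.
# Assumes a single NVIDIA GPU and that nvidia-smi is on the PATH.

import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=power.draw,utilization.gpu,fan.speed",
         "--format=csv,noheader,nounits"]

for _ in range(10):                                   # ten samples, one per second
    out = subprocess.check_output(QUERY, text=True).strip()
    power, util, fan = [field.strip() for field in out.splitlines()[0].split(",")]
    print(f"power: {power} W   util: {util} %   fan: {fan} %")
    time.sleep(1)
```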
 
Personally, I chose to stick with 1440p. I think it is high enough, and the gain from 4K is not justified by the hardware and power requirements, IMHO. A high-quality 1440p monitor and a 3070 Ti do the job for me, and while gaming with default parameters, my entire rig (main unit, monitor, and peripherals) draws more than 500 watts *at peak*, and that's already a lot! I live in the south of France, where it gets quite hot in summer, and having a rig that dumps a large amount of heat into a room that is already hot, well... you get it! And I don't want to turn on the A/C just to remove the heat created by a PC; that would really be nonsense.
 
Grammar police here: "begs the question" does NOT mean "raises the question"; in fact, it means almost the opposite! So use "raises" if that's what you mean. If people keep misusing "begs", the phrase will lose its unique meaning, purely out of ignorance.

Also btw "incredible" not "incredulous". And "you can afford" not "money can afford you".

It's a good article that goes deeper into the topic than most. But it stops short of the costs of room cooling, suggesting only an open window(!).

Spending $1/day for a PC to heat your room can, depending on the climate, cost you well over that by the time you get that heat out of your house. This is never addressed, partly because the only available info on A/C efficiency is notoriously hard to apply. I'd like to see that done by someone who knows what they're talking about. That would be a first.
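
The A/C math is actually tractable if you're willing to assume a coefficient of performance (COP) for the air conditioner; the sketch below uses an assumed COP of 3, i.e. three units of heat moved per unit of electricity, which is in the ballpark for typical residential units:

```python
# Extra electricity needed to pump a PC's waste heat back out of the house with A/C.
# The COP and the PC's daily energy use are assumptions; adjust for your own setup.

pc_kwh_per_day = 2.0        # energy the PC dumps into the room as heat
cop = 3.0                   # assumed A/C coefficient of performance (heat out / electricity in)
rate_per_kwh = 0.30         # electricity price in $/kWh

ac_kwh = pc_kwh_per_day / cop
print(f"A/C overhead: {ac_kwh:.2f} kWh/day, about ${ac_kwh * rate_per_kwh:.2f}/day")
# With a COP of 3, every dollar of PC electricity costs roughly another $0.33 to
# remove as heat, during the part of the year when you're actively cooling.
```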
 
Personally, I chose to stick with 1440p. [...] my entire rig (main unit, monitor, and peripherals) draws more than 500 watts *at peak*, and that's already a lot! [...] having a rig that dumps a large amount of heat into a room that is already hot, well... you get it!
My rig draws 125 W GPU (power limit lowered from 175 W) + 65 W CPU + 60 W LCD + ~50 W for the rest of the system, totalling 250-300 W, and it still raises the room temperature in the summer.
 
What about efficiency of the power supplies?

A bronze-rated ATX power supply draws 45 watts at the wall to run my 35 W Intel CPU.
A pico power supply runs the same computer using 23 watts.

What happens during a disaster and power outage?

A single 12 V 18 Ah LiFePO4 battery will power my laptop for 30 minutes using a Baseus 65 W PD car charger.
A single 18 Ah LiFePO4 battery will power the same laptop for 7 hours using a pure sine inverter.
BUT two of those 18 Ah batteries in series will power that laptop for 28 hours using the same Baseus PD charger.

DC/DC USB-C PD chargers double the runtime of my laptop by simply eliminating the inverter

They do not work very well on 12 volt systems but work wonders on 24 volt systems

Efficiency must be maximized at every stage of the game, not just in the CPU and GPU.
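
To sanity-check numbers like these, here is a rough ideal-runtime sketch; the conversion efficiencies, usable capacity, and laptop draw are all assumptions, and the big real-world gaps (like the 30-minute result on 12 V) come from converters behaving badly outside their comfort zone rather than from this math:

```python
# Ideal laptop runtime from a 12 V 18 Ah LiFePO4 battery through two conversion paths.
# Every efficiency, capacity, and load figure below is an assumption, not a measurement.

def runtime_hours(battery_wh, usable_fraction, efficiency, load_watts):
    """Runtime = usable battery energy * conversion efficiency / average load."""
    return battery_wh * usable_fraction * efficiency / load_watts

battery_wh = 12 * 18          # 216 Wh nominal
usable = 0.9                  # assume ~90% usable depth of discharge
laptop_watts = 25             # assumed average laptop draw

# Path 1: battery -> pure sine inverter -> the laptop's own AC adapter
inverter_path = runtime_hours(battery_wh, usable, 0.85 * 0.88, laptop_watts)
# Path 2: battery -> DC/DC USB-C PD car charger -> laptop (one less conversion)
dcdc_path = runtime_hours(battery_wh, usable, 0.92, laptop_watts)

print(f"inverter + AC adapter: {inverter_path:.1f} h")
print(f"DC/DC USB-C PD:        {dcdc_path:.1f} h")
```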




 
Thank you for the article; I was waiting for (and wanting) something like this. Maybe the next step would be to define some (arbitrary) power categories for CPUs (65, 90, 125 W, unrestricted), GPUs (75, 150 W, you name it), and a whole-system power envelope, and run some limited tests of gaming, browsing, video playback, and encoding. Maybe that is the way to convince people, and maybe even the industry, to default to the optimum power/performance point. I do not like gaming near a hot PC in the summer and do not enjoy the fan noise in any weather. At some point the case was there to protect the inside of the PC from dust and offer a bit of noise insulation; at this point, PC cases have become scaffolds for fans, and I wonder why we even bother with a case instead of using open systems.
 
Chip makers aren't to blame. They need to offer better, faster products each year, but they don't have much room to go further with what they have right now.
They do what they can, which means adding more power and making more power-hungry chips.
I have liked all of my Samsung phones, but I was unpleasantly surprised by how quickly the S22 Ultra loses charge. It has a huge battery, and it barely lasts over one day for me.
 
This has been happening since they introduced fully programmable GPUs on both sides: power consumption of top-end cards doubled in just a generation.

Then, thanks to slower die shrinks, the architectures just started getting wider, which meant higher power consumption:

https://tpucdn.com/review/amd-r9-290x/images/power_average.gif


After that, aside from the halo cards (factory-overclocked models like the 3090, 4090, and 6900 XT), power consumption leveled off at around 250 W.
 
The bottom line: WE ARE REACHING THE LIMITS OF LITHOGRAPHY AND SILICON.
I was about to write the same thing here. They are already looking for a substitute for silicon, as it will no longer be suited to the extreme thinness of future lithography. (BTW, "silicon", you sure? ;-) )
 
The bottom line: WE ARE REACHING THE LIMITS OF LITHOGRAPHY AND SILICON.

And the consumerism driven by crap we don't need won't stop (4K+ resolutions, ray tracing and triple-digit fps, dozens of CPU cores and dozens of gigabytes of RAM on MOBILE DEVICES, cars that are becoming ugly, unmanageable tech gadgets on wheels). Otherwise this wouldn't be so much of an issue.
 
I like powerful PCs as much as anyone reading this, and I admit I am typing this on a power-hungry x86 gaming desktop, albeit a midrange one, but:
The hard truth is that the planet cannot sustain fossil-fuel power generation, power consumption, and heat dissipation at current rates, regardless of what is consuming the power and dissipating the heat. This is not "pearl clutching" but an inconvenient truth. It may have to come to regulators imposing further restrictions on the efficiency of electronics, just as they do on autos. If x86 cannot tame its power consumption it will have to give way to a different architecture, on both the client and the server. If the solution to PC power consumption is to be market-driven, leadership is more likely to come from the server side than the client side, because business is likely to pay more heed to total cost of ownership than individual consumers are. Gaming GPUs are a luxury item, regardless of cost: a day of reckoning is coming.
 
Great article. Say no to 8K; we don't need it for gaming or immersion. Sound, light, and story are far more important.

We just have to get smarter.
These eco features need to be in people's faces, i.e. the GPU should be able to tell you, at the click of a button, after a sample test.

The PS6 and the next Xbox will help a lot. These are everyday appliances, and they will have to stave off stupidity as developers target them.
Mobile gamers are also a boon.

Maybe more AI that is backwards compatible with all games, with profiles downloaded from servers.

4090s are just a minority; most of us don't want one. It's a waste of money and power. Change a few settings and a mid-range card will give you a silky, great-looking game. So what if it's 1440p and not 4K (add in an upscaler)? Half the cost, half the power, 98% identical gameplay.

I mean, really, aside from ray tracing (lighting) and draw distance, who would notice the difference in fast-action games? In an open-world RPG, 60 fps is enough.
 
Right out of tech college, I worked for a year in Houston, in 1980, at the TI plant that made DFS (digital field system) equipment for the oil/gold/gas industry. The reel-to-reel tape drive had a power supply that would regulate 12 volts off a truck battery, at 20 amps, to within 0.006 of a volt.
The thing was full of those old-school TO-3 transistors on the back, mounted on a HUGE aluminum heat sink. The first thing we had to do when testing them off the assembly line was to measure the resistance between battery ground and logic ground. If it was below 10 megohms, we had to disassemble the board, take it over to the bath, wash off any flux from the wave-solder bath, reassemble it, and start over.
It was a switching clock regulator power supply and was the size of a 17" laptop. Now you can buy a 1000 W power supply for your megacomputer that fits in the palm of your hand and only weighs 10 pounds. Amazing how far things have come in 40+ years.
 
"Depending on where you live in the world, and what rates you're paying for your electricity, the use of a PC like this might be anywhere from $70 to $280 (taxes and additional charges not included), each year."

I live in San Diego; it's closer to that $280 mark. We pay more than anywhere else in the country. Sadly it's a monopoly too, so we can't do much about it other than get solar... and we still get a bit screwed on that deal too.
 
Well, the effective limit is about 1,200 sustained watts; after that, people will find out their sketchy 120 V outlets will cause nothing but grief.
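
For context, here is a quick sketch of what a typical North American branch circuit can deliver continuously; the breaker size is an assumption, and the 80% figure is the usual continuous-load rule of thumb:

```python
# Rough ceiling for continuous load on a standard North American branch circuit.
# Breaker size is an assumption; the 80% derating is the common continuous-load rule.

volts = 120
breaker_amps = 15             # typical bedroom/office circuit; 20 A circuits also exist
continuous_factor = 0.8       # keep continuous loads under 80% of the breaker rating

max_continuous_watts = volts * breaker_amps * continuous_factor
print(f"{max_continuous_watts:.0f} W continuous on a {breaker_amps} A / {volts} V circuit")
# About 1440 W on a 15 A circuit, and that budget is shared with everything else on it.
```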
 
If x86 cannot tame its power consumption it will have to give way to a different architecture
What are you smoking?

It is easy for x86 to tame power consumption.

Even my desktop Windows XP machines top out at 23 watts maximum.

Just because monopolies would rather sell 1,200-watt systems doesn't mean we actually need them.

I'd be happy with a "modern" dual-core Sandy Bridge manufactured on a 3 nm node, running at 6 GHz (without turbo boost) and consuming 8-10 watts of power.

I don't need 16 cores at lower frequencies and higher wasted power consumption.
 