The Rise of Power: Are CPUs and GPUs Becoming Too Energy Hungry?

Simply put, if you double performance and claim 50% better performance per watt, you'll still use 33% more power than last gen. That's the trouble: they push the silicon to the max each generation. It's like the old story that no matter how much your salary increases, you're never really better off financially, because you tend to spend the extra income on better stuff and end up just as broke as before.
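To spell out that arithmetic (my own round numbers, just illustrating the claim above):

$$\frac{P_{\text{new}}}{P_{\text{old}}} = \frac{\text{performance gain}}{\text{perf-per-watt gain}} = \frac{2.0}{1.5} \approx 1.33$$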

Now I'm going to power limit whatever new CPU I get, be it a 13700K or a 7900X3D. I will also download MSI Afterburner and power limit my 2080 Super to, say, 200W and see how it fares.
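For anyone who would rather script the cap than use Afterburner, here is a rough sketch (assuming an Nvidia card, the driver's nvidia-smi tool on the PATH, and an elevated prompt; the 200 W number is just my example above, not a recommendation):

```python
# Cap the GPU board power limit via nvidia-smi (run from an administrator prompt).
# Check the card's supported range first with: nvidia-smi -q -d POWER
import subprocess

TARGET_WATTS = 200  # example cap for a 2080 Super; the stock limit is around 250 W

subprocess.run(["nvidia-smi", "-pl", str(TARGET_WATTS)], check=True)
print(f"Requested power limit: {TARGET_WATTS} W")
```

Not every board allows lowering the limit that far, so it is worth checking what Afterburner or `nvidia-smi -q -d POWER` reports first.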

Luckily I mostly use my computer during the day and I have a huge solar system installed, but while fanboys and the media keep hammering companies for not having the very highest fps in a game, they will continue to set stupidly high power limits just for bragging rights. The 13900K is a joke in that regard; Intel was so desperate to claim the crown over AMD they just didn't give a toss about power, same with Nvidia and the 4090. Luckily smarter people know how to improve this for very little loss of performance.
 
This article illustrates why I no longer overclock my GPUs. Even when I did, I'd be lucky to see an extra 5-10 fps with a massive increase in the heat generated. I would assume that also means a massive increase in power draw. So it's just not worth it IMHO, especially with the adaptive nature of modern GPUs.

I do still overclock my CPUs, but I stay away from high core counts when selecting one for gaming. Beyond 6 cores, most games start to show more uplift from clock speed than core count. So again, it's not worth having a 12-core CPU unless you have other uses for your system. Most games, even CPU-heavy ones, are still bottlenecked more by the GPU than the CPU.

Would I reduce performance to reduce power consumption? No. I'd prefer using my part selection to help in that regard. Over-provisioned, top-tier Platinum or Gold PSUs to start. Water cooling the CPU, and even the GPU if I can, comes second. Then I actually turn the system off when I'm not using it. An extra 30-45 second boot doesn't hurt me any, since my wage isn't centered on my PC's performance.
 
What's the power draw on this system? Not my picture, so I can only speculate. A PowerColor Devil 13 Dual Core R9 290X, and the board appears to be an EVGA X299 Dark? Those are the power connectors Nvidia should have gone with on the 4090s.

 
Great article; I always learn something new from these in-depth takes.
I gotta admit, though, as a PC builder and user, the only time I think at all about how much energy is being used is when I am selecting a power supply, and the only time I consider the thermal aspect is when working out the cooling solution (which CPU cooler, and case fan count and placement). Once my system is built, I do not consider at all how much power is being used or the monetary or environmental cost.
 
I always use vsync when heavy gaming. To save stress on the components, heat, noise and energy bill. 😇
But my monitor is 4K 144Hz 🔥😂
 
In the old days you would overclock and run video cards in SLI and CrossFire. The top-end cards should be compared to that.

If you are concerned about power usage, you can always buy a product from a lower tier.
 
Because electricity is so expensive in the UK at the minute I am (sadly) running my Ryzen 9 5900X in eco mode and my RTX 2070 Super with a 53% power limit.

Thing is, I am not sure just how much energy that actually saves and if it is even worth the restrictions on the hardware. On a plus note, though, my already quiet PC is now just a faint whisper, and I don't notice it being slower if I am honest.
 
If you are concerned about power usage, you can always buy a product from a lower tier.
As the article states, the lower-end parts have also increased in power consumption because most of them are tuned for maximum performance. So for squeezing out an additional few percent, a CPU draws maybe 30% more. Sure, 30% more of 50 W is not the same absolute value as 30% of 200 W.
Maybe a switch in the driver/BIOS for choosing optimum performance versus optimum power consumption would help. The arbitrary limits in the BIOS, like 65 W (which in fact translates to about 90 W of consumption), are not the same thing.
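For reference, assuming AMD's usual socket power rule (my addition, not from the article), the nominal TDP maps to actual package power roughly as:

$$\text{PPT} \approx 1.35 \times \text{TDP}, \qquad 1.35 \times 65\ \text{W} \approx 88\ \text{W}$$

which is where the roughly 90 W figure comes from.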
 
Because electricity is so expensive in the UK at the minute I am (sadly) running my Ryzen 9 5900X in eco mode and my RTX 2070 Super with a 53% power limit.

Thing is, I am not sure just how much energy that actually saves and if it is even worth the restrictions on the hardware. On a plus note, though, my already quiet PC is now just a faint whisper, and I don't notice it being slower if I am honest.

You could afford to buy a 5900X and a 2070 Super, but you can't afford to run them? And you live in the UK like me, where the average salary is several thousand pounds a month?
What kind of hypocrisy is this?
Don't tell me you also bought a new BMW but aren't driving it because fuel prices went up.
 
You could afford to buy a 5900X and a 2070 Super, but you can't afford to run them? And you live in the UK like me, where the average salary is several thousand pounds a month?
What kind of hypocrisy is this?
Don't tell me you also bought a new BMW but aren't driving it because fuel prices went up?
BTW, how much per kWh are you paying now in the UK?
Just curious, because here in Eastern Europe it is 0.24 GBP + VAT per kWh.
 
BTW, how much per kWh are you paying now in the UK?
Just curious, because here in Eastern Europe it is 0.24 GBP + VAT per kWh.

Very much the same at the moment. They will rise more in 2023 though. But I guess this is the case in your country as well.
 
Great article! I'm glad someone finally wrote it! CPU and GPU power loads now are just ridiculous. As we advance, power draw should be going down as efficiency increases. Everywhere else in tech things improve, yet power consumption is becoming crazy. We should be laughing and making jokes: "Hey, remember when we needed a 1200W PSU to power stuff?" Intel, Nvidia and AMD need to be called out for this. I feel everyone is just ignoring it because "OMG MOR FPS!!" This is the one area that all the tech media and techtubers need to bring attention to. Power draw in the PC world is unacceptable. We need to get to a place where running the top-end Intel/AMD CPU and an RTX 4090 only needs a 500W PSU, and that would be the norm. Things need to improve and they need to be more efficient.
 
Great article! I'm glad someone finally wrote it! CPU and GPU power loads now are just ridiculous. As we advance, power draw should be going down as efficiency increases. Everywhere else in tech things improve, yet power consumption is becoming crazy. We should be laughing and making jokes: "Hey, remember when we needed a 1200W PSU to power stuff?" Intel, Nvidia and AMD need to be called out for this. I feel everyone is just ignoring it because "OMG MOR FPS!!" This is the one area that all the tech media and techtubers need to bring attention to. Power draw in the PC world is unacceptable. We need to get to a place where running the top-end Intel/AMD CPU and an RTX 4090 only needs a 500W PSU, and that would be the norm. Things need to improve and they need to be more efficient.
I totally agree. They have to stop this trend or we'll end up with exotic cooling devices, crazy power bills, melted adapters, etc. The problem is for hardcore users only anyway... for now.
 
Because electricity is so expensive in the UK at the minute I am (sadly) running my Ryzen 9 5900X in eco mode and my RTX 2070 Super with a 53% power limit.

Thing is, I am not sure just how much energy that actually saves and if it is even worth the restrictions on the hardware. On a plus note, though, my already quiet PC is now just a faint whisper, and I don't notice it being slower if I am honest.

For my PC, just setting Windows to the Power Saver power plan instead of High Performance or Ultimate Performance makes a significant difference to my power bill.

I now run my PC on Power Saver by default, and only use Ultimate Performance when I'm running very demanding games like CP2077 or doing heavy multitasking (for example, reencoding a video and playing games at the same time, or doing lab tests running multiple VMs at the same time), in which case the higher performance power plans do make a difference.

To make things more practical, I created two desktop and Start menu shortcuts that instantly switch to the Power Saver and Ultimate Performance power plans.
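In case it helps anyone, here is a minimal sketch of how such a switcher could be scripted (assuming Windows with both plans already created; the plan GUIDs are looked up at runtime with `powercfg /list` rather than hard-coded, and the script name is my own):

```python
# power_plan.py - switch the active Windows power plan from the command line.
# Usage: python power_plan.py "Power saver"   (or "Ultimate Performance")
import subprocess
import sys

def find_plan_guid(name: str) -> str:
    """Find a power plan GUID by (partial) name using `powercfg /list`."""
    output = subprocess.run(
        ["powercfg", "/list"], capture_output=True, text=True, check=True
    ).stdout
    for line in output.splitlines():
        # Lines look like: "Power Scheme GUID: <guid>  (Plan name)"
        if "GUID:" in line and name.lower() in line.lower():
            return line.split("GUID:")[1].split()[0]
    raise SystemExit(f"Power plan '{name}' not found")

def set_plan(name: str) -> None:
    """Activate the named plan via `powercfg /setactive`."""
    guid = find_plan_guid(name)
    subprocess.run(["powercfg", "/setactive", guid], check=True)
    print(f"Switched to: {name}")

if __name__ == "__main__":
    set_plan(sys.argv[1] if len(sys.argv) > 1 else "Power saver")
```

A plain shortcut pointing at `powercfg /setactive <GUID>` does the same job with no script at all.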
 
What are you smoking?

It is easy for x86 to tame power consumption

Even my desktop Windows XP machines top out at 23 Watts maximum

Just because monopolies would rather sell 1200-watt systems doesn't mean we actually need them

I'd be happy with a "modern" dual-core Sandy Bridge manufactured on a 3 nm node, running at 6 GHz (without turbo boost) and consuming 8-10 watts of power

I don't need 16 cores at lower frequencies and higher wasted power consumption

Uncorked, ARM cores use just as much power as x86.
x86 has lower power usage in some situations, depending on use case and compiler.
ARM has lower power usage in some situations, depending on use case and compiler.

Set them both to unlimited power and run ARM at the >>>>>>SAME FREQUENCY<<<<<<< and the draw is similar.
 
A naive article by someone who does not understand the application of science (engineering). Every expenditure of work (energy) in the universe requires a "loss" or inefficiency, most often in the form of heat - it's a law of the universe - the universe is expanding. CPUs/GPUs have reduced losses only because the switches (transistors) consuming the energy to switch (from 1 to 0) have got smaller and use less energy, and therefore have fewer losses. So if we had kept to the same number of transistors (as, say, the 8088 chip) then our energy consumption would have gone way down. But we didn't, because we could find a use for more and more transistors on a chip. The benefit of more transistors outweighed the increased overall energy consumption and loss. It's up to each and every human to decide if using energy to play games or make fake currency was or is worth it. Certainly for industrial and business computing it has been worth it - computers, with their relatively low energy consumption per unit of business production, have replaced humans, with their very high energy consumption per unit of business production - a net saving to the planet and the solar system.
 
Great article! I'm glad someone finally wrote it! CPU and GPU power loads now are just ridiculous. As we advance, power draw should be going down as efficiency increases. Everywhere else in tech things improve, yet power consumption is becoming crazy. We should be laughing and making jokes: "Hey, remember when we needed a 1200W PSU to power stuff?" Intel, Nvidia and AMD need to be called out for this. I feel everyone is just ignoring it because "OMG MOR FPS!!" This is the one area that all the tech media and techtubers need to bring attention to. Power draw in the PC world is unacceptable. We need to get to a place where running the top-end Intel/AMD CPU and an RTX 4090 only needs a 500W PSU, and that would be the norm. Things need to improve and they need to be more efficient.
So your solution (like mine) is to tell people to stop or reduce their game-playing time. Don't do the 2 hrs per day, 365 days a year. Get a job, raise a family, contribute to society instead of just sitting there playing games all the time. If you need a break, do some walking outside. Remember that the CPU/GPU energy is only a minor part of this waste of time - the human body's consumption of food/water and air to sit for this amount of time is the major part!
 
I know the articles have different authors but please connect the dots here lol https://www.techspot.com/news/96438...mption-restrictions-could-limit-sales-8k.html
I really don't care about high end hardware but the irony of enthusiasts asking for slower/worse parts is always funny. It must be fun always worrying about everything. Just turn down the power limit yourself if it really keeps you up at night.
I think that if the authors understood this, there would not have been an article at all, raising the question: if they were not writing this, what would they be doing instead to justify their consumption of energy in terms of the food and water that powers them?
 
Uncorked, ARM cores use just as much power as x86.
x86 has lower power usage in some situations, depending on use case and compiler.
ARM has lower power usage in some situations, depending on use case and compiler.

Set them both to unlimited power and run ARM at the >>>>>>SAME FREQUENCY<<<<<<< and the draw is similar.
It's about the number of transistors used at any point in time. Now, if programs were written in a lower-level language, say machine code or even Fortran/COBOL, fewer transistors would be used and less energy consumed/wasted. But sigh (just ask Steve Gibson, a huge proponent of tight, energy-efficient programming), Steve lost the war with Microsoft, and thoroughly energy-inefficient programming languages and programs are now our normal. Of course Bill was correct; the savings in human energy consumption (food and water) from using these simpler languages would, and does, more than offset the higher computing energy. So don't fuss about it.
 
In many ways all things are relative. In the old days of computing, power usage wasn't considered, and power supplies drew a lot of juice even when the device was powered off.

An example from the article: the Xbox Series X uses 153W when gaming compared to 62W on the Xbox One S. However, the Series S uses only 74W. The original PS3 used 206W gaming and a WHOPPING 171W just sitting there idle on the navigation screen (the Xbox One/S/X is 27/28/48W), while the PS5 is approximately 200W gaming and 50W idle on navigation. Are we going up, down, or in circles?

Like most things in this world, it's complicated, and based on the data I present and how I present it, I can support whatever conclusions I choose. For the length of this article, more nuance could have been presented to better explain the situation. Given that laptop sales greatly outnumber desktop sales AND WILL CONTINUE TO DO SO, the overall PC trend is toward using less power, not more. The 2 hours every day of full-power computing is unlikely for a gaming system, as most people who can afford such hardware have other passions besides just PC gaming.
 
Very much the same at the moment. They will rise more in 2023 though. But I guess this is the case in your country as well.
Energy should always be quoted in US$ to make it comparable, since the equipment and often the fuels are all priced in US$. The tax charged by, say, the UK reflects their opinion of the local labour (productivity) component that also makes up the cost. The UK is in a very bad space right now with respect to energy cost increases. The low-hanging fruit of wind has already been picked, and the next big option (nuclear) is very expensive. They no longer control their fossil fuel costs - world market cost only. Their technical labour productivity has dropped with the Brexit brain drain - not many lawyers or finance people can light a pilot light, never mind repair a light switch. Running a power grid: out of their universe altogether.
 