The Rise of Power: Are CPUs and GPUs Becoming Too Energy Hungry?

It "begs *FOR* the question"
No. To "beg the question" is an elegant three-word phrase that means to "assume or pretend that the question has already been raised and satisfactorily resolved (generally in the affirmative)".

If we keep misusing "begs the question" when we mean "raises the question", we will lose the real meaning and our language will be poorer.
 
We audiophiles say 'welcome' to the energy & heat dilemma.

My Krell mono-blocks (Class A) laugh at your 450 W GPUs. Each requires a dedicated 20 A, 230 V connection to the breaker board.

In fact, they're great to have on in winter and require the windows open in summer, and only now that energy prices have risen so much has it occurred to me to maybe use them less.

But I just can't...
 
Thanks to the article, I'm now limiting the power usage of my 3070 Ti to 70%, my 3600X now runs at a fixed 4 GHz, and I'll probably change my Windows power plan as well. Really nice article.
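For a rough sense of what a 70% cap works out to in watts, here's a minimal sketch. The 290 W figure is an assumption based on the RTX 3070 Ti's reference board power; a real card's actual limit should be read from its driver tooling, not assumed:

```python
# Illustrative only: 290 W is the RTX 3070 Ti's reference board power,
# used here as an assumed baseline. Real cards report their own limits.
reference_board_power_w = 290
limit_pct = 70

cap_w = reference_board_power_w * limit_pct / 100
savings_w = reference_board_power_w - cap_w

print(f"{limit_pct}% power limit = {cap_w:.0f} W")   # 203 W
print(f"Worst-case savings = {savings_w:.0f} W")     # 87 W
```

That ~87 W of worst-case savings adds up over long gaming sessions, often with a small performance cost.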
 
Oops, my bad: transitive verb, no need for "for".
 
Many homes in the US only have 15 amp circuits, especially in bedrooms and offices where PCs are commonly located. Assuming you're getting 110-120 volts, 15 amps gives you a theoretical max load on that circuit of 1650-1800 watts. With PCs today already pushing past 1000 watts, you could be getting close to maxing out a 15 amp circuit once you start adding monitors, peripherals, lights, etc. As the power draw of PC components continues to increase, we may eventually get to the point where many people won't be able to power a high-end PC and all the peripherals that go with it on their household electricity.
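The arithmetic above can be sketched out quickly; the peripheral wattages below are illustrative guesses, not measurements:

```python
# Hedged sketch of the 15 A circuit math. Load figures are
# illustrative placeholders, not measured values.
breaker_amps = 15
loads_w = {"PC": 1000, "monitor": 60, "speakers": 30, "lights": 40}

for volts in (110, 120):
    max_w = breaker_amps * volts      # theoretical circuit capacity
    total = sum(loads_w.values())     # everything plugged in
    print(f"{volts} V: {max_w} W capacity, "
          f"{total} W drawn, {max_w - total} W headroom")
```

At 110 V that leaves only a few hundred watts of headroom, and electrical codes generally require continuous loads to stay well below the breaker's rated capacity, so the practical margin is even thinner.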
 
FYI, capping the frame rate doesn't worsen your latency; if anything it keeps it consistently the same, with even less potential for throttling. If your GPU is capable of rendering a frame in 5 milliseconds, capping your frames doesn't change that. For example, in Vermintide 2 at 4K on a 120 Hz C10 OLED, with frames capped at 120 fps and vsync and VRR on, I get an average of 4.5 milliseconds. If I uncap the frame rate, the latency is the same 4.5 milliseconds, frames roughly double to around 240 fps, power goes from 250 watts to over 400 watts at full utilization, and frame-time variance is all over the place due to micro-throttling.
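The timing claim is easy to sanity-check: at a 120 fps cap, the per-frame budget is well above the ~4.5 ms render time quoted above, so the cap only adds idle time between frames rather than delaying a finished frame. A minimal sketch, taking the poster's render-time figure as a given:

```python
# If the GPU finishes a frame faster than the cap's frame interval,
# the cap adds idle time, not render latency.
render_ms = 4.5          # poster's reported average render time
cap_fps = 120

frame_interval_ms = 1000 / cap_fps
print(f"Frame budget at {cap_fps} fps: {frame_interval_ms:.2f} ms")
print(f"Idle time per frame: {frame_interval_ms - render_ms:.2f} ms")
```

That idle time is exactly where the power savings come from: the GPU sits at lower utilization instead of racing to render frames the display can't show.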
 
It just means that there will be more real separation between X/K SKUs and their non-X/K counterparts. Both AMD and Intel are continuing to make low-power versions; the high-power parts come out first, but low-power SKUs are inbound for both.
 
At this rate, in another generation or two, a 30 amp breaker (or at least a dedicated 15 amp breaker) will be required to run synthetic benchmarks on 'high-end' gaming PCs.
 
I think the cause of this problem is obvious: manufacturers feel that if they can't advertise their product as faster and more powerful than the competition's, people won't buy it, and so they have to advertise performance at a high power setting. I can remember when microprocessors didn't even need fans - and, of course, the processors in smartphones still don't.
So it's definitely possible to build a computer capable of doing a lot of things that would use so little power that it doesn't even need a fan. But people would say that computers were that powerful years ago, so they wouldn't have a reason to run out and buy a new computer. And the most popular computer games try to provide a level of graphics realism not seen until now, which helps to sell new computers.
So it's not going to be easy to change something that the industry is apparently dependent upon for its cashflow!
 
Great article.

These high powered computers are basically used as entertainment.

In the days before computers, for fun we would work on our cars, cruise to the mall, or drive a couple of hours to the beach. On Saturday night we would cruise Main Street in our gas hogs.

So what's more energy efficient? Driving a fast car or a fast computer?
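As a back-of-the-envelope comparison, every number here is an illustrative assumption, not a measurement:

```python
# Toy comparison: an evening of high-end gaming vs. an evening cruise.
# All inputs are rough illustrative assumptions.
pc_watts = 500            # assumed whole-system draw under load
hours = 4
pc_kwh = pc_watts * hours / 1000

fuel_l_per_100km = 8      # assumed; a classic "gas hog" would be higher
distance_km = 100
gasoline_kwh_per_l = 9.5  # approximate energy content of gasoline
car_kwh = fuel_l_per_100km * (distance_km / 100) * gasoline_kwh_per_l

print(f"PC evening: {pc_kwh:.1f} kWh, evening drive: {car_kwh:.1f} kWh")
```

Under these assumptions the drive burns dozens of times more energy than the gaming session, so the fast computer wins on efficiency, even at today's power draws.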
 
We invest in technology to solve problems.

One technology is better than another of the same kind if it solves more problems than the other, or if it introduces fewer problems of its own.

Some problems are given more priority than others.

Two similar technologies to compare for example:

i9 13900K and R9 7950X

Number of problems solved by each tech.
Number of problems introduced by each tech.
High-priority problems tackled, by percentage - such as power consumption and/or cost of the product.

This here is a real-world benchmark.
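The comparison proposed above could be sketched as a simple weighted score; every count and weight below is a hypothetical placeholder, not real data for either chip:

```python
# Hypothetical scoring sketch for the proposed "benchmark".
# Counts and weights are invented placeholders, not measurements.
def tech_score(solved, introduced, priority_hit_pct, priority_weight=2.0):
    """Higher is better: reward problems solved and high-priority
    coverage, penalize problems introduced."""
    return solved - introduced + priority_weight * priority_hit_pct / 100

# Placeholder inputs for illustration only:
score_a = tech_score(solved=10, introduced=3, priority_hit_pct=60)
score_b = tech_score(solved=9, introduced=2, priority_hit_pct=80)
print(score_a, score_b)
```

The hard part, of course, is agreeing on what counts as a "problem" and how to weight the priorities; the arithmetic itself is trivial.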



 