The Science of Keeping It Cool

An excellent article which started me thinking after reading these two quotes:

"We can't make CPUs and GPUs much bigger because there is no good way to cool something that powerful. " ""There's no way to actually get rid of heat, so all we can do is move it somewhere that it won't be an issue."

It seems to me that much of the problem the world is facing today comes from that attitude. Many processes create heat which is then 'vented' to the atmosphere, raising the ambient temperature of the world itself and contributing to the various problems that this causes.

If we changed our basic mode of thinking, rather than just rolling down a window and throwing stuff out, perhaps heat problems could be solved by a 'direct it at the base of the flames' solution.

In the case of CPU coolers for computers, if the heat from the CPU were captured by a cooler and converted directly into electricity, it could then be routed back into the mainboard's circuitry.
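Worth pausing on how much energy that could actually recover, though. Even a perfect converter is capped by the Carnot limit, and a CPU runs only modestly hotter than the room around it. A minimal back-of-the-envelope sketch, with both temperatures assumed rather than measured:

```python
# Upper bound on electricity recoverable from CPU waste heat, assuming
# an ideal (Carnot) heat engine between the CPU lid and the room air.
T_hot = 70 + 273.15   # assumed CPU heat-spreader temperature, kelvin
T_cold = 25 + 273.15  # assumed room temperature, kelvin

eta = 1 - T_cold / T_hot  # Carnot efficiency: the hard physics ceiling
print(f"Best case: {eta:.1%} of the waste heat back as electricity")  # ~13%
```

Real thermoelectric modules only achieve a fraction of that ceiling at such a small temperature difference, so most of the heat would still need conventional cooling.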

That is essentially the concept used in modern home heating furnaces. Very little heat escapes from the exhaust of such a furnace.
 
Both my Core i9Ex and my 2080 Ti have closed-loop AIO liquid coolers.

Even after 2 hours of Call of Duty, I never see temperatures rise past 52 degrees in any part of my computer.

If you buy any GPU, you should get one with a closed-loop AIO - and since you can easily buy an AIO for your CPU, I totally suggest getting one.

Every computer I build includes a closed-loop AIO, and I've never gotten a complaint.
 

Great article. An interesting follow-up would be the evolution of consumer PC HSFs: from fin arrays to game changers like the Zalman flower cooler, the Cooler Master Hyper 212, and the Noctua D14!
 
Good article~

Both my Core i9Ex and my 2080 Ti have closed-loop AIO liquid coolers.
Even after 2 hours of Call of Duty, I never see temperatures rise past 52 degrees in any part of my computer.

If you buy any GPU, you should get one with a closed-loop AIO - and since you can easily buy an AIO for your CPU, I totally suggest getting one.

Every computer I build includes a closed-loop AIO, and I've never gotten a complaint.

I agree; I haven't used air cooling on a CPU in over five years. AIOs are better than insurance. The individual components for 'making' a water cooler are affordable and easy to work with; almost anyone can configure a DIY solution to be more reliable and effective than a box of fans.
 
Excellent, informative article! It would have been awesome if William had reflected on some of the challenges of cooling a quantum computer!
 
Good article~

I agree; I haven't used air cooling on a CPU in over five years. AIOs are better than insurance. The individual components for 'making' a water cooler are affordable and easy to work with; almost anyone can configure a DIY solution to be more reliable and effective than a box of fans.

Thanks but no thanks. I'll stick with my Noctua D15 and have a silent PC.
 
Both my Core i9Ex and my 2080 Ti have closed-loop AIO liquid coolers.

Even after 2 hours of Call of Duty, I never see temperatures rise past 52 degrees in any part of my computer.

If you buy any GPU, you should get one with a closed-loop AIO - and since you can easily buy an AIO for your CPU, I totally suggest getting one.

Every computer I build includes a closed-loop AIO, and I've never gotten a complaint.

Guess what, some people are building on a budget, and AIOs are pretty terrible when it comes to price-to-perf ratio. I'd gladly take a $20 cooler and 70C over a $120 AIO and 50C. Modern CPUs are engineered to operate at 100C, so I highly doubt that extra 20C is going to result in any meaningful difference in lifespan.
 
Guess what, some people are building on a budget, and AIOs are pretty terrible when it comes to price-to-perf ratio. I'd gladly take a $20 cooler and 70C over a $120 AIO and 50C. Modern CPUs are engineered to operate at 100C, so I highly doubt that extra 20C is going to result in any meaningful difference in lifespan.


I disagree with that entirely. A single-fan AIO is around $89 or less, and a dual-fan is around $120 or less.

If you're talking about someone who is "building on a budget" yet thinks they should have a top-end CPU and a mid-level GPU and somehow see the same performance as people not building on a budget, then someone's fooling themselves.
 

I agree; I haven't used air cooling on a CPU in over five years. AIOs are better than insurance. The individual components for 'making' a water cooler are affordable and easy to work with; almost anyone can configure a DIY solution to be more reliable and effective than a box of fans.

You should really read more reviews of air cooling vs. AIOs. It's pretty clear air coolers actually outperform AIOs in most cases (the Noctua units in particular). I'm on the fence right now about what to get to cool my i5 4670K @ 4.4GHz. I don't really need to, as the temps are good enough with the stock cooler, but I'd like to try and push it a little further just for kicks. From my research, pushing the clock further only minimally increases performance. I've researched the hell out of everything, and some of the cheaper AIOs from Deepcool and Cooler Master are pretty tempting, mainly for aesthetic reasons.

The biggest difference to me, and what's holding me back, is the possibility of a leak and/or pump failure. I don't care so much if I fry the CPU, but I'd like to use the cooler on my next build too. Air coolers can last forever so long as the fan doesn't quit.
 
I'm still stuck on "these gates are microscopic, there are billions of them in modern chips and they are switching billions of times a second". How is that even possible, and how do you control it to make it usable?
 
I'm still stuck on "these gates are microscopic, there are billions of them in modern chips and they are switching billions of times a second". How is that even possible, and how do you control it to make it usable?

Your PC today probably runs at around 2-3 GHz (mine is 2.9); that's two or three billion clock cycles per second. By making the transistors smaller, you can make them faster and use less power. But eventually you 'use up your get-out-of-jail-free cards', as the article explains, and you can't go any faster or make them any smaller. It's why we have 8-core processors now - we can't make one that goes 8 GHz; the silicon would melt. To get more performance we have added cores, so we have 4 cores going at 3 GHz instead of one core trying to go at 12 GHz.
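To put rough numbers on that trade-off: dynamic CMOS power scales roughly as C·V²·f, and chasing higher clocks also forces higher voltage. A toy comparison, with every value assumed purely for illustration:

```python
# Why four 3 GHz cores beat one 12 GHz core on power: P ~= C * V^2 * f
# for switching logic, and higher frequency also forces higher voltage.
def dynamic_power(c_farads, volts, freq_hz):
    """Classic switching-power approximation for CMOS logic."""
    return c_farads * volts**2 * freq_hz

C = 1e-9  # assumed effective switched capacitance per core, farads
quad = 4 * dynamic_power(C, 1.0, 3e9)  # four cores at 3 GHz, 1.0 V
solo = dynamic_power(C, 1.5, 12e9)     # one 12 GHz core, assumed to need 1.5 V
print(f"4 x 3 GHz: {quad:.0f} W vs 1 x 12 GHz: {solo:.0f} W")  # 12 W vs 27 W
```

Same total clock throughput, but the single fast core burns more than twice the power, all of it concentrated in one hotspot.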

But can you make them smaller? Not really - I think they're already putting the transistors something like 8 or 12 nanometers apart. That's REALLY close together, and it's really hard to do that and have the chip work correctly afterward.

Now to make CPUs faster, we add cores, make things more efficient, add cache, etc. Someday maybe we'll have synthetic diamond CPUs, but I hear more about quantum computing than diamond CPUs.
 
If we changed our basic mode of thinking, rather than just rolling down a window and throwing stuff out, perhaps heat problems could be solved by a 'direct it at the base of the flames' solution.

That's exactly what I'm doing. It's wintertime now, so whenever I get cold, I start playing demanding games, which makes the air warmer. The next step is adding a tube that would capture all the hot air from the GPU and CPU and send it towards my feet. A totally ecological solution: the more I play, the warmer my feet.
 
As a retired power supply design engineer I think I coined the term "heat spreader" around 1979. I was designing encapsulated (epoxy potted) power modules and started using the term to describe my heat sink solution. The heat spreaders were used in production.

Later, I invented a thermal interface that was many times better than thermal compound.
 
An excellent article which started me thinking after reading these two quotes:

"We can't make CPUs and GPUs much bigger because there is no good way to cool something that powerful." "There's no way to actually get rid of heat, so all we can do is move it somewhere that it won't be an issue."

It seems to me that much of the problem the world is facing today comes from that attitude. Many processes create heat which is then 'vented' to the atmosphere, raising the ambient temperature of the world itself and contributing to the various problems that this causes.

If we changed our basic mode of thinking, rather than just rolling down a window and throwing stuff out, perhaps heat problems could be solved by a 'direct it at the base of the flames' solution.

In the case of CPU coolers for computers, if the heat from the CPU were captured by a cooler and converted directly into electricity, it could then be routed back into the mainboard's circuitry.

That is essentially the concept used in modern home heating furnaces. Very little heat escapes from the exhaust of such a furnace.
But then we'll be forced to keep producing heat, because we need that heat for heating our houses. So it might not be the right way to go for the future.
 
Something struck me about the comment in the article about diamond being the holy grail, alongside the statement about not being able to make CPUs bigger because of heat transfer. There is commercially available diamond deposition equipment that, properly configured, deposits 60 microns per hour.

I'm wondering if the limitation is not heat transfer itself but the cost-benefit of better heat transfer while circuitry kept shrinking. There was no real need to move to more expensive methods while size and power draw were still improving.

A second thought would be converting the CPU cover to a 1-millimeter-thick diamond layer (about 16.7 hours of deposition at that rate) and making the cover 4 times the size of the CPU, thus quadrupling the heat-transfer cross-section with what would be near-'perfect transfer'. If this is a full cross-section engineering issue and not single-point averaging, a diamond heat-transfer cover should move the top range up a bit. If the actual problem is X-ray etching still leaving uneven circuitry, and thus uneven heat-generation points that can't be individually compensated by a single large 'perfect transfer' shield, then no, diamond wouldn't be better.
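For a sense of scale, plain one-dimensional conduction (Q = kAΔT/t) already shows the bulk-material advantage; the die size and temperature drop below are assumed, and in reality lids are limited more by spreading and interface resistance than by the bulk material:

```python
# 1-D Fourier conduction through a lid: Q = k * A * dT / t.
# Sketch of the proposed 1 mm diamond lid at 4x the die area.
k_diamond = 2000.0               # W/(m*K), ballpark for CVD diamond
k_copper  = 400.0                # W/(m*K), ballpark for copper
area      = 4 * (15e-3 * 15e-3)  # assumed 15 mm x 15 mm die, lid 4x larger
thickness = 1e-3                 # 1 mm lid
dT        = 10.0                 # assumed 10 K drop across the lid

for name, k in (("copper", k_copper), ("diamond", k_diamond)):
    print(f"{name}: {k * area * dT / thickness:.0f} W")  # diamond moves ~5x the heat
```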
 
Put the PC in a refrigerator or cooler; some are as cheap as $200. Did anyone try this?
It's been done with liquid nitrogen. The cost exceeds anything but Guinness Book of Records goals. If you check your 200-buck refrigerator, you'll find it costs so many dollars per year ON AVERAGE to run. That figure assumes the door stays closed and it is only keeping room-temperature or cooler objects cold. If you put a space heater in the refrigerator or freezer compartment and then let it run, the cost per year gets excessive very fast. A PC is a lovely space heater, especially if you're a gamer.
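A quick sanity check with ballpark (assumed) wattages shows why:

```python
# A household fridge is sized to keep an insulated box cold, not to
# reject a gaming PC's continuous output. All figures are assumptions.
pc_heat_w       = 400.0  # CPU + GPU heat under gaming load
fridge_remove_w = 150.0  # roughly what a small fridge can pump out continuously
print(f"Produced: {pc_heat_w:.0f} W, removable: {fridge_remove_w:.0f} W")
# The interior keeps warming until the PC throttles, while the
# compressor runs flat out and the power bill climbs.
```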
 
But then we'll be forced to keep producing heat, because we need that heat for heating our houses. So it might not be the right way to go for the future.
Yes, we need heat INSIDE our houses, but we don't need to pump hot air out of them to remain cool. The point is the need to rethink all unnecessary creation and release of heat to the atmosphere. It isn't just warmer days in summer. Many biological processes have been altered already (mold and fungi proliferation), and more are threatening our health and well-being.

As for human bodies producing heat, that's true. Maybe the planet has too many inhabitants to support. That may be the message Gaia is sending us.
 
Yes, we need heat INSIDE our houses, but we don't need to pump hot air out of them to remain cool. The point is the need to rethink all unnecessary creation and release of heat to the atmosphere. It isn't just warmer days in summer. Many biological processes have been altered already (mold and fungi proliferation), and more are threatening our health and well-being.

As for human bodies producing heat, that's true. Maybe the planet has too many inhabitants to support. That may be the message Gaia is sending us.

You mean like this >

https://www.nature.com/articles/d41586-019-03911-8

https://www.sciencealert.com/scient...an-also-beam-heat-into-the-cold-void-of-space

https://www.asme.org/topics-resources/content/new-solar-energy-device-beams-heat-into-space

It's actually irrelevant.

The position of our Solar System in relation to the Galactic Core causes cycles of heating and cooling over many thousands of years, creating global warming, followed by ice ages, followed by global warming, followed by ice ages... etc., etc.

Man-made global warming will simply shift the natural cycle to the point where massive global death comes earlier than it normally would, causing overheating and super storms and enough water vapor to trigger a premature supercooling cycle, causing another massive round of death and extinctions.

ENJOY!
 
I'm still stuck on "these gates are microscopic, there are billions of them in modern chips and they are switching billions of times a second". How is that even possible, and how do you control it to make it usable?

In simplest terms: very specialized tools.

Scientists *think* they can do 5nm, although quantum tunneling starts to become a major problem; chip design is literally starting to run up against the laws of physics. This is a large part of why Intel's 10nm node has run into so many problems: unexpected issues (a high defect rate) have prevented them from continuing to simply clock up their CPUs. That's a big factor in why AMD is doing so well right now, since Intel doesn't have a functioning 10nm node. [Meanwhile, TSMC is at 7nm and starting at-risk 5nm production.]
 
This kind of article makes me wonder if I shouldn't pay for a subscription to the website. Such great reading, even for someone whose first language is Spanish. Many thanks!
 