PC Cooling Science Project


vassil3427

Ok, this year I have to do a science project for school. I chose to do "The effects of heat on the processing power of a computer", or something like that; I'll think of a better name later. I plan on running benchmarks on my PC under normal conditions with air cooling, then running my system with some sort of much better cooling and re-benchmarking (a rough sketch of the fixed workload I have in mind is at the end of this post).

Here are my options:
Dry ice cooling (cheap, but dangerous)
Water cooling (wouldn't think it would make a drastic enough temperature change for this application)
Phase change?? (Not sure about that, have never looked into it...)
Vapor cooling system (seems too expensive, unless you can build a homemade kit?)

Opinions are appreciated...:D
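A minimal sketch of the kind of fixed, repeatable CPU-bound workload such a comparison needs; the hash loop below is purely illustrative (any benchmark that does identical work on every run would serve), written in Python for readability:

import time
import hashlib

def cpu_benchmark(iterations=2_000_000):
    """Run a fixed amount of CPU-bound work and return the elapsed seconds."""
    data = b"pc-cooling-science-project"
    start = time.perf_counter()
    for _ in range(iterations):
        data = hashlib.md5(data).digest()  # identical work on every run
    return time.perf_counter() - start

if __name__ == "__main__":
    # Time the same workload under air cooling, then again under the
    # improved cooling, and compare the elapsed times.
    for run in range(3):
        print(f"run {run + 1}: {cpu_benchmark():.3f} s")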
 
Well, in devices that have no moving parts, such as the CPU and RAM, provided they are running within spec, there should be no difference in performance whatsoever at any temperature. I don't really understand the point of your experiment. An Athlon at 2000MHz that operates at 65°C should perform just as well as an Athlon at 2000MHz that operates at 35°C.
 
Originally posted by vassil3427
Actually the colder an electronic component is, the faster it operates. That is a proven fact...


It is a bit different with devices that are driven by a clock generator. As long as the devices are able to operate normally at speed X, they will operate at exactly speed X no matter what. It is only when you reach extreme temperatures or unsupported timings that you would see performance deficits.
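To put rough numbers on that (the cycle count below is made up for illustration): the clock generator fixes how many cycles happen each second, so a fixed job takes the same time at either temperature as long as the chip stays within spec.

# Illustrative arithmetic: cycles per second is set by the clock generator,
# not by temperature, so a fixed job takes the same wall time either way.
clock_hz = 1_530_000_000         # Athlon XP 1800+ runs at roughly 1.53 GHz
cycles_for_job = 3_060_000_000   # hypothetical fixed workload

print(cycles_for_job / clock_hz)  # 2.0 seconds at 35 C
print(cycles_for_job / clock_hz)  # 2.0 seconds at 65 C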
 
I guess I'm curious myself now. I think I will try this at home. I can get one of my T-breds to as low as 29°C and as high as 79°C, but that's about as far as I can go with what I have. I will definitely try, though.

I'm curious as to your results as well.
 
Actually, this is kind of the point of my experiment: to see if extra-cold temperatures will make a computer's components function faster... hehe
 
That would basically mean that a system running with water cooling would be more efficient in its work than, say, the same system with simple air cooling?

Please do post some facts/experiences. Wouldn't mind seeing what the outcome is.:)
 
Sorry, I'm with SH on this one. For systems that are clock driven, temperature has no effect unless there is a change in clock speed (e.g. due to thermal throttling). Obviously too high a temp will kill your equipment, but low temps are fine. Are you sure you're not talking about the effect of temperature on the overclocking potential of your PC? We all know that works.
 
Does anyone have any hard evidence to prove this theory, Nic? After my experiment is over we will... (That is the point of the experiment: to see if colder temperatures increase CPU/system performance.)
 
Originally posted by vassil3427
Well we'll just see what we see after my experiment is over with Mr. Harvester
You might want to listen to Soul, he has a good point.
If you keep pushing a device faster and faster, it will eventually become a short. Also, the colder you get a device, the less resistance it has (which is why it operates faster at colder temps).
Just ask technicians who work on aircraft in extreme climates if you don't believe us.
 
Ok guys,
I'm not overclocking anything; I am merely seeing if a decrease in temperature will increase the processing power of a CPU. Like you said, the colder it gets, the less resistance, so perhaps better performance... And as I said, we shall see... this is something I shall prove or disprove...

Also, I believe that at lower temperatures there is less thermal "noise", allowing the electrons to move faster...
 
Well, I'm still sticking with no difference whatsoever. It doesn't matter how cold you make the processor; it will still be operating at clock speed X, and will be producing results at clock speed X. It can't go faster or slower.
 
Well, that's just it; that's what this experiment is all about: to see if it makes a difference... Here's my research paper...

Does Temperature affect the processing power of a computer’s CPU (Central Processing Unit)?

Thesis: In electronic components, the colder they become, the less resistance there is to the electrons. However, a computer's CPU is clock driven, unlike the circuitry in a flashlight. Every modern PC has multiple system clocks. Each of these vibrates at a specific frequency, normally measured in MHz (megahertz, or millions of cycles per second). A clock tick is the smallest unit of time in which processing happens, and is sometimes called a cycle; certain types of work can be done in one cycle while others require many. The ticking of these clocks is what drives the various circuits in the PC, and the faster they tick, the more performance you get from your machine. And so a decrease in temperature may not affect the CPU performance of a computer the way it would a less advanced electronic circuit.

Hypothesis: I propose that a CPU will perform more efficiently and at a faster rate when cooled considerably below normal operating temperatures.

Materials: Project will require:
Computer System: (Already own)
Athlon XP 1800+ (1.53GHz)
384MB DDR266
100GB Western Digital hard drive
GeForce4 Ti 4600 128MB
SoundBlaster Audigy
Fresh Installation of Windows XP Home Edition
Mini-Fridge ($50.00)
Copper Tubing ($15.00)
Water Pump ($20.00)
Copper CPU Block ($5.00)
Plastic Hose ($5.00)
Radiator (Already own)
Fan (Already own)
Anti-Freeze (Already own)

Methods: My project will require benchmarking the computer system under normal air-cooled conditions, and then re-benchmarking it under liquid-cooled conditions. (Benchmarking is a series of tests evaluating processing ability.) My findings will be recorded, and then I will run the tests a second time to verify that the data collected is accurate.
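A small sketch of how the recorded runs could be summarised and compared afterwards; the timings shown are hypothetical placeholders, not measured results:

import statistics

def summarize(label, times):
    """Print the mean and spread of repeated benchmark timings for one setup."""
    mean = statistics.mean(times)
    spread = statistics.stdev(times) if len(times) > 1 else 0.0
    print(f"{label}: mean {mean:.3f} s, stdev {spread:.3f} s")

# Hypothetical placeholder timings in seconds; real measurements go here.
air_cooled = [41.8, 41.9, 41.7]
liquid_cooled = [41.8, 41.8, 41.9]

summarize("air-cooled", air_cooled)
summarize("liquid-cooled", liquid_cooled)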
 
OK, if you aren't going to do anything but change the temp, you will not see any increase in performance; maybe an increase in stability, because it will not get as hot. However, once you get to a certain temp (well below zero) you'll start to see the decrease in resistance, and you might then see a bit of an increase in performance. If so, it will be short lived: as the temp continues to drop, the device will begin to exhibit properties of superconductivity (meaning what was once a semiconductor is now becoming more of a conductor) and will start to short. If you can successfully create the conditions for this, it should prove a very interesting experiment for you.
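For the resistance side of the argument, a rough sketch using the standard linear approximation for a metallic conductor (copper's temperature coefficient is roughly 0.0039 per °C); semiconductor junctions behave differently, which is part of why extreme cold causes trouble:

def resistance_at(temp_c, r_ref_ohm=1.0, t_ref_c=20.0, alpha=0.0039):
    """Linear approximation R(T) = R_ref * (1 + alpha * (T - T_ref)) for a metal such as copper."""
    return r_ref_ohm * (1 + alpha * (temp_c - t_ref_c))

for t in (65, 35, 0, -40):
    print(f"{t:>4} C: {resistance_at(t):.3f} ohm")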
 
My two-penneth worth on this one.

It is true that current flows more easily at lower temperatures, so a "normal" electronic circuit will operate faster the lower the temp drops.

However, as some have said above, a CPU is "clocked", so it's NOT a normal circuit, it is already artificially regulated.

What will happen with your experiment is unclear, though. In theory, as you load the processor it will get warmer, which will slow it down. Except the "throttling" of it is probably already holding its potential down to a speed below what it will encounter as it warms up. There is a critical temperature where you would see it slow down, but your system is (hopefully) monitored to ensure this (very high) temperature is never reached - i.e. your PC will turn itself off as a safety feature before your CPU fries its own brain.

So what everyone says is true - running a like-for-like system at two different temperatures, you won't see anything (unless you run so hot that one will fail). However, as you cool the processor you gain the ability to speed it up (overclock) WITHOUT reaching the critical temperature that causes it to fail. (Interestingly, the same amount of energy is generated; you just get rid of it more efficiently.)

On the flip side, I don't think this breaks your school project - you can still do it and prove that this is the case. And if you quote anything from the above, I won't even charge a royalty. ;)
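One way to test the throttling question directly is to log the clock speed next to the CPU temperature while the benchmark runs; a sketch assuming the third-party psutil package (temperature sensors are only exposed on some platforms, so that column may come back empty):

import time
import psutil  # third-party; install with pip

def log_sample():
    """Print a timestamped reading of the current clock speed and CPU temperature."""
    freq = psutil.cpu_freq()  # may be None on systems without frequency info
    mhz = freq.current if freq else float("nan")
    # sensors_temperatures() only exists on Linux/FreeBSD builds of psutil
    temps = getattr(psutil, "sensors_temperatures", lambda: {})()
    readings = [t.current for entries in temps.values() for t in entries]
    temp_c = readings[0] if readings else float("nan")
    print(f"{time.strftime('%H:%M:%S')}  {mhz:7.0f} MHz  {temp_c:5.1f} C")

if __name__ == "__main__":
    for _ in range(10):   # sample for about 50 seconds while a benchmark runs
        log_sample()
        time.sleep(5)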
 
Yes, kinda. Within the next month I'll be getting everything set up and completing the experiment. I have to spend about $200 on all the stuff I'll need; that is why it hasn't been done yet. I will post results as soon as I finish...
 
Things to remember:

The theoretical superconductor works at 0 K, or around -273°C.

Your computer processor is not a superconductor, though; you simply set it to a clock speed and it tries to compute at that speed. If it fails to do that it will abort (crash).

As the CPU gets hotter it will have more problems computing what it is told to compute, due to increased noise at higher temperatures. And at a critical point it will crash...

Now if you freeze this chip down to 0 K, it will still operate exactly as fast as if its temperature had been its normal working temp.

If, however, you also add in overclocking, there will be a tremendous difference, because the CPU will output more heat and fail sooner if it is hotter... i.e. you will gain more overclocking headroom the cooler your processor is...

This is all in line with what I explained above. What this makes us conclude is simply that any processor of a given speed can be overclocked, but by simply decreasing its working temp we will not change anything...

Things start to change when we start to talk about technology that doesn't have clocks, like some of our upcoming computer technology, but that is for a different lesson ;-)
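Put as arithmetic (the 1.70 GHz figure is just an example overclock, not a measured limit): only a clock change moves throughput, while cooling on its own buys headroom rather than speed.

stock_hz = 1_530_000_000        # Athlon XP 1800+ at its stock clock
overclocked_hz = 1_700_000_000  # hypothetical overclock made possible by better cooling

print(f"speedup from the overclock: {overclocked_hz / stock_hz - 1:.1%}")
print("speedup from cooling alone at a fixed clock: 0.0%")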
 
Decreasing temperature decreases resistance, and decreasing it too much can cause things like semiconductors not to operate like they should.

Superconductivity can be achieved at temperatures well above 0 K - I think the record is somewhere around 140 K now.

I've read that some liquid nitrogen cooled computers have had problems because of aforementioned semiconductors, because superconductors are, well, super, perfect conductors, not semi.
 