Intel is designing new hardware with immersion cooling in mind

mongeese

Posts: 611   +122
Staff member
Forward-looking: Intel has announced a partnership with Green Revolution Cooling (GRC) to develop sustainable immersion cooling for data centers. The first fruit of the partnership is a set of findings on the usefulness of immersion cooling, described in a newly published white paper.

According to two estimates from 2020, data centers consume anywhere from 1.5% to 2% of the world's energy and could be consuming as much as 13% within ten years. Around half of that energy is used by the computers themselves and 25% to 40% is used by air conditioning, says the US Department of Energy.

Some data centers have recently made strides to improve their cooling efficiency, but those gains have been negated by the rising power consumption of new hardware. According to Statista, the average power usage effectiveness (PUE), i.e. efficiency, of all large data centers has held at about 1.6 for roughly a decade.
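For reference, PUE is simply the total energy a facility draws divided by the energy consumed by the IT equipment itself, so a PUE of 1.6 means 0.6 units of overhead (mostly cooling) for every unit of actual compute. A minimal sketch, using hypothetical numbers rather than figures from the white paper:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy / IT energy.

    1.0 is the theoretical ideal (zero cooling/power-delivery overhead).
    """
    return total_facility_kwh / it_equipment_kwh

# Hypothetical data center: 1,600 MWh total draw, 1,000 MWh of it for IT gear
print(pue(1_600_000, 1_000_000))  # → 1.6, the decade-long industry average
```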

In their white paper, Intel and GRC say that immersion cooling eliminates the need for server fans, which account for 10-15% of a server's power consumption. Immersion cooling can also carry heat away faster than air cooling, yielding further efficiency gains, though the paper doesn't put a number on them.
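As a back-of-the-envelope illustration of what that 10-15% fan overhead means per machine (the 500 W server draw here is an assumption for illustration, not a figure from the paper):

```python
def fan_savings_kwh(server_draw_w: float, fan_fraction: float,
                    hours_per_year: int = 8760) -> float:
    """Annual energy saved per server if immersion cooling removes its fans.

    server_draw_w: total server power draw in watts (assumed constant)
    fan_fraction:  share of that draw consumed by fans (paper: 0.10-0.15)
    Returns kWh saved per year.
    """
    return server_draw_w * fan_fraction * hours_per_year / 1000

# Hypothetical 500 W server, fans at the paper's 10-15% range
low = fan_savings_kwh(500, 0.10)
high = fan_savings_kwh(500, 0.15)
print(f"{low:.0f}-{high:.0f} kWh saved per server per year")  # 438-657 kWh
```

Multiplied across thousands of servers in a facility, the fan savings alone are substantial even before counting the air-conditioning load they no longer generate.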

Intel and GRC express the most interest in single-phase immersion cooling, as opposed to two-phase cooling. The former uses a pump to circulate a non-conductive liquid around a tank containing multiple servers and relies on a heat exchanger to cool the liquid. It's simpler than two-phase cooling, in which the liquid boils into a gas before being condensed back into a liquid.

"Intel is designing silicon with immersion cooling in mind, rethinking elements like the heat sink."

Immersion cooling also has other benefits over air cooling, according to the white paper. Data centers collectively use billions of gallons of water each year for their cooling and power generation, which immersion cooling would significantly reduce. Immersion-cooled centers can also be built smaller than air-cooled centers, reducing land waste and building costs.

It does have its flaws, though. Having all your systems submerged would be a maintenance nightmare and would make failures more severe. However, Intel seems quite willing to bet on it.

In May, the company announced that it was building a $700 million research lab in Oregon with a focus on sustainability initiatives, including immersion cooling, heat recapture, and water usage effectiveness. It's being joined by other companies, including Microsoft, in experimenting with immersion cooling and other strange approaches to cooling as data centers become larger and the need for sustainable solutions grows more urgent.


 

psycros

Posts: 4,333   +6,322
It's honestly hard to believe that data centers use that much power. If it's true, then the US is using much less power for everything else than I thought.
 

p51d007

Posts: 3,295   +2,886
H2O isn't conductive; it's all the minerals and whatnot in it that make it conductive.
I remember a commercial back in the '80s where they immersed a working television in some liquid and it kept working. I forget what the commercial was for, but it is possible as long as the liquid is not conductive.
 

seeprime

Posts: 681   +893
H2O isn't conductive; it's all the minerals and whatnot in it that make it conductive.
I remember a commercial back in the '80s where they immersed a working television in some liquid and it kept working. I forget what the commercial was for, but it is possible as long as the liquid is not conductive.
The cost of absolutely pure, neutral-pH, non-conductive water is prohibitive. It's required for some military testing of metals coated with solid-film lubricant. Open the small container for the first time and you get one test run. That's it. As soon as pure water comes into contact with air it dissolves CO2 and becomes slightly acidic, conductive, and useless. Freon and other non-polar, non-flammable solvents at the operating temperature are a better choice.
 

Vulcanproject

Posts: 1,553   +2,842
Put it in the building and use the heat exchanger to heat water, the building, the district. Whatever. That's the kind of efficiency the planet needs going forward. It has been done on a small scale before, but if these companies are keen on moving data centres into the city for latency reasons then it could be usable on a much larger scale.
 

TheRealSCDC

Posts: 306   +422
Put it in the building and use the heat exchanger to heat water, the building, the district. Whatever. That's the kind of efficiency the planet needs going forward. It has been done on a small scale before, but if these companies are keen on moving data centres into the city for latency reasons then it could be usable on a much larger scale.

Well, this is a rather good idea. "Tankless water heater" :)
 

PEnnn

Posts: 858   +1,029
This could be very handy for consumers too....considering how toasty Intel CPUs are getting every few months!!
 

takaozo

Posts: 216   +314
The only issue I see here is with connectors, besides general maintenance, where you have to let everything dry out first.
I don't know if CPU sockets, RAM sockets, or other high-density pin connectors would like the immersion in the long run.
Also, the density shown in the pictures is high, so the heat exchange on the cooler side must be at least equal, if not higher.
Also the pumps.
You kind of move the AC issues to pumps and heat exchangers, with leaks and all that comes with them.
 

netman

Posts: 818   +368
Also, blame this on global warming during summer months...!

I would still build a single-story, waterproof, sealed building (an 8-ft-deep large swimming pool with a roof), put the server racks inside, and recirculate and cool the liquid by pumping it to outdoor cooling towers. Maintenance could be done by scuba divers, replacing faulty server units without draining the entire building... Problem solved...!
 

Endymio

Posts: 1,804   +1,862
Intel and GRC say that immersion cooling cuts out the need for server fans, which make up 10-15% of a server's power consumption.
Where did this absurd figure come from, that the fans in a server expend 15% of its total power?
 

Geralt

Posts: 1,152   +1,799
H2O isn't conductive; it's all the minerals and whatnot in it that make it conductive.
I remember a commercial back in the '80s where they immersed a working television in some liquid and it kept working. I forget what the commercial was for, but it is possible as long as the liquid is not conductive.
You're right, water is not conductive. I am a chemical engineer. Yes, minerals in water make it apparently conductive.
 

wiyosaya

Posts: 7,955   +6,998
Well, this is a rather good idea. "Tankless water heater" :)
Thinking along the same lines, buildings where this is installed could use that cooling water (or whatever fluid they use) in a loop and heat the building during the cold months.

Otherwise, this "solution" seems like it goes way overboard, but I expect nothing less coming from Intel.
 

Uncle Al

Posts: 9,072   +8,104
Rather surprised they claim this as new tech. Dielectric fluids have been around for decades. It only makes sense to go this route, although it will be more expensive than a simple cooling fan ....
 

Endymio

Posts: 1,804   +1,862
Server fans use a ton of power. At full load, a single server's fans can use 30 to 75 W.
I admit it's not my field, but I find it difficult to believe a 425 watt server with 75 watts of fans is an ideal configuration. Is it?
 

Puiu

Posts: 5,738   +4,676
TechSpot Elite
I admit it's not my field, but I find it difficult to believe a 425 watt server with 75 watts of fans is an ideal configuration. Is it?
It depends on the configuration. Servers are generally custom-made according to the client's specifications.

Have you ever been in a server room? Once the fans start spinning up under high load, you can barely hear yourself speak. The reason is simple: the fans aren't attached to the CPUs. The CPUs have a massive heatsink on them, and the entire server gets massive airflow through it (cooling everything). The fans need to push air through the entire long server blade.

 

wiyosaya

Posts: 7,955   +6,998
Rather surprised they claim this as new tech. Dielectric fluids have been around for decades. It only makes sense to go this route, although it will be more expensive than a simple cooling fan ....
Don't forget. This is Intel we are talking about.