
from the cool-as-the-silent-shades-of-sleep dept.
The Weird World Of Liquid Cooling For Datacenters:
When it comes to high-performance desktop PCs, particularly in the world of gaming, water cooling is popular and effective. In datacenters, though, servers more often than not rely on traditional air cooling, in combination with huge AC systems that keep server rooms at the appropriate temperature. But datacenters can use water cooling, too! It just doesn't always look quite how you'd expect.
Cooling is of crucial importance to datacenters. Letting hardware get too hot increases failure rates and can even impact service availability. Cooling also uses a huge amount of energy, accounting for up to 40% of energy use in the average datacenter, and since energy doesn't come cheap, that feeds straight into running costs.
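To put rough numbers on that, here is a minimal back-of-the-envelope sketch in Python. Only the "up to 40%" share comes from the figures above; the facility size and electricity price are made-up assumptions.

```python
# Back-of-the-envelope cooling cost for a hypothetical datacenter.
# Only the "up to 40% of energy use" share comes from the text above;
# the 1 MW facility draw and the electricity price are assumptions.

facility_kw = 1_000          # assumed total facility draw, kW
cooling_share = 0.40         # cooling's share of total energy use
price_per_kwh = 0.12         # assumed electricity price, USD/kWh

annual_kwh = facility_kw * 24 * 365
cooling_kwh = annual_kwh * cooling_share
cooling_cost = cooling_kwh * price_per_kwh

print(f"Annual energy: {annual_kwh:,.0f} kWh")
print(f"Of which cooling: {cooling_kwh:,.0f} kWh, ~${cooling_cost:,.0f}/yr")
```

On those assumptions, cooling alone burns roughly 3.5 GWh and $420,000 a year, which is why even small efficiency gains matter.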
Thus, any efficiency gains in cooling a datacenter can have a multitude of benefits. Beyond improving reliability and cutting emissions through lower energy use, there are benefits to density, too: the more effective the cooling, the more servers and processing power can be packed into a given footprint without running into overheating issues.
Water and liquid cooling techniques can potentially offer a step change in performance relative to traditional air cooling, because air has a far lower heat capacity than water or other special liquid coolants; it's much easier to transfer a large quantity of heat into a liquid. In some jurisdictions, there is even talk of using the waste heat from datacenters to provide district heating, which is much easier with a source of hot liquid carrying the waste heat than with hot air.
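The gap is easy to quantify with standard textbook densities and specific heats; nothing in this sketch is specific to any particular datacenter.

```python
# Volumetric heat capacity: how much heat a cubic metre of each medium
# soaks up per kelvin of temperature rise (Q = rho * c_p * V * dT).
# Values are standard room-temperature figures.

rho_air, cp_air = 1.2, 1005        # kg/m^3 and J/(kg*K) for air
rho_water, cp_water = 1000, 4186   # kg/m^3 and J/(kg*K) for water

air_j_per_m3_k = rho_air * cp_air          # ~1.2 kJ/(m^3*K)
water_j_per_m3_k = rho_water * cp_water    # ~4.19 MJ/(m^3*K)

print(f"Air:   {air_j_per_m3_k/1e3:.1f} kJ/(m^3*K)")
print(f"Water: {water_j_per_m3_k/1e6:.2f} MJ/(m^3*K)")
print(f"Water carries ~{water_j_per_m3_k/air_j_per_m3_k:.0f}x more heat per unit volume")
```

Per unit volume, water carries on the order of 3,500 times more heat than air for the same temperature rise, which is the whole argument in one number.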
However, liquid cooling comes with drawbacks, too. Leaks can damage electronics if not properly managed, and such systems typically add complexity versus simple fans and air conditioning. Naturally, the improved cooling performance involves trade-offs, else it would be the norm already.
[...] More extreme methods exist, too. Microsoft made waves by running a fully-submerged datacenter off the coast of Scotland back in 2018. With a cluster of conventional servers installed in a watertight tube, heat was rejected to the surrounding waters, which kept temperatures very stable. The project ran for two years and found that the sealed atmosphere and low temperatures were likely responsible for an eight-fold increase in reliability. Project Natick, as it was known, also promised other benefits, such as reduced land costs from locating the hardware offshore.
Microsoft isn’t resting on its laurels, though, and has investigated even wilder concepts of late. The company has developed a two-phase immersion cooling tank for datacenter use. In this design, conventional servers are submerged in a proprietary liquid developed by 3M, which boils at a low temperature of just 50 °C (122 °F). As the server hardware heats up, so does the liquid, and boiling it absorbs a huge amount of energy: the latent heat of vaporization. The gaseous coolant then reaches the condenser on the tank lid, turning back to liquid and raining back down on the servers below.
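The energy balance is simple to sketch. Assuming the fluid is something like the Novec 649 guessed at in the comments below (3M quotes a latent heat of roughly 88 kJ/kg for it; the server power figure here is invented):

```python
# Two-phase immersion sketch: coolant boil-off rate for a given heat load.
# mass_rate = P / h_fg, where h_fg is the latent heat of vaporization.
# The ~88 kJ/kg figure is 3M's published value for Novec 649 -- an
# assumption here, since the tank's actual fluid isn't named in TFS.

heat_load_w = 2_000        # assumed heat output of one dense server, W
h_fg_j_per_kg = 88_000     # latent heat of vaporization, J/kg (Novec 649)

boiloff_kg_s = heat_load_w / h_fg_j_per_kg
print(f"Boil-off: {boiloff_kg_s*1000:.1f} g/s ({boiloff_kg_s*3600:.0f} kg/h)")
# ~23 g/s of vapor: all of it condenses on the lid and rains back down,
# so the tank loses no fluid in steady state.
```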
(Score: 2) by MostCynical on Wednesday June 15 2022, @08:48AM (7 children)
what is the conductivity of this liquid?
how much of the server is 'immersed'?
how easy is it to service?
so many questions...
"I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
(Score: 4, Informative) by RS3 on Wednesday June 15 2022, @09:01AM (3 children)
The Cray-2 [wikipedia.org] circuits were fully immersed in 3M Fluorinert [wikipedia.org] which is non-conductive. This newer stuff ("proprietary liquid developed by 3M") is possibly a refinement of Fluorinert.
(Score: 1, Informative) by Anonymous Coward on Thursday June 16 2022, @07:05AM (2 children)
3M has multiple brands of such chemicals. Rather than the Fluorinert family, it is probably one of the Novec brand of engineered fluids. A quick shout to the next desk over suggests it's Novec 649, which 3M has been really pushing lately and which has similar properties to the one in TFS.
(Score: 2) by RS3 on Thursday June 16 2022, @07:40AM (1 child)
Thanks AC. So I'm wondering how repairs work. I'd assume there's a storage tank waiting in case the system needs to be drained...
I was envisioning racks full of "blade" servers, each immersed in the 3M fluid. But that'd make repairs pretty difficult, so maybe each server has internal encasement around RAM and other circuits? Or maybe just liquid in heat sinks thermally attached to the big chips, RAM, CPU and GPU of course, like you can add to run-of-the-mill PCs.
Some years ago a friend gave me a Mac G5 aluminum box. It's one of the ones Apple made with liquid cooling. They're infamous for the system leaking and wreaking havoc on the MB ("logic board" in AppleSpeak) and power supply. Mine had just started leaking, so I was able to rescue it. I used it a bit- good fast machine for its day. I haven't powered it up in a few years; hoping the coolant didn't leak again...
(Score: 0) by Anonymous Coward on Saturday June 18 2022, @03:04AM
It depends on the type of system. For stand-alone single-phase radiator systems, you remove the rack just like any other. The piping may be rigid or not, but you usually don't work on those parts standing in the aisle. Immersion systems are similar to work on: you just pull the unit out of the bath and wait for the low-viscosity fluid to drain, or use proper PPE for the fluid. Blade systems and rack systems will have a quick-connect system that may or may not require draining if they are rigid and you need to move the piping (and may or may not require prefilling or ramp-up on install), but are otherwise dry. Rain systems require some sort of deactivation, like turning off a valve or diverter. There are other systems too, but that is more on the HVAC/refrigeration side than on the IT side. All systems will have a reservoir, leak detection, heat sinks, and standard heat flow, which means it isn't too different from air-cooled at some level.
(Score: 3, Interesting) by janrinok on Wednesday June 15 2022, @10:18AM (1 child)
Well, in this instance it is sea water.
The linked story admits that liquid coolants bring their own problems.
[nostyle RIP 06 May 2025]
(Score: 4, Funny) by driverless on Wednesday June 15 2022, @05:32PM
And they didn't really make waves either, at most a little movement over thermal gradients. Typical press-release hype.
(Score: 5, Informative) by RamiK on Wednesday June 15 2022, @01:14PM
The article is mostly about:
1. Conventional liquid cooling, where heat exchange is done through copper pipes and sealed heat sinks to a radiator, with passive or active (fan) cooling on that, as commonly done in gamer boxes (see the quick flow sketch below).
2. The Microsoft submerged data centers, where they sealed servers in airtight containers and dumped them in the ocean. There are variations where the containers act as liquid-cooling radiators, but also variations where it's just conventional air fans doing the heat exchange between the PC and the container.
What you're thinking about is mineral oil submersion and the like: https://www.youtube.com/watch?v=M80eUcUVrmw [youtube.com] https://www.youtube.com/watch?v=Ya-Xt3Y16Lw [youtube.com]
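For item 1, the sizing math is the standard pumped-loop balance Q = m_dot * c_p * dT. A quick sketch with made-up numbers (a 500 W box and a 10 K coolant rise; neither comes from the article):

```python
# Pumped-loop flow check: how much water flow does a 500 W machine need?
# Q = m_dot * c_p * dT  =>  m_dot = Q / (c_p * dT). Numbers are made up.

heat_w = 500        # assumed heat to remove, W
cp_water = 4186     # J/(kg*K) for water
delta_t_k = 10      # assumed coolant temperature rise across the loop, K

m_dot = heat_w / (cp_water * delta_t_k)   # kg/s
print(f"Required flow: {m_dot*1000:.1f} g/s (~{m_dot*60:.2f} L/min for water)")
```

That works out to about 0.7 L/min, which is why even a weak pump is enough for a gamer box.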
compiling...
(Score: 2) by TrentDavey on Wednesday June 15 2022, @05:25PM (1 child)
... seeing a television operating while immersed in some liquid at the Ontario Science Center in Toronto when I was a boy; I'm 62 now.
(Score: 0) by Anonymous Coward on Thursday June 16 2022, @08:48PM
Was it urine? Wouldn't surprise me, disgusting place.