
posted by janrinok on Thursday December 15, @10:13AM

Putting spinning rust in liquid might seem risky, but sealed helium drives were used:

Immersion cooling specialist Iceotope has published a study sharing its findings in the wake of a series of tests completed at one of Meta's (Facebook) data centers. The study looked carefully at the pros and cons of precision single-phase immersion cooling in businesses that use high-density data storage servers. Iceotope asserts that its results were "conclusive" in demonstrating this cooling methodology is a superior solution when compared to air cooling, as well as other forms of liquid cooling such as cold plates, tank immersion, or two-phase immersion.

[...] In the tests, a standard air-cooled commercial storage system with 72 HDDs and supporting components was re-engineered to work with Iceotope's precision single-phase immersion cooling. Specifically, the modified system used a dedicated dielectric loop connected to a liquid-to-liquid heat exchanger and pump. Single-phase cooling is much simpler than dual-phase, where the coolant boils from liquid to gas, travels into a condenser, and then flows back into the system (hence dual-phase). With single-phase cooling, the coolant simply circulates between the hotter and cooler areas of the loop, doing its job without any phase change.

The Iceotope testing team made four main observations. Firstly, the 72 HDDs showed very little variance in temperature (just 3°C) wherever they were located in the server array. It is important to highlight that the storage array used hermetically sealed helium-filled HDDs. Secondly, the liquid could climb to an easily manageable 40°C with no impact on reliability. Thirdly, the power consumption of the cooling system was less than 5% of the system total. Lastly, the single-phase precision cooling was virtually silent and vibration-free.

According to Seagate, 90% of cloud storage still uses mechanical magnetic storage technology. Note that Iceotope sells this cooling technology, so take the results with a grain of salt.
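For a rough sense of the heat balance in a single-phase loop like the one described above, here is a minimal sketch in Python. The per-drive wattage, overhead load, coolant properties, and inlet temperature are all assumptions for illustration; only the 72-drive count and the ~40°C ceiling come from the article:

```python
# Single-phase heat balance: Q = m_dot * c_p * dT (no phase change).
# Only NUM_DRIVES and MAX_C come from the article; everything else is assumed.

NUM_DRIVES = 72          # from the article
WATTS_PER_DRIVE = 8.0    # assumed: typical helium HDD under load
OVERHEAD_WATTS = 150.0   # assumed: controllers, backplane, etc.

CP_COOLANT = 2100.0      # J/(kg*K), assumed for a hydrocarbon dielectric
INLET_C = 35.0           # assumed loop inlet temperature
MAX_C = 40.0             # the "easily manageable" 40 degC ceiling from the article

heat_watts = NUM_DRIVES * WATTS_PER_DRIVE + OVERHEAD_WATTS
delta_t = MAX_C - INLET_C

# Mass flow needed to carry the heat within that temperature rise:
m_dot = heat_watts / (CP_COOLANT * delta_t)

print(f"Total heat load: {heat_watts:.0f} W")
print(f"Required coolant flow: {m_dot:.3f} kg/s ({m_dot * 60:.1f} kg/min)")
```

With these assumed numbers the loop only needs a few kilograms of coolant per minute, which is consistent with the article's claim that the pump draws a small fraction of system power.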


Original Submission

Related Stories

Scientists Scrambling to Prevent Global Data Storage Crisis 27 comments

Servers around the world could soon face a massive data storage crunch, thanks to the "mind-blowing amount" of information people store digitally every day:

Researchers from Aston University say the global datasphere — the total amount of data worldwide — will increase by 300 percent within the next three years. Currently, all of this data sits in banks of servers stored in huge warehouses (data centers).

Unfortunately, the answer to creating more space in "the cloud" is not just to build more server warehouses. The Aston team says data centers already use up 1.5 percent of the world's electricity every year. That makes endlessly building new facilities just for massive servers an unsustainable practice.

With that in mind, scientists are now working on creating new data storage surfaces which are just five nanometers in width. That's about 10,000 times smaller than the width of a human hair! At the same time, they'll be able to increase data storage capacity on digital devices — since there will likely be no stopping the amount of information people store digitally every second of every day.

[...] "Increasing the efficiency of existing technologies will significantly reduce the need for costly, environmentally damaging construction of new 'mega data centers.' The next three years will be crucial. The global datasphere is predicted to increase to 175 zettabytes, with one zettabyte being approximately equal to one billion terabytes," [researcher in materials chemistry Dr. Amit Kumar] Sarkar, concludes.

Related: Liquid Cooled HDD Study Touts Greater Reliability, Lower TCO


Original Submission

This discussion was created by janrinok (52) for logged-in users only, but now has been archived. No new comments can be posted.
  • (Score: 2) by VLM on Thursday December 15, @02:48PM (2 children)

    by VLM (445) on Thursday December 15, @02:48PM (#1282532)

I worked at a place that had legacy water-cooled mainframes in the 90s, or at least still had all the infra for water cooling under the raised floor.

    1) Anything that can leak, will eventually leak.

2) There are expensive centralized humidity-monitoring sensors meant to detect those leaks; they don't work, and are great at producing false positives AND false negatives.

    3) The TCO of liquid cooling is so high that even mainframe shops were like "F that".

4) The fire dept goes nuts over raised floors. It's a pretty serious fire danger. No problem, just replace all your shitty cheap PVC cabling with plenum-rated cable that costs twice as much. I have not researched this product, perhaps they do cooling without absolutely requiring a raised floor. But... now you've got power, ethernet, AND cooling running in three overhead trays, or even worse, mixed together? Also the idea of running a liquid system "over" the equipment seems riskier than running "under".

    • (Score: 2) by richtopia on Thursday December 15, @03:55PM (1 child)

      by richtopia (3160) Subscriber Badge on Thursday December 15, @03:55PM (#1282547) Homepage Journal

Your reasons are all valid, and my understanding is that those reasons are why almost all mainframes are air cooled.

Regarding hard drives, I suspect the liquid cooling could be more reliable. With a compute chip, you need to apply a cold plate directly to the chip and have tubing running to it. With an HDD, the drive is already rectangular and aluminum-framed, so you can effectively cool almost any edge of the drive. If I were building a rack to cool hard drives, I would make the rails with the mounting screws out of aluminum and actively cool those. However, the article lacks clarity on the implementation; it appears they have some sort of immersion strategy, which sounds overly complex to me.

The last aspect that wasn't mentioned in the article is power costs. Depending on how they cool the fluid, they may be able to cut down on air conditioning costs. Especially with data storage, minimizing operating cost is the name of the game.
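The rail-cooling idea above can be sanity-checked with a one-dimensional conduction estimate. Every dimension and temperature below is an assumed, illustrative value, not anything from the article:

```python
# One-dimensional conduction along an aluminum mounting rail:
# Q = k * A * dT / L. All dimensions and temperatures are assumed.

K_ALUMINUM = 205.0       # W/(m*K), thermal conductivity of aluminum

AREA_M2 = 0.005 * 0.025  # assumed 5 mm x 25 mm rail cross-section
LENGTH_M = 0.05          # assumed 5 cm path to the actively cooled end
DELTA_T = 15.0           # assumed drive frame 15 K above the coolant

q_watts = K_ALUMINUM * AREA_M2 * DELTA_T / LENGTH_M
print(f"Heat carried per rail: ~{q_watts:.1f} W")

# An active HDD dissipates very roughly 5-10 W (assumed), so one cooled rail
# per drive is already in the right ballpark for this conduction path.
```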

      • (Score: 2) by mhajicek on Thursday December 15, @07:05PM

        by mhajicek (51) Subscriber Badge on Thursday December 15, @07:05PM (#1282570)

        Unless you're sinking heat into a large body of water (lake or ocean), liquid cooling is just remote air cooling anyway. I don't have experience with server racks, but in an overclocked CADCAM workstation you can get the same results with a large air cooler as with liquid, for less money and with greater reliability.

        --
        The spacelike surfaces of time foliations can have a cusp at the surface of discontinuity. - P. Hajicek
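Putting illustrative numbers on the "remote air cooling" point: a liquid loop relocates the liquid-to-air step rather than eliminating it, so the series thermal resistances can end up comparable. All resistance values here are assumptions, not measurements:

```python
# Series thermal-resistance model: T_component = T_ambient + Q * sum(R_i).
# Unless heat is dumped into a lake or ocean, a liquid loop still ends at a
# liquid-to-air exchanger; it just moves that step away from the component.
# All resistance values (K/W) are illustrative assumptions.

Q_WATTS = 150.0   # assumed heat output of an overclocked workstation CPU
T_AMBIENT = 25.0  # assumed room air, degC

air_cooler = {
    "die-to-heatsink": 0.10,
    "heatsink-to-room-air": 0.25,
}

liquid_loop = {
    "die-to-coldplate": 0.08,
    "coldplate-to-coolant": 0.05,
    "coolant-to-radiator": 0.04,
    "radiator-to-room-air": 0.20,  # the air step is still here, just remote
}

for name, chain in (("large air cooler", air_cooler), ("liquid loop", liquid_loop)):
    t_comp = T_AMBIENT + Q_WATTS * sum(chain.values())
    print(f"{name}: component at ~{t_comp:.1f} degC")
```

With these made-up but plausible resistances, both chains land the component within a few degrees of each other, which is the commenter's point.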
  • (Score: 2) by Sjolfr on Thursday December 15, @07:14PM (1 child)

    by Sjolfr (17977) on Thursday December 15, @07:14PM (#1282572)

    How many data centers capture the heat to redistribute? I don't think that I've ever heard of one that does. Makes me wonder if it's just not cost effective or what? Water heating, building heat in the winter, etc.
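For scale on the heat-capture question: essentially all of a data center's electrical draw leaves as heat, but at the ~40°C loop temperature from the article it is low-grade. A rough sketch, assuming a 1 MW facility and a notional value for displaced heating:

```python
# Nearly all data-center power leaves as heat; the question is whether
# ~40 degC heat (the loop temperature from the article) is worth recovering.
# The facility size and heat value are assumptions for illustration.

IT_LOAD_MW = 1.0        # assumed facility IT load
HOURS_PER_YEAR = 8760

heat_mwh = IT_LOAD_MW * HOURS_PER_YEAR
print(f"Heat rejected per year: ~{heat_mwh:,.0f} MWh")

# Notional value if it displaced heating at an assumed $0.04 per thermal kWh:
value_usd = heat_mwh * 1000 * 0.04
print(f"Notional heating value: ~${value_usd:,.0f}/yr")

# 40 degC is marginal for district heating or hot water without heat pumps,
# which is one plausible reason capture is still uncommon.
```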
