From Ars Technica, word that Microsoft is deploying pods with servers underwater.
Microsoft CEO Satya Nadella says that underwater server farms are part of the company's plans for future data centers.
Microsoft has been experimenting with underwater servers for some time.
Project Natick[*] put a server pod underwater off the coast of California in 2016. Naturally enough, the pod uses water cooling, dumping waste heat into the ocean around it. It's designed as a sealed unit, deployed for five years before being brought back up to the surface and replaced. Since then, Microsoft has deployed a larger pod off the coast of Scotland.
[*] [Natick is the name of a town in eastern Massachusetts which also happens to have a US Army Research Facility located in it. --Ed.]
The pod people are no longer people! Flash in the pan idea, or could it have some traction?
(Score: 2) by rleigh on Saturday November 03 2018, @12:09PM (5 children)
A crack could be as slight as a microscopic flake in the paint: enough for barnacles, mussels, seaweed, or other life to cling to. They are experts at it. Making it hotter will only make the problem worse. You aren't going to heat anything to 90 °C in the ocean. You'll fry all the gear inside before you raise the outside temperature significantly--that's why it's placed in the ocean in the first place: for all practical purposes, it's a heat sink of infinite capacity.
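A rough back-of-envelope sketch of the "infinite heat sink" point (every figure here is an illustrative assumption, not a published Natick spec):

```python
# Back-of-envelope: how fast would a submerged pod warm the water around it?
# All figures below are illustrative assumptions, not Microsoft's numbers.

SECONDS_PER_DAY = 86_400

pod_power_w = 250_000      # assumed heat load: 250 kW
water_density = 1025.0     # kg/m^3, typical seawater
cp_seawater = 3990.0       # J/(kg*K), approx. specific heat of seawater

# Pretend the heat were somehow trapped in just a 100 m x 100 m x 100 m
# block of water around the pod (no currents, no convection at all).
volume_m3 = 100 ** 3
mass_kg = volume_m3 * water_density

# Temperature rise of that block after one full day at constant load:
energy_j = pod_power_w * SECONDS_PER_DAY
delta_t = energy_j / (mass_kg * cp_seawater)

print(f"Temperature rise per day: {delta_t:.4f} K")
```

Even with zero mixing, the rise is thousandths of a kelvin per day, and real currents disperse the heat far faster than that; frying the gear inside long before the ocean notices is exactly the failure mode.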
(Score: 2) by Bot on Saturday November 03 2018, @01:01PM (3 children)
khallow, as the usual apparently liberal guy who is in fact a closet commie mass murderer, has a point nonetheless. Instead of submerging the equipment (the worst possible solution, in pure Microsoft fashion), you let the heat exchanger get quite hot, submerge it for 30 seconds, bring it back up, and let it heat up again. And I don't see why growing mussels as a side business should be ruled out on principle. I've seen worse dotcom investments than that.
Account abandoned.
(Score: 2) by rleigh on Saturday November 03 2018, @02:31PM (1 child)
I'm not that convinced by the submerging, to be honest. Why not pump seawater up from a greater depth and run it through a heat exchanger? All the equipment can stay dry and serviceable, and out of direct contact with saltwater. You'd also get the benefit of a larger temperature gradient by using colder water, and you can run a sterile freshwater loop on the other side of the paraflow to minimise corrosion.
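For a sense of scale, here's a sketch of sizing that pumped loop (heat load, allowed temperature rise, and seawater properties are all assumed figures):

```python
# Sketch: seawater flow needed through a heat exchanger so the coolant
# temperature rise stays within a chosen limit. Figures are assumptions.

def required_flow_m3_per_h(heat_load_w, delta_t_k,
                           cp=3990.0, density=1025.0):
    """Volume flow (m^3/h) to absorb heat_load_w with a delta_t_k rise.

    From Q = m_dot * cp * delta_T  =>  m_dot = Q / (cp * delta_T)
    """
    m_dot = heat_load_w / (cp * delta_t_k)   # mass flow, kg/s
    return m_dot / density * 3600            # convert to m^3/h

# Assumed 250 kW load; colder intake water from depth lets you
# tolerate a bigger rise, which shrinks the required flow:
print(required_flow_m3_per_h(250_000, 10.0))  # allow a 10 K rise
print(required_flow_m3_per_h(250_000, 5.0))   # only a 5 K rise
```

The flow requirement scales inversely with the allowed rise, which is why pumping colder water from depth (a larger gradient to exploit) pays off in pump size and power.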
(Score: 2) by fyngyrz on Saturday November 03 2018, @09:49PM
Well, if you make heat in cold water, you don't need to pump; the warmed water rises and colder water replaces it. Natural convection does the coolant circulation for you, since warmer water is less dense and floats out of the colder mass around it. So less hardware, less energy spent. And less heat produced overall, simply because less energy is spent: the pump solution heats the water just as much, but also consumes power and generates its own waste heat.
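The buoyancy driving that convection comes from thermal expansion of the warmed water. A quick sketch of its magnitude (expansion coefficient and warming are assumed, ballpark values):

```python
# Sketch of the buoyancy behind natural convection: seawater warmed by
# the pod becomes slightly less dense and rises. Illustrative figures.

rho_0 = 1025.0    # kg/m^3, ambient seawater density (assumed)
beta = 2.0e-4     # 1/K, thermal expansion coeff. of seawater (approx.)
delta_t = 5.0     # K, assumed warming of water near the pod surface
g = 9.81          # m/s^2

# Linearized density deficit of the warmed water:
delta_rho = rho_0 * beta * delta_t       # kg/m^3 lighter than ambient
buoyancy_per_m3 = delta_rho * g          # net lift per m^3 of warm water

print(f"density deficit: {delta_rho:.2f} kg/m^3")
print(f"buoyant lift:    {buoyancy_per_m3:.1f} N per m^3")
```

Roughly 10 N of lift per cubic metre of warmed water: modest, but it's free, never draws power, and adds no waste heat of its own, which is the advantage over the pump.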
Ideally, the servers will get more and more efficient as the designs iterate, and less waste heat will be generated. Or we can put them out in space with huuuuuge radiators, or on the moon with heat sinks jammed into the lunar surface. Or something along those lines.
There would be a bit of a latency issue to deal with... :)
(Score: 0) by Anonymous Coward on Sunday November 04 2018, @11:48AM
Why not use oil?
https://www.geek.com/geek-cetera/cool-your-pc-by-submerging-it-in-oil-1495403/ [geek.com]
(Score: 1) by khallow on Thursday November 08 2018, @02:51PM
And there aren't many of those produced over five years.
Volcanoes and geothermal springs do that all the time. All it takes, in the case of the heat sink above, is enough heat combined with a temporary disruption of convection. And you don't have to heat the entire heat-sink surface at once to do it, so you can keep dumping heat into the ocean while it's going on.