Arthur T Knackerbracket has processed the following story:
The explosive growth of datacenters that followed ChatGPT's debut in 2022 has shone a spotlight on the environmental impact of these power-hungry facilities.
But it's not just power we have to worry about. These facilities are capable of sucking down prodigious quantities of water.
In the US, datacenters can consume anywhere between 300,000 and four million gallons of water a day to keep the compute housed within them cool, Austin Shelnutt of Texas-based Strategic Thermal Labs explained in a presentation at SC24 in Atlanta this fall.
We'll get to why some datacenters use more water than others in a bit, but in some regions rates of consumption are as high as 25 percent of the municipality's water supply.
This level of water consumption, understandably, has led to concerns over water scarcity and desertification, problems that climate change had already made acute and that the proliferation of generative AI has only exacerbated. Today, the AI datacenters built to train these models often require tens of thousands of GPUs, each drawing up to 1,200 watts and dissipating virtually all of it as heat.
However, over the next few years, hyperscalers, cloud providers, and model builders plan to deploy millions of GPUs and other AI accelerators drawing gigawatts of power, and that means even higher rates of water consumption.
[...] One of the reasons that datacenter operators have gravitated toward evaporative coolers is because they're so cheap to operate compared to alternative technologies.
[...] This makes an evaporatively cooled datacenter far more energy efficient than one that doesn't consume water, which translates to a lower operating cost.
[...] "You have to understand water is a scarce resource. Everybody has to start at that base point," he explained. "You have to be good stewards of that resource just to ensure that you're utilizing it effectively."
[...] While dry coolers and chillers may not consume water onsite, they aren't without compromise. These technologies consume substantially more power from the local grid and potentially result in higher indirect water consumption.
According to the US Energy Information Administration, the US sources roughly 89 percent of its power from natural gas, nuclear, and coal plants. Many of these plants generate power with steam turbines, a process that consumes a lot of water.
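To put rough numbers on that trade-off, here is a minimal back-of-envelope sketch; the water-use effectiveness, PUE, and grid water-intensity values in it are illustrative assumptions, not figures from the article or from Shelnutt's talk.

```python
# Back-of-envelope: direct vs. indirect water use for 1 MW of IT load over a year.
# Every constant below is an assumed, illustrative figure.

IT_LOAD_KW = 1_000
HOURS_PER_YEAR = 24 * 365

EVAP_WUE_L_PER_KWH = 1.8    # assumed litres evaporated onsite per kWh of IT energy
PUE_EVAP = 1.2              # assumed overhead of an evaporatively cooled facility
PUE_DRY = 1.4               # assumed overhead with dry coolers or chillers instead
GRID_WATER_L_PER_KWH = 1.6  # assumed litres consumed by thermoelectric generation per kWh delivered

it_kwh = IT_LOAD_KW * HOURS_PER_YEAR

evap_onsite = it_kwh * EVAP_WUE_L_PER_KWH                 # water evaporated at the datacenter
evap_upstream = it_kwh * PUE_EVAP * GRID_WATER_L_PER_KWH  # water consumed at the power plant
dry_upstream = it_kwh * PUE_DRY * GRID_WATER_L_PER_KWH    # no onsite use, but more grid power drawn

print(f"evaporative: {evap_onsite/1e6:.1f} ML onsite + {evap_upstream/1e6:.1f} ML upstream")
print(f"dry cooled : 0.0 ML onsite + {dry_upstream/1e6:.1f} ML upstream")
```

With these assumptions the dry-cooled site uses no water on the premises, yet its extra grid draw means more water is consumed at the power plant, which is exactly the indirect consumption the article points to.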
[...] Understanding that datacenters are, with few exceptions, always going to use some amount of water, there are still plenty of ways operators are looking to reduce direct and indirect consumption.
[...] In locations where free cooling and heat reuse aren't practical, shifting AI clusters to direct-to-chip or immersion liquid cooling (DLC), which runs as a closed loop and consumes essentially no water, can facilitate the use of dry coolers. While dry coolers are still more energy-intensive than evaporative coolers, the substantially lower, and therefore better, power usage effectiveness (PUE) of liquid cooling could make up the difference.
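That PUE argument can be made concrete with another hedged sketch; the PUE values below are assumptions chosen for illustration, not measured figures from any facility.

```python
# Hypothetical 10 MW AI cluster: total facility draw under two cooling setups.
IT_LOAD_KW = 10_000

PUE_AIR_EVAP = 1.40  # assumed: air-cooled IT with evaporative (water-consuming) heat rejection
PUE_DLC_DRY = 1.15   # assumed: direct liquid cooling with water-free dry coolers

print(f"air-cooled + evaporative: {IT_LOAD_KW * PUE_AIR_EVAP:,.0f} kW from the grid")
print(f"DLC + dry coolers       : {IT_LOAD_KW * PUE_DLC_DRY:,.0f} kW from the grid")
# If the liquid-cooled facility's total draw stays at or below the evaporative baseline,
# the move to dry coolers eliminates onsite evaporation without an energy penalty.
```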
[...] While datacenter water consumption remains a topic of concern, particularly in drought-prone areas, Shelnutt argues the bigger issue is where the water used by these facilities is coming from.
"Planet Earth has no shortage of water. What planet Earth has a shortage of, in some cases, is regional drinkable water, and there is a water distribution scarcity issue in certain parts of the world," he said.
To address these concerns, Shelnutt suggests datacenter operators should be investing in desalination plants, water distribution networks, on-premises wastewater treatment facilities, and non-potable storage to support broader adoption of evaporative coolers.
While the idea of first desalinating and then shipping water by pipeline or train might sound cost-prohibitive, many hyperscalers have already committed hundreds of millions of dollars to securing onsite nuclear power over the next few years. As such, investing in water desalination and transportation may not be so far-fetched.
More importantly, Shelnutt claims that desalinating and shipping water from the coasts is still more efficient than using dry coolers or refrigerant-based cooling tech.
(Score: 4, Touché) by Gaaark on Sunday January 12, @03:30PM
We ARE the most intelligent species in the world... right?
Right?
I think i need to become a Mennonite or something...
--- Please remind me if I haven't been civil to you: I'm channeling MDC. I have always been here. ---Gaaark 2.0 --
(Score: 4, Insightful) by VLM on Sunday January 12, @04:49PM (5 children)
Do they actually do that or is this just "green" reporting about an anecdote?
I worked at an IBM mainframe dino pen a LONG time ago and even then, in an area that gets 3 to 4 feet of rainwater annually, evap just didn't work financially.
I'm surprised fresh water is so much cheaper decades later that they can just boil it away. It just seems a strange financial strategy, or perhaps a not entirely honest semi-propaganda piece.
(Score: 3, Interesting) by khallow on Sunday January 12, @06:42PM (4 children)
It works better with low humidity. Great cooling for a desert, should you have the water.
I think it's a case of the Bike Shed effect. Datacenters and what they do is relatively complicated. Talking about how much water they use is not.
(Score: 2) by VLM on Sunday January 12, @08:39PM (3 children)
LOL I see. The four feet of rain we get annually comes with a rather high dew point.
Unfortunate that the places with the most water to use for cooling have the highest dew point and the places with the lowest dew points have the least water. Just the inherent way of that technology.
(Score: 4, Insightful) by anubi on Sunday January 12, @10:28PM (2 children)
Maybe put the data centers in Canada or Greenland, and actually use the "waste heat"?
"Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
(Score: 3, Interesting) by jasassin on Monday January 13, @06:15AM (1 child)
A friend of a friend has been running a bitcoin mining operation since around when bitcoin was created. He uses the heat from his servers to heat his house in winter (he just leaves the thermostat off). I found that quite interesting and, for some reason, slightly comical.
I'm totally with you on the concept of using this "waste heat" to heat people's houses in the winter. I'm thinking a huge data center in the middle and giant apartment buildings around it (a la Sim City).
Maybe legislation mandating the companies build housing around data centers to use the heat or something like that... maybe, I don't know... just a thought (no idea how that would work out).
It seems evil if they just blow all that heat directly into the sky.
I'm glad I'm around 50 years old because this planet is going down faster than I expected. This A.I. shit is just crazy. H200s spewing out heat to read me a chicken soup recipe. A.I. to generate a few extra fingers. A.I. to fabricate things, because I guess it's impossible for A.I. to say "I don't know."
This is a classic case of "who asked for this?" People (not scumbag MBAs who want A.I. to replace their employees) don't want this A.I. bullshit, and I can't wait until the bubble bursts [wikipedia.org] like some .com déjà vu.
As long as these A.I. assholes don't get bailed out like Fannie Mae & Freddie Mac for crying out loud!
P.S. Fuck A.I.!
jasassin@gmail.com GPG Key ID: 0xE6462C68A9A3DB5A
(Score: 2) by VLM on Monday January 13, @01:07PM
There's an EE/ham radio youtuber who does the same thing; he rarely posts videos now, so I don't recall who. He has entire videos on this topic.
I find that enough servers in the basement will drop enough watts to lower the humidity percentage to a nice level while also keeping my basement almost comfy without opening the HVAC vents. Where I live, electricity is cheap enough that it's only a couple of times more expensive than burning natgas, and a couple hundred watts is a lot cheaper of a hobby than something like golf or even serious professional sports viewing. I'm barely paying over a dollar per watt-year and I spend more on hardware than on watts.
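For reference, a dollar per watt-year works out to an electricity rate of roughly 12 cents per kWh; the rate below is an assumed round number used purely to check the arithmetic, not the poster's actual tariff.

```python
# 1 W running continuously for a year, at an assumed 12 cents/kWh.
PRICE_USD_PER_KWH = 0.12
HOURS_PER_YEAR = 24 * 365
cost_per_watt_year = (1 / 1000) * HOURS_PER_YEAR * PRICE_USD_PER_KWH
print(f"${cost_per_watt_year:.2f} per watt-year")  # ~$1.05
```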
I would suggest setting the thermostat to 65F or something to keep backup heat when it's far enough below zero outside.
(Score: 5, Interesting) by VLM on Sunday January 12, @05:06PM
Let's say you have a one-acre data center. Small by some standards. Where I live we always get more than 3 feet of rain per year, often over 4, but to make a pessimistic engineering estimate we'll call it 3 acre-feet of water running off the roof per year. An acre-foot of water is close enough to 326K gallons of water. Evaporating a gallon of boiling water drops about 9200 BTU; we can call it 10K total (rain is much colder than boiling, rough estimate etc).
So evaporating the rooftop rain would take roughly ten billion BTU. Usually that heat comes from sunlight and quite a bit of the water runs off into rivers, of course. Ten billion BTU is close to three million KWH. 2.9e6/(24*365) = meh 330.
So if a one acre data center stored all its rainwater and evaporated it away it could run at a continuous power input for the whole center of roughly 330 KW. Let's say a typical datacenter power density is 1 MW/acre. So handwavy if you don't connect the data center to the public water utility (other than for the company break room, bathrooms, etc) then you need about 3-ish times the rooftop area to gather rainfall. More roof than the building itself gives you, but not ridiculously improbable.
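A tiny script to reproduce the arithmetic; all inputs are the rough estimates stated above, nothing more authoritative.

```python
# Reproduces the hand-wave above using the comment's own rough estimates.
GALLONS_PER_ACRE_FOOT = 326_000     # actual ~325,851
BTU_PER_GALLON_EVAPORATED = 10_000  # latent heat plus warming the cold rain, rounded up
BTU_PER_KWH = 3_412

rain_acre_feet = 3                                # pessimistic annual runoff from a one-acre roof
gallons = rain_acre_feet * GALLONS_PER_ACRE_FOOT  # ~978,000 gallons
btu = gallons * BTU_PER_GALLON_EVAPORATED         # ~9.8 billion BTU absorbed by evaporation
continuous_kw = (btu / BTU_PER_KWH) / (24 * 365)  # ~330 kW of round-the-clock heat rejection

print(f"{gallons:,} gal/yr -> {btu/1e9:.1f} billion BTU -> {continuous_kw:.0f} kW continuous")
```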
AWS operates 38 million sq feet of data center worldwide right now according to their own propaganda. The same peeps claim Amazon the retailer has 320 million sq feet of warehouse just in the USA. More or less it's a reasonable hand-waving estimate that if Amazon stored all the rain water that landed on the rooftops of their warehouses and data centers, they'd have "about enough" to evaporatively cool all their data centers without using any public utility water at all. Assuming their facilities are located on the east side of the Mississippi River and not in a wildfire drought area in LA, CA or the middle of a desert in Utah. Interesting.
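Carrying the same hand-wave through the AWS figures, again with the assumptions above and nothing firmer:

```python
# Roof area available vs. roof area needed, using the figures quoted in this comment.
dc_sqft = 38e6                  # AWS datacenter floor space, per their own claims
warehouse_sqft = 320e6          # Amazon US warehouse space
kw_per_acre_of_rain = 330       # from the estimate above (3 feet of rain per year)
dc_density_kw_per_acre = 1_000  # assumed 1 MW per acre of datacenter

available = (dc_sqft + warehouse_sqft) / dc_sqft       # ~9.4x the DC footprint in roof area
needed = dc_density_kw_per_acre / kw_per_acre_of_rain  # ~3x
print(f"roof available ~{available:.1f}x the DC footprint, ~{needed:.1f}x needed")
```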
Something strange to think about: given infinite electricity, perhaps from nukes, some tropical areas exceed ten feet of rain annually, which improves the ratio quite a bit. Ironically, if you "must" evaporatively cool, you're better off doing it at a very high latitude where the outside air temp is low most of the year, or deep in the south where you get many feet of cooling rain per year; mid-latitudes where I live, where we only get "forty inches" of rain per year, are the worst place to build a DC.
(Score: 2, Interesting) by shellsterdude on Sunday January 12, @07:12PM (1 child)
In my town there is a giant "artistic" fountain that is just a giant tube pointing up at a 45 degree angle pouring water out in a column into a drain. Now, I'm sure some water gets lost in the splashing and evaporating. That said, you have to be pretty dense to assume they are just running water down the drain. In reality it's nearly a closed system, with the water in the "drain" being pumped back through the pipe. I assume datacenters do the same thing. Water goes through the system, through a bunch of radiators to disperse heat, through some sort of filtration system, and then back through the whole system again. So saying that data centers "consume" 4 million gallons of water a day is pretty disingenuous. This kind of reporting comes across as about as genuine as the concern over straws in the central US ending up in sea turtle noses. It's just not an issue. Water conservation is important, but the fact that something requires a fairly large volume of water as an initial fill isn't that big of an issue as long as that initial water is drawn during low demand and stored.
(Score: 0) by Anonymous Coward on Sunday January 12, @08:49PM
If you just spray water mist on your cooling radiators, you can improve their cooling pretty drastically. Evaporating water sucks up a lot of heat. You don't get to recycle that water without throwing away the cheap cooling.