
How Your Utility Bills are Subsidizing Power-Hungry AI

Accepted submission by upstart at 2025-09-24 08:54:39
News


How Your Utility Bills Are Subsidizing Power-Hungry AI [techpolicy.press]:

This summer, across the Eastern United States, home electricity bills have been rising. From Pittsburgh to Ohio, people are paying $10 to $27 more per month for electricity [washingtonpost.com]. The reason? The rising cost of powering data centers running AI. As providers of the largest and most compute-intensive AI models keep adding them to more and more aspects of our digital lives with little regard for efficiency (and without giving users much of a choice), they claim an ever-larger share of existing energy and natural resources, driving up costs for everyone else.

AI data centers are expanding—but who’s paying for it?

In particular, this means that average citizens living in states that host data centers bear the cost of these choices, even though they rarely reap any benefits themselves. This is because data centers are connected to the whole world via the Internet, but use energy locally, where they’re physically located. And unlike the apartments, offices, and buildings connected to a traditional energy grid, the energy use of AI data centers is highly concentrated; think of a site the size of a small warehouse drawing as much power as an entire metal-smelting plant. For example, the state of Virginia is home to 35% of all known AI data centers worldwide [vedp.org], and together they use more than a quarter of the state’s electricity. And they’re expanding fast [iea.org] — over the last seven years, global energy use by data centers has grown 12% a year, and it’s set to more than double by 2030, when data centers will use as much electricity as the whole of Japan.

The costs of this brash expansion of data centers for AI show up first and foremost in the energy bills of everyday consumers. In the United States, utility companies fund infrastructure projects by raising rates across their entire client base (who often have no choice in who provides them with electricity [bloomberglaw.com]). These increased rates pay for expanding the grid to connect new data centers to new and existing energy sources, and for mechanisms to keep the grid balanced despite the increased ebb and flow of supply and demand, particularly in places like Virginia that have a high concentration of data centers. On top of amortizing the base infrastructure cost, electricity prices fluctuate with demand, which means the cost of keeping your lights on or running your AC rises when data centers on the same grid are drawing heavily.

These costs also come with dire consequences for the stability of an energy infrastructure already stretched to the breaking point by rising temperatures and extreme weather. Last summer, a lightning storm caused a surge protector to fail near Fairfax, Virginia, prompting 200 data centers to switch to local generators [reuters.com] and demand on the local grid to drop sharply. The event nearly caused a grid-wide blackout and, for the first time, led federal regulators to recognize data centers as a new source of instability in power supplies, on top of natural disasters and accidents.

What do these data centers even do?

Needing more data centers in an increasingly digitized world is not surprising; it’s not even necessarily a bad thing, as long as we keep track of our ability to sustainably meet the increased demands on common resources, and have some commonly agreed standards for what constitutes a reasonable cost-benefit trade-off. Unfortunately, most current development and deployment of AI skips this latter part entirely, due in great part to a general tendency of providers to obfuscate the costs while overplaying the scale and distribution of the benefits [ft.com], which makes that assessment particularly difficult.


While we do still have a long way to go in better understanding how different aspects of these costs fit together, we do have some meaningful pieces we can start putting together. First, given a desired use of AI, efficiency considerations matter very much, often by several orders of magnitude — for tasks like image and video generation, specific development choices can make the same task up to 50 times more expensive [huggingface.co]. For applications built on large language models, we now have billion-parameter models [huggingface.co] (yes, these are the small ones) that can perform similarly to their trillion-parameter counterparts on a growing range of tasks, especially with a little extra adaptation work before deployment in a particular use case, using techniques like knowledge distillation. [huggingface.co]
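To make the adaptation step above concrete, here is a minimal sketch of knowledge distillation in PyTorch: a small "student" network is trained to match the softened output distribution of a larger "teacher." The tiny architectures, the temperature, and the random data are illustrative placeholders, not details drawn from the article or the linked pages.

    # Illustrative knowledge-distillation loop: the student mimics the teacher's
    # softened output distribution instead of learning from hard labels alone.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    torch.manual_seed(0)
    teacher = nn.Sequential(nn.Linear(128, 512), nn.ReLU(), nn.Linear(512, 10))  # stand-in "large" model
    student = nn.Sequential(nn.Linear(128, 32), nn.ReLU(), nn.Linear(32, 10))    # much smaller model
    optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

    temperature = 2.0              # softening exposes more of the teacher's knowledge than top labels
    inputs = torch.randn(64, 128)  # dummy batch standing in for real task data

    for step in range(100):
        with torch.no_grad():
            teacher_logits = teacher(inputs)
        student_logits = student(inputs)
        # Classic distillation objective: KL divergence between softened distributions
        loss = F.kl_div(
            F.log_softmax(student_logits / temperature, dim=-1),
            F.softmax(teacher_logits / temperature, dim=-1),
            reduction="batchmean",
        ) * temperature ** 2
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()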

Sadly, efficiency isn’t incentivized in the current state of AI. When a large company wants to deploy AI systems for billions of users, even a few visible cases of a slightly less capable model underperforming can pose a huge reputational risk in a competitive market. So it’s simpler to just deploy the most powerful and expensive model everywhere, even if that means 100 or 1,000 times the compute and energy expenditure.

And currently, all companies want to be the first to introduce more and more generative AI “capabilities” into new and existing digital products and services, which use significantly more energy than previous generations that relied on simpler models and approaches. Work [acm.org] that we published last year found that the difference in energy use between task-specific models and multi-purpose models for tasks like extractive question answering can be up to 30-fold, which adds up, given the billions of people who use AI tools every day. And despite improvements in hardware and software efficiency, overall energy use is still rising.
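As a rough illustration of that task-specific versus multi-purpose gap, the sketch below runs the same extractive question-answering job two ways with the Hugging Face transformers library: once with a small model fine-tuned for QA, and once by prompting a general-purpose generative model. The model names are commonly available examples chosen for the sketch, not the systems measured in the cited paper, and the energy gap itself would have to be measured, not inferred from this code.

    # Same question answered by a small task-specific model and by a
    # general-purpose generative model (model names are illustrative examples).
    from transformers import pipeline

    context = "The state of Virginia hosts a large share of the world's AI data centers."
    question = "Which state hosts a large share of the world's AI data centers?"

    # Small extractive QA model (tens of millions of parameters)
    qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")
    print(qa(question=question, context=context)["answer"])

    # General-purpose generative model prompted to do the same job; production
    # systems are orders of magnitude larger than this small stand-in.
    generator = pipeline("text-generation", model="Qwen/Qwen2.5-0.5B-Instruct")
    prompt = f"Context: {context}\nQuestion: {question}\nAnswer:"
    print(generator(prompt, max_new_tokens=20)[0]["generated_text"])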

How are US states and other countries handling this problem?

While recent decisions at the federal level in the US have been more aligned with a “full speed ahead” approach to data center infrastructure, states such as Ohio [powermag.com] and Georgia [datacenterfrontier.com] are passing laws that would put the onus on data centers, not consumers, to pay the cost of new investments to expand the power grid. Countries such as the Netherlands [datacenterdynamics.com] and Ireland [simmons-simmons.com] have gone a step further, putting moratoriums on the construction of new data centers in key regions until grid operators can stabilize existing grids and make sure that they don’t get overwhelmed.

But we still need to rethink our relationship with multi-purpose, generative AI approaches. At a high level, we need to move away from treating AI as a centrally developed commodity, where developers must push adoption across the board to justify rising costs in a vicious cycle that leads to ever costlier technology, and toward AI that is developed based on specific demands, using the right tool and the right model to solve real problems at a reasonable cost. This would mean choosing smaller, task-specific models [huggingface.co] for tasks like question answering and classification, using and sharing open-source models [huggingface.co] to allow incremental progress and reuse of models by the community, and incentivizing the measurement and disclosure of models’ energy consumption using approaches like AI Energy Score [huggingface.co].
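For the measurement-and-disclosure piece, a simple starting point is to wrap an inference workload in an energy/emissions tracker. The sketch below uses the codecarbon package, a general-purpose tracker rather than the AI Energy Score methodology linked above; the model name and workload are placeholders.

    # Estimate the energy/carbon footprint of a batch of inferences with codecarbon
    # (a general-purpose tracker; not the AI Energy Score tooling itself).
    from codecarbon import EmissionsTracker
    from transformers import pipeline

    classifier = pipeline("text-classification",
                          model="distilbert-base-uncased-finetuned-sst-2-english")

    tracker = EmissionsTracker(project_name="small-model-inference")
    tracker.start()
    for _ in range(1000):
        classifier("Utility bills are rising across the Eastern United States.")
    emissions_kg = tracker.stop()  # estimated kg of CO2-equivalent for the run
    print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")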

The next few years will be pivotal for determining the future of AI and its impact on energy grids worldwide. It’s important to keep efficiency and transparency at the heart of the decisions we make. A future in which AI is more decentralized, efficient and community-driven can ensure that we are not collectively paying the price for the profits of a few.

How AI and data center growth drive copper demand in the US [fastmarkets.com]:

The growing demand for copper [fastmarkets.com] is no secret: over the last year, the need for copper — as well as the push for more domestically produced copper in the US — has been widely discussed among market participants, major companies, the current administration [fastmarkets.com] and even the general public. Copper demand in data centers is also increasing rapidly. The country, and the world, need copper, and in the next ten years they’re only going to need more.

At the same time, conversations surrounding sustainability and energy needs in the US have changed over the last year, with the focus shifting from electric vehicles (EVs) to artificial intelligence (AI). And with this rise comes heightened demand for data centers — where copper plays a huge part.

According to Fastmarkets analysts’ [fastmarkets.com] latest 10-year outlook report [fastmarkets.com], published in May, copper consumption from energy transition sectors will rise at a compound annual growth rate (CAGR) of 8.9% over the next 10 years — including 10.4% for the EV sector, 6.8% for the solar power industry and 7.8% for the wind power industry — while consumption from traditional non-energy-transition sectors will rise at a CAGR of 1.1%. A recent report by Macquarie also estimated that between 330,000 and 420,000 tonnes of copper will be used in data centers by 2030 [fastmarkets.com].
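For a sense of scale, a compound annual growth rate compounds multiplicatively, so the quoted rates imply roughly the following cumulative growth over the ten-year horizon (simple arithmetic on the figures above, not additional forecast data):

    # Cumulative growth implied by a compound annual growth rate (CAGR)
    def cumulative_growth(cagr: float, years: int) -> float:
        """Total multiple after `years` of compounding at `cagr` (0.089 means 8.9%)."""
        return (1 + cagr) ** years

    for label, rate in [("energy transition sectors", 0.089), ("EV sector", 0.104),
                        ("solar power", 0.068), ("wind power", 0.078),
                        ("traditional sectors", 0.011)]:
        print(f"{label}: about {cumulative_growth(rate, 10):.2f}x over 10 years")

At 8.9% a year, for instance, that works out to roughly 2.3 times today’s consumption from energy transition sectors by the end of the decade.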

How rising energy needs increase copper demand in data centers

Fastmarkets spoke separately with Don Leavens [nema.org], chief economist and senior vice president of the National Electrical Manufacturers Association (NEMA) [makeitelectric.org], and Adam Kotrba [copper.org], flat products director of the Copper Development Association (CDA) [copper.org], about the role of copper in data center growth in North America’s evolving digital infrastructure.

Copper: A crucial element in data centers

According to the CDA, copper demand in North America’s data center sector is being driven by several converging factors: the pace of new construction is fast, pushing demand higher, and the data centers themselves require large volumes of copper for power distribution and grounding.

Leavens said the data centers themselves are the biggest factor: an estimated 30-40% of data center construction involves electrical work, all of which contains copper.

“Copper is key to anything electrical because of its ability to conduct electricity efficiently. It’s used in almost every application where we’re transmitting electricity. And for data centers that have a particularly high demand for energy and electricity, it comes through in a variety of ways,” Leavens told Fastmarkets on August 27.


“First of all, you need to get power from the generators and utilities, and that has to come over the wires and through transmission distribution. The distribution has to take that power to the data center, so that’s part of construction,” Leavens said. Additionally, transformers and switchgear, all of which have copper embedded in them, bring power into the data center.

But one of the biggest factors in these centers is the adoption of liquid cooling technologies, which rely on copper cooling plates on each computer chip. According to a 2024 study conducted by NEMA, close to 30% of energy demand was related to cooling.

Artificial intelligence is driving unprecedented copper use in data centers

“Particularly now with the rise of AI, we are using more energy, which means there’s more heat… which means you need more cooling,” Leavens said.

“Newer data centers — especially those built for artificial intelligence — are designed to handle far greater power loads, which means even more copper is needed to safely manage, transmit and ground electricity,” Kotrba told Fastmarkets on Wednesday, September 3.

“Together, rapid capacity growth, higher power intensity and advanced cooling methods are making copper more critical than ever to the data center industry,” Kotrba said.

Leavens also noted that when the US government calculates the data for construction, it is only looking at what is permanent in the building, which includes the shell of the building and the wires within. Since the servers are not permanent, they are not included in the same construction statistics.

“So, it’s a little confusing how the government records it,” Leavens said.

A shift in focus from EVs to AI

Just a year ago, the focus across US industries was EVs, which Fastmarkets saw widely discussed in 2024 and early 2025. But recently, as the use of AI has skyrocketed, conversations have shifted in this direction as well.

“AI from now to 2035 is the big story; it’s growing very fast. A year ago, we were saying 2035 would be the pickup in EVs. Now AI is the bigger story… at least in the US,” Leavens told Fastmarkets, noting that this is not necessarily the case in other parts of the world.

And the CDA told Fastmarkets that while EVs currently account for more copper use globally, the trajectory of US data center expansion is remarkable.

“Depending on how EV sales trend in the next few years, domestic data center demand could rival or even surpass EVs when it comes to copper consumption,” Kotrba said.

“Even advocates of EVs are changing their language and how they talk about it,” Leavens told Fastmarkets, saying that EVs will continue to evolve in a free market setting, rather than being subsidized. “[They] just won’t have that extra turbo lift from subsidization,” he said.

According to a National Mining Association (NMA) [nma.org] report published on August 19, copper consumption is expected to increase by more than 110% by 2050, while US energy demand is projected to increase by as much as 30% over the same period, and the AI industry is projected to reach trillions of dollars in market value in just the next few years.

“Artificial intelligence is rapidly transforming industries worldwide… its ability to rapidly analyze data, learn from patterns and make real-time decisions is driving unprecedented growth and innovation,” Rich Nolan [nma.org], NMA’s chief executive officer, wrote in an August 19 report.

Sustainability goals reshape copper demand in data centers

As the need for copper grows and data centers expand, so does the potential for innovation, sources said. Sustainability is a key factor in the copper business and the energy industry, especially globally.

Sustainability goals are reshaping how data centers think about copper procurement, Kotrba said, with operators increasingly focused on sourcing copper products with higher recycled content, underscoring the importance of accurately quantifying and verifying those amounts.

“This push for transparency not only supports corporate sustainability commitments, but it also drives demand for recycling infrastructure that keeps copper in circulation,” Kotrba said.

On the topic of recyclability, Leavens said, “our companies are facilitating that recyclability through design of products. We recognize that if you bury the copper where it’s difficult to get it out of the device, you’re not making it as sustainable as it could be; so they’re thinking ahead [about the] end-of-life of this product: how easy could it be to take it apart and pull the copper out.”

Kotrba agreed that recyclability is one of the most important innovations in the data center industry, citing similar considerations for end-of-life recovery that make it easier to separate copper from other materials and ensure it can be reclaimed at high rates.

“Since copper is infinitely recyclable without losing performance, these design improvements are helping the industry capture even more of its value and keep it in circulation for generations to come,” Kotrba told Fastmarkets.

“This is a global market, and sustainability is big globally. You have to sell to a global market and your products have to be looking at that,” Leavens told Fastmarkets.

Keeping up with demand

Just as there are timing concerns with mining projects, which take a minimum of ten years to come online, data centers are also experiencing a lag in timing. But rather than being due to copper supply itself, this is largely on account of manufacturing bottlenecks, especially in the supply of transformers, which are hand-wound in a process that cannot be automated, and switchgear.

“I think the biggest hitch has been a steady supply of transformers. Some have a backlog of two to five years, so when you say that to a hyperscaler who wants to get this built tomorrow… they’re not happy. We can get our hands on raw copper, but it takes a while to make those derivatives. And price volatility has impacted the equipment,” Leavens told Fastmarkets.

“These challenges highlight the need for better planning, stronger domestic manufacturing and smart policy support to keep critical projects on schedule,” Kotrba said.

Leavens said that price volatility affects the market because it is cyclical, so market participants try to hedge by stocking up. It can be difficult to plan for projects up to five years out, and volatility in the copper price can have a real, material impact in the form of cost overruns.

On February 25, US President Donald Trump announced a Section 232 investigation into copper due to the critical nature of the material for national security, infrastructure and energy.

During this period of uncertainty in the market, Fastmarkets’ assessment of the copper grade 1 cathode all-in price, ddp Midwest US [fastmarkets.com] reached an all-time high of $5.875-5.895 per lb on July 23. And Fastmarkets’ weekly assessment of the copper scrap No1 copper, discount, buying price, delivered to brass mill US [fastmarkets.com] reached $(1.15)-(1.10) per lb on July 16.

Since Trump’s surprise announcement of the exclusion of refined and scrap copper [fastmarkets.com], market sentiment has calmed, with the all-in price last assessed at $4.4305-4.5505 per lb on September 8. Meanwhile, the copper scrap discount was last assessed at $(0.29)-(0.24) per lb on September 3.

“With copper expected to be included on the final 2025 [US Geological Survey] Critical Minerals [fastmarkets.com] list, we have an opportunity to streamline those processes and unlock more domestic capacity. The copper is here in the United States — we simply need the policies and permitting reforms to access it and meet the nation’s growing demand,” Kotrba said.

“Copper is a vital part of every economy globally,” Leavens told Fastmarkets. “I hear so much about the security risk for copper — it’s a critical material now, and the government recognizes it as that. We need it; but also there are huge sources of it around, through stockpiles or friendly countries such as Chile or Australia. As an economist, I’m just not concerned that there would be an absolute shutdown like you saw with oil in the 1970s,” Leavens said.

Regional power challenges highlight copper demand in data centers

Data centers are highly concentrated in a few US states and, according to Leavens, this factors into the problem from an energy-demand perspective, as they’re not spread out evenly. With higher concentrations of data centers comes increased electricity use and higher costs, and power availability is only part of the picture: the areas must also be attached to the internet pipeline, which much of the Midwest is not.

“You want to be close to a power source and close to the pipeline, and so that’s very limiting,” Leavens said.

“Electricity prices in the US have been hitting the headlines, with some laying the blame at the feet of the AI rollout and data center construction,” Olivia Cross [capitaleconomics.com], an economist with Capital Economics Climate and Commodities [capitaleconomics.com], wrote in a report on September 3.

But despite the headlines, Cross said that currently there is “little evidence that data centres are pushing up electricity prices in a meaningful way,” with the hotspot states, such as Virginia, that Leavens mentioned not yet experiencing higher electricity prices or faster-than-average price increases compared with the rest of the country.

But “the lack of a clear relationship between data centers and electricity inflation is probably because one of the criteria determining where data centers are built is access to cheap electricity, often due to plentiful supply in local power markets,” Cross wrote.

“As the rapid construction of new data centers eats away at slack in power markets, upward pressure on electricity prices could become more noticeable,” she said.

But one thing is for sure: right now in the US, the demand that the rise of AI puts on data centers, electricity and copper is only going up.


Journal Reference:
DOI: https://dl.acm.org/doi/10.1145/3630106.3658542 [doi.org]


Original Submission