Canadian uranium developer NexGen Energy has held preliminary talks with data centre providers about securing finance for a new mine that could supply fuel for power plants needed for artificial intelligence, its CEO said on Wednesday:
Soaring demand for AI is driving a massive build-out of power-hungry data centres, in turn boosting the need for new generation capacity, including nuclear plants that will require uranium.
To meet that need, NexGen CEO Leigh Curyer said big tech firms will follow the trend set by automakers, who offered finance for battery material mine development several years ago to ensure there was enough supply for an expected boom in demand for electric vehicles.
"It's coming. You've seen it with automakers. These tech companies, they're under an obligation to ensure the hundreds of billions that they are investing in the data centres are going to be powered," he said, speaking at a Melbourne Mining Club event.
NexGen is developing its Rook I uranium project in Saskatchewan and has said it expects to finalise a funding package in the second quarter.
As reported on OilPrice.com:
Global electricity demand increased by 3% in 2025, following growth of 4.4% in 2024, the International Energy Agency (IEA) said in its recent Electricity 2026 report.
Between 2026 and 2030, annual growth is projected to average 3.6%, driven by higher consumption from industry, electric vehicles (EVs), air conditioning, and data centers, according to the agency.
Artificial intelligence, data centers, and advanced manufacturing support the return to growth in power demand in advanced economies, the IEA said.
U.S. electricity demand rose by 2.1% in 2025 and is expected to grow by nearly 2% annually through 2030. The rapid expansion of data centers will drive half of the increase, the agency noted.
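Those annual rates compound over the forecast window. A quick back-of-the-envelope sketch (the 3.6% global and ~2% U.S. rates are the IEA figures quoted above; the cumulative totals are just arithmetic):

```python
# Compound the IEA's projected annual growth rates over 2026-2030
# to estimate cumulative demand growth by 2030, relative to 2025.
def cumulative_growth(annual_rate: float, years: int) -> float:
    """Total fractional growth after compounding an annual rate."""
    return (1 + annual_rate) ** years - 1

global_growth = cumulative_growth(0.036, 5)  # ~3.6%/yr globally, 2026-2030
us_growth = cumulative_growth(0.02, 5)       # ~2%/yr in the U.S. through 2030

print(f"Global demand up ~{global_growth:.0%} by 2030")  # ~19%
print(f"U.S. demand up ~{us_growth:.0%} by 2030")        # ~10%
```

In other words, even the "slower" U.S. rate implies roughly a tenth more generation and grid capacity within five years, with data centers accounting for about half of the U.S. increase.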
Also at ZeroHedge.
Related:
- Microsoft: the Company Doesn't Have Enough Electricity to Install All the AI GPUs in its Inventory
- Data Centers Turn to Commercial Aircraft Jet Engines Bolted Onto Trailers as AI Power Crunch Bites
- OpenAI and Nvidia's $100B AI Plan Will Require Power Equal to 10 Nuclear Reactors
- Meta is Building "Several" Multi-Gigawatt Compute Clusters
Related Stories
Engadget reports that Meta is Building "Several" Multi-Gigawatt Compute Clusters
Meta is building several gigawatt-sized data centers to power AI, as reported by Bloomberg. CEO Mark Zuckerberg says the company will spend "hundreds of billions of dollars" to accomplish this feat, with an aim of creating "superintelligence."
The first center, called Prometheus, is being built in Ohio and comes online next year. Next up is a data center called Hyperion, with a footprint almost the size of Manhattan, which should "be able to scale up to 5GW over several years." Some of these campuses will be among the largest in the world; most data centers draw only hundreds of megawatts.
Meta has also been staffing up its Superintelligence Labs team, recruiting folks from OpenAI, Google's DeepMind and others. Scale AI's co-founder Alexandr Wang is heading up this effort.
However, these giant data centers do not exist in a vacuum. The complexes typically brush up against local communities. The centers are not only power hogs, but also water hogs. The New York Times just published a report on how Meta data centers impact local water supplies.
"This is a giant project," Nvidia CEO Jensen Huang said of the new 10-gigawatt AI infrastructure deal:
On Monday, OpenAI and Nvidia jointly announced a letter of intent for a strategic partnership to deploy at least 10 gigawatts of Nvidia systems for OpenAI's AI infrastructure, with Nvidia planning to invest up to $100 billion as the systems roll out. The companies said the first gigawatt of Nvidia systems will come online in the second half of 2026 using Nvidia's Vera Rubin platform.
"Everything starts with compute," said Sam Altman, CEO of OpenAI, in the announcement. "Compute infrastructure will be the basis for the economy of the future, and we will utilize what we're building with NVIDIA to both create new AI breakthroughs and empower people and businesses with them at scale."
The 10-gigawatt project represents an astoundingly ambitious and as-yet-unproven scale for AI infrastructure. Nvidia CEO Jensen Huang told CNBC that the planned 10 gigawatts equals the power consumption of between 4 million and 5 million graphics processing units, which matches the company's total GPU shipments for this year and doubles last year's volume. "This is a giant project," Huang said in an interview alongside Altman and OpenAI President Greg Brockman.
To put that power demand in perspective, 10 gigawatts equals the output of roughly 10 nuclear reactors, which typically output about 1 gigawatt per facility. Current data center energy consumption ranges from 10 megawatts to 1 gigawatt, with most large facilities consuming between 50 and 100 megawatts. OpenAI's planned infrastructure would dwarf existing installations, requiring as much electricity as multiple major cities.
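The scale figures above can be cross-checked with a few divisions. In this sketch, the 10 GW total, the 4-5 million GPU estimate, the ~1 GW reactor output, and the 100 MW large-facility figure all come from the article; the per-GPU wattage is simply the implied quotient (an all-in number including cooling and overhead, not the chip alone), and the 4.5 million midpoint is an assumption:

```python
# Rough sanity-check of the scale figures quoted above.
total_power_w = 10e9       # 10 GW planned Nvidia deployment
gpu_count = 4.5e6          # assumed midpoint of the 4-5 million GPU estimate
reactor_output_w = 1e9     # ~1 GW per typical nuclear reactor
large_dc_w = 100e6         # upper end of a typical large facility (100 MW)

implied_w_per_gpu = total_power_w / gpu_count           # all-in power budget per GPU
reactor_equivalents = total_power_w / reactor_output_w  # how many reactors this equals
dc_equivalents = total_power_w / large_dc_w             # how many large data centers

print(f"~{implied_w_per_gpu:.0f} W per GPU (incl. overhead)")  # ~2222 W
print(f"~{reactor_equivalents:.0f} reactor-equivalents")       # ~10
print(f"~{dc_equivalents:.0f} large (100 MW) data centers")    # ~100
```

The implied ~2.2 kW per GPU is consistent with modern rack-scale accelerators once cooling and networking overhead are folded in, which is why the 10 GW and 4-5 million GPU figures hang together.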
[...] Bryn Talkington, managing partner at Requisite Capital Management, noted the circular nature of the investment structure to CNBC. "Nvidia invests $100 billion in OpenAI, which then OpenAI turns back and gives it back to Nvidia," Talkington told CNBC. "I feel like this is going to be very virtuous for Jensen."
Cast-off turbines generate up to 48 MW of electricity apiece:
Faced with multi-year delays to secure grid power, US data center operators are deploying aeroderivative gas turbines — effectively retired commercial aircraft engines bolted into trailers — to keep AI infrastructure online.
According to IEEE Spectrum, facilities in Texas are already spinning up units based on General Electric's CF6-80C2 and LM6000, the same turbine cores once found on 767s and Airbus A310s. Vendors like ProEnergy and Mitsubishi Power have turned these into modular, fast-start generators capable of delivering 48 megawatts apiece, enough to support a large AI cluster while utility-scale infrastructure lags.
Fast, loud, and anything but elegant, these "bridging power" units come from vendors like ProEnergy, which offers trailerized turbines built around ex-aviation cores that can spin up in minutes to meet energy demand. Meanwhile, Mitsubishi Power's FT8 MOBILEPAC, which derives from Pratt & Whitney jet engines, delivers a similar output in a self-contained footprint designed for fast deployment.
While this might not be the cheapest, and certainly not the cleanest, way to power racks, it's a viable stopgap for companies racing to hit AI milestones while local substations and modular nuclear power deployments remain years away.
[...] In one of the more visible examples, OpenAI's parent group is deploying nearly 30 LM2500XPRESS units at a facility near Abilene, Texas, as part of its multi-billion-dollar Stargate project. Each unit ramps up to 34 megawatts in under ten minutes, fast enough to cold-start servers.
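As a cross-check, the Abilene numbers land right at the gigawatt scale discussed earlier (unit count and per-unit output are from the article; treating "nearly 30" as exactly 30 is an assumption):

```python
# Aggregate output of the turbine fleet at the Abilene Stargate site.
units = 30        # "nearly 30" LM2500XPRESS turbines, taken as 30
mw_per_unit = 34  # quoted output per unit

total_mw = units * mw_per_unit
print(f"~{total_mw} MW total")  # ~1020 MW, i.e. roughly one gigawatt
```

That is about the size of Stargate's "first gigawatt" of Nvidia systems, which suggests the trailerized turbines are sized to bridge the full site until grid power arrives.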
Also see: Data Centers Look to Old Airplane Engines for Power
Microsoft: the Company Doesn't Have Enough Electricity to Install All the AI GPUs in its Inventory
Microsoft CEO Satya Nadella said during an interview alongside OpenAI CEO Sam Altman that the problem in the AI industry is not an excess supply of compute, but rather a lack of power to accommodate all those GPUs. In fact, Nadella said the company currently does not have enough power to plug in some of the AI GPUs it has in inventory. He said this on the Bg2 Pod YouTube show, when host Brad Gerstner asked whether Nadella and Altman agreed with Nvidia CEO Jensen Huang, who said there is no chance of a compute glut in the next two to three years.
"I think the cycles of demand and supply in this particular case, you can't really predict, right? The point is: what's the secular trend? The secular trend is what Sam (OpenAI CEO) said, which is, at the end of the day, because quite frankly, the biggest issue we are now having is not a compute glut, but it's power — it's sort of the ability to get the builds done fast enough close to power," Satya said in the podcast. "So, if you can't do that, you may actually have a bunch of chips sitting in inventory that I can't plug in. In fact, that is my problem today. It's not a supply issue of chips; it's actually the fact that I don't have warm shells to plug into." [Emphasis added]
Nadella's mention of 'shells' refers to a data center shell, which is effectively an empty building with all of the necessary ingredients, such as power and water, needed to immediately begin production.
AI's power consumption has been a topic many experts have discussed since last year. This came to the forefront as soon as Nvidia fixed the GPU shortage, and many tech companies are now investing in research in small modular nuclear reactors to help scale their power sources as they build increasingly large data centers.
This has already caused consumer energy bills to skyrocket, showing how the AI infrastructure being built out is negatively affecting the average American. OpenAI has even called on the federal government to build 100 gigawatts of power generation annually, saying that it's a strategic asset in the U.S.'s push for supremacy in its AI race with China. This comes after some experts said Beijing is miles ahead in electricity supply due to its massive investments in hydropower and nuclear power.
Aside from the lack of power, they also discussed the possibility of more advanced consumer hardware hitting the market. "Someday, we will make a[n] incredible consumer device that can run a GPT-5 or GPT-6-capable model completely locally at a low power draw — and this is like so hard to wrap my head around," Altman said. Gerstner then commented, "That will be incredible, and that's the type of thing that scares some of the people who are building, obviously, these large, centralized compute stacks."
This highlights another risk that companies must bear as they bet billions of dollars on massive AI data centers. While infrastructure would still be needed to train new models, the data center demand that many estimate will come from the widespread use of AI might not materialize if semiconductor advancements enable models to run locally.
This could hasten the popping of the AI bubble, which some experts like Pat Gelsinger say is still several years away. But if and when that happens, we will be in for a shock as even non-tech companies would be hit by this collapse, exposing nearly $20 trillion in market cap.
(Score: 0) by Anonymous Coward on Friday February 20, @09:06AM (2 children)
Who will just sell to the tech bros in the Land of Lunacy to the south???
(Score: 5, Insightful) by janrinok on Friday February 20, @09:16AM (1 child)
I don't think that many Americans have realised where their power, water, aluminium, uranium and rare earth metals come from. They think that they still hold 'all the cards', whereas they actually have a handful of Jokers.
[nostyle RIP 06 May 2025]
(Score: 5, Funny) by c0lo on Friday February 20, @09:33AM
Do they think the joker is a trump? I've been hearing the reverse for a while.
https://www.youtube.com/@ProfSteveKeen https://soylentnews.org/~MichaelDavidCrawford
(Score: 2) by looorg on Friday February 20, @06:54PM
I for one welcome our new radioactive AI overlords!
(Score: 3, Insightful) by Deep Blue on Friday February 20, @10:44PM
Make the data centers build their own power sources that produce more than they use, and pay for the rest of the power for everyone. This is the one chance to get free electricity for consumers, at least. If the data centers are supposedly such a good thing for the world, then make them at least do some direct good instead of just raising people's electricity bills.
(Score: 4, Interesting) by PinkyGigglebrain on Saturday February 21, @04:58AM
Just want to point out that Thorium can be used in place of Uranium in current solid-fueled LWR reactors.
One advantage is that the resulting fission byproducts have shorter half-lives than Uranium- and Plutonium-based fuels.
Another plus is that Thorium is currently considered more of a waste product of Rare Earth metal mining that is already taking place. A single average Rare Earth mine pulls out enough Thorium as tailings every year to power the ENTIRE world's needs for that year.
Most mine sites already have years worth of Thorium already mined.
"Beware those who would deny you Knowledge, For in their hearts they dream themselves your Master."