
Nvidia’s $26B AI bet collides with fragile grids
Nvidia’s $26B push into open‑weight AI and gigawatt‑scale deals like its pact with Thinking Machines Lab expose a new choke point: power, water and grid access.
Nvidia’s plan to pour $26 billion into building its own family of open‑weight AI models, revealed in recent securities filings and detailed by Wired, lands just as power grids and water systems start to creak under the weight of the AI boom. At the same time, Mira Murati’s Thinking Machines Lab has locked in at least a gigawatt of Nvidia’s next‑generation Vera Rubin systems in a multiyear deal, putting a single AI startup on par with the electricity draw of hundreds of thousands of homes.
According to Wired, Nvidia intends to spend roughly $26 billion over several years to develop open‑weight models it can license broadly, positioning itself not just as the chip supplier to AI companies but as a frontline model provider in its own right. The model push rides on an infrastructure buildout that is already staggering in scale: a gigawatt is roughly enough power for 750,000 U.S. homes, the comparison used in Bloomberg’s coverage of the Thinking Machines deal. The companies say the Rubin systems will start coming online early next year.
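For readers who want to sanity-check that comparison, the sketch below runs the arithmetic. The per-home consumption figure is an assumption on our part (roughly 10,800 kWh per year, a commonly cited average for U.S. households) and is not taken from the article or from Bloomberg's reporting.

```python
# Back-of-envelope check of the "1 GW is roughly 750,000 U.S. homes" comparison.
# ASSUMPTION (not from the article): average U.S. household electricity use
# of about 10,800 kWh per year.

GIGAWATT_KW = 1_000_000            # 1 GW expressed in kilowatts
HOURS_PER_YEAR = 8_760
AVG_HOME_KWH_PER_YEAR = 10_800     # assumed average household consumption

# Energy a 1 GW load would draw if it ran continuously for a year
gw_kwh_per_year = GIGAWATT_KW * HOURS_PER_YEAR

homes_equivalent = gw_kwh_per_year / AVG_HOME_KWH_PER_YEAR
print(f"1 GW running flat out covers about {homes_equivalent:,.0f} homes")  # ~811,000

# The 750,000-home figure implies a slightly higher per-home assumption,
# about 11,700 kWh per year, or an average draw of roughly 1.3 kW.
implied_kwh = gw_kwh_per_year / 750_000
print(f"Per-home use implied by the 750,000 figure: {implied_kwh:,.0f} kWh/year")
```

Either assumption lands in the high hundreds of thousands of homes, so the order of magnitude in the reporting holds.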
Compute as a power and water problem
Those numbers are no longer abstractions. The Biden administration’s January 2025 executive order on AI data centers directed federal agencies to support large AI facilities on federal land and study their impact on energy prices and reliability, while encouraging renewable power, as summarized in coverage collated on Wikipedia and a later AP report. The Department of Energy has since identified 16 sites, including Los Alamos and Oak Ridge, as prime locations for co‑located data centers and new generation that can be fast‑tracked through permitting, AP News reported.
Regulators are also rewriting grid rules to accommodate AI. A recent unanimous order by the U.S. Federal Energy Regulatory Commission (FERC) is designed to speed up so‑called colocation deals that let data centers plug directly into power plants in the country’s largest grid territory, from the mid‑Atlantic to the Midwest, according to AP News. Analysts at policy outlet Techno Statecraft warn that the cumulative effect is a grid that increasingly defaults to serving large corporate AI customers first, leaving households and smaller businesses to absorb more of the cost and reliability risk.
Water is emerging as a second invisible constraint. A recent academic study on data‑center water use found that, if current patterns persist, U.S. facilities could require between 697 million and 1,451 million gallons of new water capacity per day by 2030 — comparable to New York City’s average daily water supply — with AI‑heavy workloads a major driver, according to researchers on arXiv. During Europe’s 2022 heatwave, UK operators resorted to hosepipes to keep facilities cool as temperatures topped 40°C, an early glimpse of what more frequent heat extremes could mean for local water systems, as reported by Data Center Dynamics.
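To put the study's range in context, the short sketch below compares it with New York City's supply. The NYC figure is our assumption, roughly 1 billion gallons per day, the order of magnitude the city's water utility commonly cites, not a number from the study itself.

```python
# Rough sense of scale for the projected 2030 data-center water demand.
# ASSUMPTION (not from the article): NYC's average daily water supply is on
# the order of 1 billion gallons per day.

NYC_GALLONS_PER_DAY = 1_000_000_000                 # assumed NYC daily supply
STUDY_LOW, STUDY_HIGH = 697_000_000, 1_451_000_000  # 2030 range cited by the study

low_share = STUDY_LOW / NYC_GALLONS_PER_DAY
high_share = STUDY_HIGH / NYC_GALLONS_PER_DAY
print(f"Projected new demand: {low_share:.0%} to {high_share:.0%} of NYC's daily supply")
# -> roughly 70% to 145%, which is why the "comparable to New York City" framing holds
```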
Governments pick winners on grid access
Governments are now explicitly prioritizing AI data centers over other uses of finite grid capacity. In the UK, the Labour government’s AI Growth Zones will receive fast‑tracked planning approvals and “priority access to any available energy grid capacity,” effectively letting AI hubs jump the queue ahead of housing and smaller industry, according to Computer Weekly and earlier Engadget reporting. In the U.S., a 2024 letter from Senator Sheldon Whitehouse and colleagues warned the White House against any move that would give AI data centers priority over “households and other industries” on the grid, citing potential violations of clean air and water standards, as documented on Sen. Whitehouse’s site.
The construction sector is already feeling the pinch. A recent analysis from environmental policy outlet EnviroLink reports that fast‑track rules and grid priority for AI facilities are making it harder to connect new residential developments in some regions, as utilities allocate scarce capacity to lucrative hyperscale deals.
For Nvidia, the convergence of these forces is as much strategic as technical. The $26 billion open‑weight push will lock in demand for its chips and cloud partners just as grid‑connected, gigawatt‑scale campuses become the gating factor for which labs can train the next wave of frontier models, Wired argues. That leaves policymakers with a blunt question: who gets to decide which models are worth a city’s worth of power and water, and on what terms communities, not just shareholders, get a say.