The numbers stopped making sense about a year ago.

Microsoft signed a deal to restart a nuclear reactor at Three Mile Island, the same Three Mile Island famous for America's worst nuclear accident, specifically to power AI data centers. Amazon is buying nuclear-powered data center campuses. Google's carbon emissions have climbed 48% since 2019, driven largely by data center growth. OpenAI reportedly draws more electricity than some small countries.

Welcome to the AI boom's dirty secret: it's an energy crisis wearing a software costume.

The Scale of the Problem

Training GPT-4 consumed an estimated 50 gigawatt-hours of electricity. That's roughly what 4,600 American homes use in a year, for a single training run. GPT-5 is reportedly 10x larger. And inference (actually running the models for users) consumes even more energy at scale, because it runs continuously for every query rather than once.
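If you want to sanity-check the household comparison, the arithmetic is simple. This is a back-of-envelope sketch: the ~50 GWh figure is the estimate above, and the ~10,800 kWh/year household average is an assumption in the ballpark of EIA data.

```python
# Back-of-envelope check on the household comparison.
# Assumptions: ~50 GWh training estimate (from above) and ~10,800 kWh/year
# average US household consumption (roughly the EIA figure).
training_energy_kwh = 50 * 1_000_000        # 50 GWh expressed in kWh
household_kwh_per_year = 10_800

homes_for_one_year = training_energy_kwh / household_kwh_per_year
print(f"~{homes_for_one_year:,.0f} homes powered for a year")  # ~4,600
```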

The International Energy Agency projects data center electricity demand will double by 2026. Goldman Sachs estimates AI alone will drive a 160% increase in data center power demand by 2030. These aren't fringe predictions—they're consensus estimates that keep getting revised upward.

The kicker: we're still in the early innings. Current AI usage is a rounding error compared to what happens when every business process gets an AI layer.

The Infrastructure Arms Race

This is why hyperscalers are panicking about power, not compute. Nvidia can ship more GPUs, but you can't ship more electricity.

The response has been a land grab for energy assets. Microsoft's Three Mile Island deal. Amazon's acquisition of a nuclear-powered data center campus from Talen Energy. Google's investments in geothermal and advanced nuclear. These aren't PR exercises—they're survival moves.

The geographic implications are profound. Data centers are migrating toward cheap, abundant power. Northern Virginia—the traditional data center hub—is hitting grid capacity limits. New facilities are sprouting in Texas, Ohio, and anywhere with surplus generation capacity.

Countries with cheap energy are suddenly AI infrastructure plays. The Middle East is building AI hubs powered by natural gas. The Nordics offer cheap hydro. Even Paraguay, with its surplus power from the Itaipu dam, is attracting crypto and AI investment.

The Startup Angle

If you're a founder, here's what this means for you:

Cloud costs are going up. The hyperscalers will pass energy costs through, and inference pricing will rise accordingly. Build your unit economics assuming AI API costs increase, not decrease.
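As a rough illustration, here's a sketch of how a subscription margin erodes when inference prices climb. Every number is hypothetical, plugged in only to show the sensitivity.

```python
# Hypothetical unit economics for a per-seat AI product.
# All numbers are illustrative, not benchmarks.
revenue_per_user = 20.00           # monthly price per seat
tokens_per_user = 2_000_000        # monthly tokens consumed per seat
api_price_per_m_tokens = 3.00      # blended API price today, per million tokens
other_cost_per_user = 4.00         # hosting, support, payment fees, etc.

for hike in (0.00, 0.25, 0.50, 1.00):            # possible API price increases
    api_cost = tokens_per_user / 1e6 * api_price_per_m_tokens * (1 + hike)
    margin = revenue_per_user - api_cost - other_cost_per_user
    print(f"API price +{hike:.0%}: margin ${margin:.2f} "
          f"({margin / revenue_per_user:.0%} of revenue)")
```

At these made-up numbers, a doubling of API prices takes the margin from 50% of revenue to 20%. That's the kind of swing worth stress-testing before it happens.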

Efficiency is a competitive advantage. Models that achieve similar results with less compute win. Techniques like quantization, distillation, and efficient inference aren't just nice-to-haves—they're strategic necessities.
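To make that concrete, here's a minimal post-training quantization sketch using PyTorch's dynamic quantization. The toy model is a stand-in for a real checkpoint, and the actual savings (and the accuracy cost) have to be measured on your own workload.

```python
import torch
import torch.nn as nn

# Toy stand-in for a real model checkpoint.
model = nn.Sequential(
    nn.Linear(4096, 4096),
    nn.ReLU(),
    nn.Linear(4096, 4096),
)

# Post-training dynamic quantization: Linear weights are stored as int8,
# roughly quartering their memory footprint and cutting CPU inference cost.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 4096)
print(quantized(x).shape)  # same interface, cheaper to serve
```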

Infrastructure plays are emerging. Energy arbitrage, cooling technology, and power grid optimization are suddenly venture-scale opportunities. Companies like Crusoe Energy (using stranded natural gas to power AI training) represent a new category that didn't exist five years ago.

The Environmental Elephant

Let's talk about the part that makes everyone uncomfortable: AI's carbon footprint is massive and growing.

Yes, the hyperscalers have net-zero commitments. Yes, they're investing heavily in renewables. But the gap between AI's energy demand growth and renewable buildout is widening, not narrowing. In practice, marginal AI compute is often powered by natural gas—the dirtiest scalable option available.

The PR narrative says "AI will help solve climate change through optimization." Maybe. But right now, AI is a significant contributor to the problem it's supposed to solve.

This creates real risk for AI companies. Regulatory pressure on energy-intensive industries is increasing. Carbon pricing is expanding globally. A future where AI training requires carbon offset purchases—adding materially to costs—isn't far-fetched.
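A rough sketch of what that could look like, using the training-run estimate from earlier and assumed values for grid carbon intensity and carbon price (both are illustrative, not forecasts):

```python
# Illustrative only: layering a carbon price onto a large training run.
training_energy_kwh = 50 * 1_000_000      # the ~50 GWh estimate from above
grid_kg_co2_per_kwh = 0.4                 # assumed gas-heavy grid mix
carbon_price_per_tonne = 100.0            # assumed USD per tonne of CO2

emissions_tonnes = training_energy_kwh * grid_kg_co2_per_kwh / 1_000
offset_cost = emissions_tonnes * carbon_price_per_tonne
print(f"~{emissions_tonnes:,.0f} tCO2 -> ~${offset_cost:,.0f} in offsets")
# ~20,000 tCO2 -> ~$2,000,000 per run under these assumptions
```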

The Water Problem

Energy gets the headlines, but water might be the harder constraint.

Data centers require massive cooling, and evaporative cooling, the most efficient method, consumes enormous quantities of water. Microsoft reportedly used about 700,000 liters of water to train GPT-3, and GPT-4 likely required substantially more.

In water-stressed regions like the American Southwest, data center expansion is running into local opposition. Some facilities are switching to air cooling, which works but uses more energy—creating a nasty tradeoff.

What Happens Next

The optimistic scenario: efficiency improvements outpace demand growth. New model architectures (like Mixture of Experts) deliver better results with less compute. A nuclear renaissance provides abundant clean power. Water-efficient cooling becomes standard.

The pessimistic scenario: demand growth wins. AI becomes a major strain on power grids, pushing electricity prices up for everyone. Water conflicts intensify. Carbon footprints balloon despite green pledges. Regulatory backlash forces constraints on AI development.

The likely scenario: somewhere in between. AI infrastructure buildout continues aggressively, with genuine progress on efficiency but not enough to offset demand. Energy becomes an ongoing constraint that shapes which companies can compete and which can't.

The AI boom is eating everything—including the grid that powers it. Founders who understand this aren't just building better models. They're building businesses that can survive when power becomes the limiting factor.

Because it will.