America’s Next Industrial Geography: Wherever the Power Is Cheap

On Capitol Hill this week, five Democratic senators accused the Trump administration of “sweetheart deals with Big Tech” that have “driven up power bills for ordinary Americans.” 

Their letter, addressed to the White House, faulted the administration for allowing data-center operators to consume “massive new volumes of electricity without sufficient safeguards for consumers or the climate.”

But the senators’ complaint points to a deeper reality neither party can ignore: artificial intelligence is changing America’s energy economy faster than policy can adapt. Every conversation with ChatGPT, every AI-generated image, every search query now runs through vast new physical infrastructure — data centers — that consume more electricity than some nations. 

The world’s appetite for digital intelligence is colliding with its appetite for cheap, reliable power. 

A New Industrial Landscape 

The anonymous-looking gray boxes—bigger than football fields—rising across Virginia, Texas, and the Arizona desert look like nothing special from the highway. Inside, however, they house the machinery of the new economy: tens of thousands of high-end processors performing trillions of calculations per second. These are the “intelligence factories,” where neural networks are trained, deployed, and refined — and where America’s energy system is pushed to its limits and beyond. 

“People talk about the cloud as if it were ethereal,” energy analyst Jason Bordoff said recently. “But it’s as physical as a steel mill — and it runs on megawatts.” 

According to the Pew Research Center, US data centers consumed about 183 terawatt-hours (TWh) of electricity in 2024 — some 4 percent of total US power use, and about the same as Pakistan. By 2030, that figure could exceed 426 TWh, more than double today’s level. The International Energy Agency (IEA) warns that, worldwide, data-center electricity demand will double again by 2026, growing four times faster than total global power demand. 

The driver is artificial intelligence. Training and running large language models (LLMs) like the ones behind ChatGPT requires enormous computing clusters powered by specialized chips — notably Nvidia’s graphics processing units (GPUs). Each new generation of AI systems multiplies power requirements. OpenAI’s GPT-4 reportedly demanded tens of millions of dollars’ worth of electricity just to train. Multiply that by hundreds of companies now racing to build their own AI models, and the implications for the grid are staggering. 

Where the Power Is Going 

The epicenter of this build-out, in America and worldwide (for now), remains Loudoun County, Virginia — nicknamed “Data Center Alley” — where nearly 30 percent of the county’s electricity now flows to data facilities. Virginia’s utilities estimate that data centers consume more than a quarter of the state’s total generation.

Elsewhere in America, the story is similar. Microsoft’s burgeoning data center complex near Des Moines has forced MidAmerican Energy to accelerate new natural-gas generation. Arizona Public Service now plans to build new substations near Phoenix to serve a cluster of AI facilities; Texas grid operator ERCOT says data centers will add 3 gigawatts of demand by 2027. 

And the trend isn’t limited to electricity. Most facilities require water for cooling. A single “hyperscale” campus can use billions of gallons per year, prompting local backlash in drought-prone regions.

The Political Blame Game 

Soaring demand has begun to translate into electric-rate filings. US utilities asked for $29 billion in rate increases in the first half of 2025, nearly double the total for the same period last year. Executives cite “data-center growth and grid reinforcement” as drivers. 

And so, we get the letter from Senate Democrats — among them Elizabeth Warren and Sheldon Whitehouse — urging the Department of Energy to impose “efficiency standards” and “consumer protections” before authorizing new power contracts for AI operators. “We cannot allow Silicon Valley’s hunger for compute to be fed by higher bills in the heartland,” they wrote. 

The Trump administration shot back. Press Secretary Karoline Leavitt said, “The president will not let bureaucrats throttle America’s leadership in AI or its supply of affordable energy. If the choice is between progress and paralysis, he chooses progress.” 

That framing, “progress versus paralysis,” captures the larger divide. The administration has prioritized energy abundance, reopening leasing on federal lands, greenlighting LNG export terminals, rolling back environmental restrictions of all kinds, and signaling renewed support for coal and nuclear power. Democrats, fixated on climate commitments, have continued to oppose expanded drilling in Alaska’s Arctic and new offshore projects, while pressing for data centers to run on renewables. 

Powering the AI Boom 

Without continuous electricity, the AI boom falters. Nvidia, Microsoft, and OpenAI are already pushing the limits of available capacity. In April, Microsoft confirmed it will buy power from the planned restart of the Three Mile Island Unit 1 reactor — mothballed since 2019 — to feed its growing data-center fleet in Pennsylvania. “We’re essentially connecting a small city’s worth of demand to the grid,” said an energy executive involved in the project. “Data centers are an order of magnitude larger than anything we’ve built for before.” 

That “small city” reference is not an exaggeration. A single hyperscale facility can draw 100 megawatts — roughly the load of 80,000 households. Dozens of such projects are under construction. 
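The 100-megawatt comparison holds up under a simple back-of-envelope check. Assuming an average US household uses roughly 10,800 kilowatt-hours per year (a commonly cited national average, and an assumption here rather than a figure from the article):

```python
# Back-of-envelope check: how many average US households equal the
# continuous draw of a 100 MW hyperscale facility?
# Assumption: ~10,800 kWh/year per household (approximate US average).

facility_mw = 100
household_kwh_per_year = 10_800
hours_per_year = 8_760

# Average continuous household load in kilowatts (~1.23 kW).
avg_household_kw = household_kwh_per_year / hours_per_year

# Facility load in kW divided by per-household load.
households = facility_mw * 1_000 / avg_household_kw

print(f"{households:,.0f} households")  # on the order of 80,000
```

Small shifts in the assumed household average move the result a few thousand households either way, but the order of magnitude — a mid-sized city per facility — is robust.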

And while the industry’s largest players are also buying wind and solar power contracts, they admit that renewables alone cannot meet the 24-hour load. “When the model is training, you can’t tell it to pause because the sun set,” one data-center engineer quipped. 

The Economics of Constraint 

From an economic perspective, what matters is not only rising demand but constrained supply. Regulations restricting oil, gas, and pipeline development keep marginal electricity generation expensive. Permitting delays for transmission lines slow the build-out of new capacity. At the same time, federal subsidies distort investment toward intermittent sources that require backup generation — often natural gas — to stabilize the grid. 

A perfect storm of policy contradictions may be brewing: a government that wants both a carbon-neutral grid and dominance in energy-hungry AI. 

“The irony is that the very politicians demanding AI leadership are the ones making it harder to power,” said economist Stephen Moore. “You can’t have artificial intelligence without real energy.” 

In a free market, higher demand would spur rapid expansion of supply. Investors would drill, build, and innovate to capture new profit opportunities. Instead, production and permitting are politically constrained, so prices must rise until demand is choked off. That is the dynamic now visible in electricity bills — and in the Senate’s sudden search for someone to blame. 

The Global Race 

Complicating it all, to say the least, is the geopolitical dimension. China, the European Union, and the Gulf states are racing to build their own AI infrastructure. Beijing’s Ministry of Industry announced plans for 50 new “intelligent computing centers” by 2027, powered largely by coal. In the Middle East, sovereign wealth funds are backing data-center projects co-located with gas fields to guarantee cheap electricity. 

If the US restricts its own energy production, it risks ceding the field. “Energy is now the limiting reagent for AI,” venture capitalist Marc Andreessen wrote this summer. “Whichever country solves cheap, abundant power wins the century.”

That insight revives old debates about industrial policy. Should Washington subsidize domestic chip foundries and their power plants, or should it clear the regulatory thicket that deters private capital from building both? Innovation thrives on liberty, not micromanagement. 

The New Factories 

Are data centers so different from factories of the industrial age? They convert raw inputs like electricity, silicon, cooling water, and capital into valuable outputs: trained models and real-time AI services. But unlike the factories of the past, they employ few workers directly. A billion-dollar hyperscale facility may have fewer than 200 staff. That does not sit well with the communities that host them. The wealth is created upstream and downstream: in chip design, software, and the cascade of productivity gains AI enables. 

Still, the indirect productivity is vast. AI-driven logistics shave fuel costs, AI-assisted medicine accelerates diagnosis, and AI-powered coding tools raise output per worker. But all of it depends on those humming, appallingly noisy, heat-filled halls of servers. As OpenAI’s Sam Altman remarked last year, “A lot of the world gets covered in data centers over time.” 

If true, America’s next great industrial geography will not be steel towns or tech corridors, but the power corridor: regions where electricity is plentiful, cheap, and politically welcome. 

Already, states like Texas and Georgia are advertising low-cost energy as a lure for AI investment. 

Markets Versus Mandates 

From a free-market perspective, the lesson is straightforward. Economic growth follows energy freedom. When government treats energy as a controlled substance — rationed through regulation, taxed for vice, or distorted by subsidies — innovation slows. When markets are allowed to meet demand naturally, abundance results. 

In the early industrial age, the United States became the world’s workshop because it embraced abundance: of coal, oil, and later electricity. Every new machine and factory depended on those resources, and entrepreneurs supplied them without central direction. Today’s equivalent is the AI data center. Its prosperity depends on letting energy producers compete, invest, and innovate without political interference. 

Politics Ahead 

Over the next year, expect the power issue to dominate AI politics. Democrats will press for efficiency mandates and carbon targets; Republicans will frame energy freedom as essential to national strength. Federal officials are already discussing a kind of “clean AI” certification system tied to renewable sourcing — critics say that could amount to a de facto quota on computing power. 

Meanwhile, utilities are rethinking grid design for a world where data centers behave like factories that never sleep. The market is responding: small, modular nuclear reactors, advanced gas turbines, and geothermal projects are attracting venture funding as potential baseload sources for AI campuses. 

For policymakers, the challenge is to resist the urge to micromanage. As AIER’s scholarship often finds, spontaneous order, not centralized control, produces both efficiency and resilience. Allowing prices to signal scarcity and opportunity will attract the investment necessary to balance America’s energy equation.

The Freedom to Compute 

In the end, the debate over data centers and electricity bills is really about the freedom to compute. The same economic laws that governed the Industrial Revolution still apply: productivity rises when entrepreneurs can transform energy into work — whether mechanical or digital. 

Artificial intelligence may be virtual, but its foundations are unmistakably physical. To sustain the AI boom without bankrupting ratepayers, the United States must choose policies that unleash energy production rather than constrict it. 

The “cloud” will always have a power bill. The question is whether that bill becomes a burden of regulation or a dividend of freedom.