The grid just failed its first test
For the first time in its 97-year history, PJM Interconnection — the grid operator that serves 65 million people across 13 states and the District of Columbia — failed to procure enough power to meet reliability targets. The December 2025 capacity auction fell 6,625 megawatts short of what PJM needs to keep the lights on in summer 2027. The reserve margin landed at 14.8 percent against a required 20 percent. Capacity prices hit the $333.44 per megawatt-day price cap — a record high for the third consecutive auction — and the total capacity bill reached $16.4 billion, up from $2.2 billion just three years earlier. By 2028, an average family in PJM territory could see monthly electricity bills rise by roughly $70. The cause of this grid-level failure is not a mystery. Data centers account for 94 percent of projected load growth in the PJM footprint. The AI industry’s insatiable demand for electricity is no longer a future problem. It is a present crisis — one that is showing up in utility bills, grid reliability margins, and the politics of eleven state legislatures considering moratoriums on new data center construction.
The International Energy Agency projects that global data center electricity consumption will exceed 1,000 terawatt-hours by the end of 2026 — an amount equivalent to Japan’s entire annual electricity usage. There are 550 planned data center projects totaling 125 gigawatts in the global pipeline. Amazon alone added 3.9 gigawatts of power capacity in 2025 and plans to double its total capacity by the end of 2027. Meta is funding natural gas plants and transmission infrastructure in Louisiana. Google signed a 1,900-megawatt clean energy deal with Xcel Energy in Minnesota. The hyperscalers are collectively spending $700 billion on AI infrastructure this year, and a significant fraction of that spend goes directly into power procurement, grid interconnection, and generation capacity. Retail electricity prices have risen 42 percent since 2019, outpacing the 29 percent increase in the Consumer Price Index over the same period. The gap between energy cost inflation and general inflation is widening, and data centers are not the only cause — but they are the largest single driver of incremental electricity demand growth in the United States.
The collision between AI’s power appetite and the physical limits of the electrical grid is the story that connects every other AI infrastructure story on the blog this month. Amazon’s $200 billion capex depends on securing power for the data centers it is building. CoreWeave’s $35 billion Meta deal depends on having racks plugged into a grid that can serve them. NVIDIA’s Ising quantum models require GPU clusters that consume megawatts per facility. None of it works without electrons flowing through copper wire. And the copper wire is running out of capacity faster than anyone predicted.
The problem is no longer speculative. PJM's auction failure is a concrete, measurable event with concrete, measurable consequences. Summer 2027 will be the first time in PJM's history that the grid operator does not expect to have enough power to meet demand reliably. The shortfall is driven by two converging forces: new data center load that has appeared faster than generation capacity can be built, and the ongoing retirement of fossil fuel plants that have not been replaced by renewables or nuclear at equivalent scale. The AI industry did not create the generation retirement problem. But it poured gasoline on a smoldering fire by adding tens of gigawatts of new demand to a grid that was already struggling to replace what it was losing.
The $16.4 billion bill nobody agreed to pay
The mechanics of how AI’s power demand translates into household electricity bills deserve scrutiny because the costs are both enormous and structurally hidden from public debate. PJM operates a capacity market — an auction where power generators bid to provide electricity in future delivery years. The auction is supposed to ensure that enough generation capacity exists to meet peak demand with a reliability margin. When demand grows faster than generation capacity, auction prices rise. Those higher prices are passed through to retail customers via their utility bills. The December 2025 auction results — $333.44 per megawatt-day at the cap, $16.4 billion in total charges — represent the single largest cost increase in PJM’s capacity market history. The 9.3x increase in capacity payments over the prior year is being passed directly to households, businesses, and industrial users across PJM’s thirteen-state territory.
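The auction figures above can be tied together with back-of-envelope arithmetic. This is a rough sanity check, not an official calculation: the procured-capacity number below is implied by dividing the cited totals, not separately reported here.

```python
# Back-of-envelope: how a capacity clearing price becomes a total bill.
# Figures are from the December 2025 PJM auction results cited above.
clearing_price = 333.44   # $/MW-day, the auction price cap
total_bill = 16.4e9       # $, total capacity charges for the delivery year

# Annual cost of each megawatt of procured capacity
cost_per_mw_year = clearing_price * 365   # roughly $121,700 per MW-year

# Capacity procurement implied by the total bill at the cap price
implied_mw = total_bill / cost_per_mw_year
print(f"Cost per MW-year: ${cost_per_mw_year:,.0f}")
print(f"Implied procured capacity: {implied_mw:,.0f} MW")  # ~135,000 MW
```

The implied figure of roughly 135 gigawatts of procured capacity is consistent with the scale of a system serving 65 million people, which is what makes the per-megawatt-day price so consequential: every dollar of clearing price adds about $365 per megawatt to the annual bill.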
The distributional impact is regressive. Data centers are operated by the wealthiest companies on earth — Amazon, Microsoft, Google, Meta — whose combined market capitalizations exceed $10 trillion. The households bearing the cost increases have median incomes of roughly $70,000 in PJM territory. The Federal Reserve Bank of Dallas has projected that wholesale power prices could rise up to 50 percent as data center electricity demand doubles within five years. A 50 percent wholesale increase, flowing through retail rate structures, would add approximately $100 to $150 per month to a typical household bill — more than the monthly cost of most Americans’ streaming subscriptions combined. The AI industry is externalizing its infrastructure costs onto residential ratepayers who never agreed to subsidize the training of large language models and who derive uncertain benefits from doing so.
The White House attempted to address this dynamic on March 4, 2026, when Microsoft, Meta, OpenAI, and Amazon signed a Ratepayer Protection Pledge committing to avoid actions that would increase residential electricity costs. The pledge is voluntary, has no enforcement mechanism, and has not prevented any of the cost increases already flowing through PJM’s capacity market. It is the policy equivalent of a press release — good optics with no operational teeth. The structural problem is not that individual companies are behaving badly. It is that the grid was designed for a world where electricity demand grew at 1 to 2 percent per year, and it is now facing demand growth driven by a single industry that requires tens of gigawatts of continuous power in concentrated geographic areas. The grid cannot adapt at the speed the AI industry requires because building new transmission infrastructure takes five to ten years, upgrading substations takes three to five years, and permitting new generation capacity takes two to four years. The AI buildout timeline is measured in quarters. The grid timeline is measured in decades.
Current annual utility transmission investments of approximately $35 billion fall significantly short of what is needed to support doubled or tripled electricity demand over 25 years. The infrastructure gap is not a funding problem — the capital is available from both utilities and the hyperscalers themselves. It is a permitting and construction speed problem. You cannot lay a 500-kilovolt transmission line from a new natural gas plant to a new data center in eighteen months, regardless of how much money is on the table. The physical constraints of the built environment are imposing a speed limit on AI infrastructure deployment that no amount of $200 billion capex budgets can override.
Here is the original quantified insight that no single source provides: combining PJM’s $16.4 billion capacity bill with the 94 percent data center share of load growth implies that data centers are responsible for approximately $15.4 billion of the total capacity bill that households are now paying. Divided across PJM’s 65 million served population, that is roughly $237 per person per year — or $948 per four-person household — in hidden AI infrastructure costs embedded in electricity bills. This is not a subsidy that any legislature authorized, any ratepayer consented to, or any utility explicitly disclosed. It is the emergent result of a capacity market designed for stable demand growth being overwhelmed by a single industry’s exponential power consumption.
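The arithmetic behind that estimate is simple enough to check directly, using only the figures cited above (the four-person household number is just the per-capita figure times four):

```python
# Verify the per-capita estimate: 94% of the $16.4B capacity bill,
# spread across PJM's 65 million served population.
capacity_bill = 16.4e9        # $, December 2025 total capacity charges
dc_load_growth_share = 0.94   # data centers' share of projected load growth
population = 65e6             # people served by PJM

dc_attributed_cost = capacity_bill * dc_load_growth_share   # ~$15.4B
per_person = dc_attributed_cost / population                # ~$237/year
per_household = per_person * 4                              # ~$948/year

print(f"Attributed to data centers: ${dc_attributed_cost / 1e9:.1f}B")
print(f"Per person per year: ${per_person:.0f}")
print(f"Per four-person household: ${per_household:.0f}")
```

Note the simplifying assumption baked in: this applies the load-growth share uniformly to the whole capacity bill, which is a first-order attribution rather than a formal cost-causation study.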
Eleven states said stop. One may mean it.
The political backlash has arrived faster than even skeptics predicted. At least eleven states have introduced moratorium bills targeting new data center construction in 2026, and one may become the first in the nation to enact a statewide ban. Maine’s LD 307 passed both the House and Senate on April 6-7 and would impose a moratorium on data center facilities exceeding 20 megawatts until November 2027. If signed by the governor, Maine would become the first state in U.S. history to ban data center construction outright, even temporarily. Georgia, Maryland, Michigan, Minnesota, South Carolina, Vermont, Virginia, and Wisconsin have filed similar proposals. At the federal level, Senator Bernie Sanders and Representative Alexandria Ocasio-Cortez introduced the AI Data Center Moratorium Act, which would pause data center development nationally. More than 30 states have proposed or implemented additional charges for large-load customers. “No data centers” signs have appeared in communities from Valley View Estates, Pennsylvania to rural Virginia.
The moratorium movement represents a genuine threat to the AI industry’s growth trajectory. The hyperscalers’ capex plans assume they can build where the power exists. If the states where power exists refuse to permit construction, the entire infrastructure buildout faces delays measured in years, not months. Virginia — the world’s largest data center market — has already seen legislative proposals to slow growth. Pennsylvania, where PJM is headquartered and where the grid shortfall is most acute, faces the same political pressure. The industry’s response has been to pursue “behind-the-meter” generation — building private power plants on-site at data centers rather than drawing from the shared grid — but this approach has its own constraints. Behind-the-meter natural gas plants face environmental opposition, and the clash between on-grid and off-grid power strategies is fracturing the industry’s relationship with utilities that previously welcomed data center load growth as revenue.
The counterargument from the AI industry is that moratoriums will push data center construction to less regulated jurisdictions — potentially overseas — without reducing global electricity demand. ITIF, the technology-friendly think tank, published a rebuttal in April arguing that data centers will not overwhelm the grid, citing efficiency improvements, demand management, and the fact that only a third of planned capacity is actually being built. The efficiency argument has historical support: data center power usage effectiveness has improved from 2.0 in 2010 to roughly 1.2 in 2026, meaning each watt of compute now requires far less cooling and overhead power. But efficiency gains cannot outpace the exponential growth in total compute demand. Training a single frontier model in 2026 requires orders of magnitude more compute than training one in 2023, and the number of models being trained has multiplied alongside the compute per model. Efficiency buys time. It does not solve the fundamental mismatch between AI’s power curve and the grid’s capacity curve.
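Power usage effectiveness is simply total facility power divided by IT (compute) power, so the efficiency improvement cited above can be expressed directly. A minimal illustration:

```python
def facility_power(it_power_mw: float, pue: float) -> float:
    """Total facility draw: IT load times PUE (PUE = total power / IT power)."""
    return it_power_mw * pue

# A 100 MW IT load at 2010-era vs. modern PUE:
old = facility_power(100, 2.0)  # 200 MW: one watt of overhead per watt of compute
new = facility_power(100, 1.2)  # 120 MW: 0.2 watts of overhead per watt of compute
print(f"Overhead eliminated: {old - new:.0f} MW per 100 MW of IT load")
```

The catch is visible in the formula: PUE has a hard floor of 1.0, so cooling and overhead gains are bounded, while the IT load itself can keep growing without limit. That is why efficiency buys time rather than solving the mismatch.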
The deeper structural question is whether the AI industry can decouple its growth from grid dependency before the political constraints become binding. Nuclear — both traditional and next-generation small modular reactors — is the AI industry’s favorite long-term answer. Amazon, Google, and Microsoft have all announced nuclear power procurement strategies. But the United States has completed only two new reactors in the past three decades (Vogtle Units 3 and 4, both years late and billions over budget), and the permitting timeline for SMRs stretches to the early 2030s. The AI industry needs power now, and the only sources available now are natural gas and grid electricity — both of which carry political and environmental costs that are increasingly unacceptable to the communities where data centers are being built. The irony is palpable: the companies that market themselves as leaders in clean energy procurement and sustainability reporting are simultaneously driving the largest increase in natural gas generation demand in a decade because no other power source can deliver the megawatts they need on the timeline they require. The sustainability reports and the capacity auction results tell irreconcilable stories, and the communities hosting these facilities know which one to believe.
The data center moratorium bills represent a political judgment that the costs of AI infrastructure growth are being borne disproportionately by communities that receive little of the benefit. A data center employs 30 to 50 permanent workers. It consumes as much electricity as a small city. It generates tax revenue that varies widely depending on state and local incentive structures — some jurisdictions have offered ten-year tax abatements that leave communities absorbing infrastructure costs while receiving minimal fiscal benefit. The moratorium movement is not anti-technology. It is anti-subsidy: communities are rejecting the implicit transfer of wealth from residential ratepayers to trillion-dollar technology companies that could afford to pay the full cost of their power consumption but choose not to.
The operator calculus when power becomes the bottleneck
The AI infrastructure buildout is entering a new phase where the binding constraint is not compute, not capital, and not talent. It is power. Every operator in the AI ecosystem — from hyperscalers to startups, from GPU cloud providers to enterprise customers — needs to internalize that the electricity supply chain is now the critical path for AI scaling. The companies that secure power access first will build the facilities that serve the next decade’s AI workloads. The companies that assume power will be available on demand will discover, as PJM’s auction results demonstrate, that demand has outrun supply.
The implications cascade through every layer of the stack. Cloud providers pricing AI inference at competitive rates must now factor in electricity costs that are rising 10 to 15 percent annually in the highest-demand regions. Startups choosing data center locations must evaluate not just lease costs and network latency but grid capacity margins and regulatory risk in each jurisdiction. Enterprise customers evaluating multi-year cloud commitments should ask their providers where their power comes from and whether their capacity is contracted or spot. The era of abundant, cheap, invisible electricity for cloud computing is ending, and the costs that replace it will be visible in every cloud bill, every inference API call, and every household utility statement.
For operators navigating this power-constrained landscape, the framework is clear:
- Map your power chain. Every AI workload runs on electricity. Know where your cloud provider’s data centers are located, which grid operator serves them, and whether the region faces capacity shortfalls. PJM’s 6,625 MW deficit is the most acute, but ERCOT (Texas), CAISO (California), and MISO (Midwest) all face their own constraints.
- Evaluate moratorium risk in your deployment regions. Eleven states have introduced data center moratorium bills. If your AI infrastructure depends on building or expanding in any of those states, model the delay scenario. Maine’s LD 307, if signed, creates immediate precedent for other states to follow.
- Price electricity cost escalation into your unit economics. The Federal Reserve Bank of Dallas projects wholesale power prices rising up to 50 percent as data center demand doubles. If your AI product’s gross margin assumes stable electricity costs, recalculate now. The $200 billion capex cycle is not just about hardware procurement — it is about power procurement, and power is getting more expensive every quarter.
- Investigate behind-the-meter and on-site generation options. The hyperscalers are already building private natural gas plants and procuring nuclear power purchase agreements. Smaller operators should evaluate co-generation, battery storage, and renewable energy contracts that reduce grid dependency and insulate against capacity market price spikes.
- Engage with state and local energy policy. The data center moratorium movement is driven by legitimate ratepayer concerns. AI companies that engage constructively with communities — offering rate protections, infrastructure contributions, and transparent energy commitments — will face less political resistance than companies that announce billion-dollar data centers without addressing the $70-per-month bill increase their neighbors will absorb.
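The third point in the framework, repricing unit economics under electricity cost escalation, can be sketched as a toy sensitivity model. Every parameter below is a hypothetical illustration, not a figure from any provider; the point is the shape of the calculation, not the specific numbers.

```python
# Toy sensitivity model: how a wholesale power price increase flows into
# per-GPU-hour serving cost. All parameters are hypothetical illustrations.
def gpu_hour_cost(power_kw: float, pue: float, price_per_kwh: float,
                  amortized_hw_cost: float) -> float:
    """Cost of one GPU-hour: amortized hardware plus electricity at facility PUE."""
    return amortized_hw_cost + power_kw * pue * price_per_kwh

baseline = gpu_hour_cost(power_kw=0.7, pue=1.2, price_per_kwh=0.08,
                         amortized_hw_cost=1.50)
stressed = gpu_hour_cost(power_kw=0.7, pue=1.2, price_per_kwh=0.12,  # +50% power
                         amortized_hw_cost=1.50)
print(f"Baseline: ${baseline:.4f}/GPU-hr, stressed: ${stressed:.4f}/GPU-hr")
print(f"Total cost increase: {(stressed / baseline - 1) * 100:.1f}%")
```

How material the result is depends entirely on electricity’s share of total cost: for hardware-dominated cost structures the hit is small, while for power-dominated inference fleets running depreciated hardware the same 50 percent price move lands much harder. The exercise is worth running with real numbers before signing a multi-year commitment.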
The AI power crisis is not an externality that can be hand-waved away in shareholder letters or an obstacle that can be solved with money alone — not even $200 billion of it. It is a structural constraint that will shape where AI infrastructure gets built, how quickly it scales, and who pays the costs of the buildout. PJM’s first-ever failure to procure enough power is not an anomaly. It is a signal — the first in what will become a recurring pattern as AI demand collides with a grid built for a lower-energy world. The hyperscalers that write $200 billion capex checks and the households that pay $70 more per month for electricity are connected by the same copper wire. The question of who bears the cost of AI’s power appetite is no longer theoretical. It is arriving in mailboxes, on utility bills, across thirteen states and counting. And the answer — that ordinary households subsidize the infrastructure buildout of the wealthiest industry in human history — is not one that any democracy will accept indefinitely.
In other news
Maine poised to become first state to enact data center moratorium — Maine’s LD 307 passed both legislative chambers on April 6-7, banning data center facilities over 20 MW until November 2027. If signed by the governor, Maine would be the first state in U.S. history to impose a statewide data center construction moratorium, with at least four other states poised to follow.
Sanders and Ocasio-Cortez introduce federal AI Data Center Moratorium Act — The federal bill would pause data center development nationally, citing ratepayer impacts and environmental concerns. The proposal faces long odds in the current Congress but signals that the energy politics of AI infrastructure are reaching the federal level.
Apptronik raises $520 million to beat Tesla Optimus to market — Humanoid robot startup Apptronik closed a $520 million round at a $5 billion valuation, aiming to commercialize its Apollo robot before Tesla’s Optimus and Chinese competitors. Q1 2026 saw robotics startups secure over $2.26 billion in funding, with 70 percent flowing to warehouse and industrial automation.
Figure AI reportedly in talks for $1.5 billion at $39.5 billion valuation — Humanoid robotics company Figure AI is seeking $1.5 billion in new funding that would value the company at $39.5 billion, as it pushes its Figure 03 toward pilot deployments in commercial sites. The humanoid sector is projected to draw $20 billion or more in funding this year.