⚡ Energy

Your Electricity Bill Went Up $70 a Month. The Money Went to AI Data Centers.

PJM capacity prices surged 833% in one year. Data centers drove 63% of the increase. 67 million ratepayers across 13 states are absorbing $10–16 billion per year in grid costs they never agreed to. Four companies chose this path. The ratepayers did not.

[Image: Aerial view showing a sprawling data center complex adjacent to a residential neighborhood, connected by high-voltage power lines at dusk]

By Anya Volkov · Energy Systems · March 24, 2026 · ☕ 8 min read

Two hundred sixty-nine dollars and ninety-two cents per megawatt-day. That was the PJM Interconnection capacity market clearing price for the 2025–2026 delivery year, announced in July 2024. One year earlier, the same auction cleared at $28.92/MW-day. An 833% jump in 12 months, the largest single-year spike in the 27-year history of America's biggest electricity market.

PJM coordinates the grid for 67 million people across 13 states and the District of Columbia, stretching from New Jersey to Illinois. When capacity prices spike, every ratepayer in that territory feels it on their monthly bill. And according to the PJM Independent Market Monitor, data centers drove 63% of the price increase.

By 2028, the average PJM household will pay roughly $70 more per month in electricity costs than it did in 2023. Nobody asked for this. Nobody held a referendum on whether 67 million people should subsidize AI training clusters. It happened through the capacity market, a mechanism most ratepayers have never heard of.

How the Capacity Market Works (And How It Broke)

PJM runs a forward capacity auction called the Base Residual Auction three years before each delivery year. Power generators bid to guarantee availability when demand peaks. Clearing prices reflect how much new generation the grid needs relative to what exists. When supply is abundant, prices stay low. When demand outpaces supply, prices climb.
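The clearing mechanism described above can be sketched as a toy model: stack generator offers from cheapest to most expensive and clear at the marginal offer needed to cover demand. This is a simplification for intuition, not PJM's actual auction algorithm (which includes demand curves, zonal constraints, and capacity performance rules), and the offer numbers are invented.

```python
# Toy single-round capacity auction: stack offers from cheapest to most
# expensive and clear at the marginal offer that meets demand.
# Offer numbers are illustrative, not actual PJM bids.

def clearing_price(offers, demand_mw):
    """offers: list of (price_per_mw_day, capacity_mw) tuples.
    Returns the price of the marginal offer needed to cover demand_mw,
    or None if total offered capacity falls short (a shortfall)."""
    supplied = 0.0
    for price, mw in sorted(offers):
        supplied += mw
        if supplied >= demand_mw:
            return price
    return None  # not enough capacity offered to meet demand

offers = [(20, 5_000), (30, 8_000), (120, 4_000), (300, 3_000)]
print(clearing_price(offers, 12_000))  # abundant supply clears cheap: 30
print(clearing_price(offers, 16_000))  # tight supply sets a high price: 120
print(clearing_price(offers, 25_000))  # demand exceeds all offers: None
```

The key property the article relies on is visible here: small shifts in demand near the top of the supply stack produce outsized jumps in the clearing price, because the marginal unit sets the price for every megawatt cleared.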

For years, this system worked as designed. Then data centers arrived in force. Northern Virginia alone hosts 199 operating data centers with 117 more in development. In Dominion Energy's service territory, data centers went from less than 5% of total electricity demand to roughly 40% in 15 years, according to the Piedmont Environmental Council. Nationwide, data centers consumed 176 TWh in 2025, drawing 41 GW at peak, equivalent to every nuclear power plant in the United States running simultaneously.

Capacity markets responded exactly as economics predicted. More demand, same supply, higher price. By the 2026–2027 auction, the clearing price hit $329.17/MW-day, slamming into FERC's price cap. A year later, the 2027–2028 auction hit the cap again at $333.44/MW-day and recorded a 6,625 MW reliability shortfall, a first in PJM history. Not enough generators exist in the pipeline to meet projected demand. (Important distinction: the 63% attribution applies specifically to the capacity price increase, not to overall electricity bills. Capacity charges are one component of residential bills alongside energy, transmission, and distribution costs.)
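The percentage jumps quoted above follow directly from the published clearing prices. A quick check, using only figures already cited in this article:

```python
# Sanity-check the quoted auction price jumps ($/MW-day figures
# taken from the article).
def pct_change(old, new):
    return (new - old) / old * 100

print(round(pct_change(28.92, 269.92)))  # 2025-26 auction vs. prior year: 833
print(round(pct_change(28.92, 333.44)))  # 2027-28 cap vs. 2023-24 base: 1053
```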

Without FERC's price cap, PJM's own market monitor estimated clearing prices could have exceeded $500/MW-day.

Original Analysis: The Externality Math

Here is a calculation that does not appear in any Big Tech earnings call. Amazon, Google, Meta, and Microsoft committed a combined $330 billion or more to AI infrastructure capex in 2025 alone. Amazon budgeted $100 billion, up from $19 billion in 2024. Google planned $75 billion, up from $33 billion. Meta signed 6.6 GW in nuclear power deals and broke ground on a $3.2 billion natural gas plant called Hyperion in Louisiana.

NRDC estimates that cumulative capacity-price costs to PJM ratepayers will reach $100–163 billion through 2033. (Note: this is a decade-long projection based on current auction trends; actual costs depend on future auction outcomes. It covers only capacity charges, not total bill increases.) That is just one of seven major US grid operators. ERCOT, MISO, CAISO, SPP, ISO-NE, and NYISO face their own data center surges.

Annualized, that NRDC estimate works out to roughly $10–16 billion per year imposed on PJM ratepayers. Against a $330 billion annual AI infrastructure budget, the ratio of Big Tech spending to ratepayer costs is approximately 20:1 to 33:1 in dollar terms. But the distributional asymmetry is what matters: that $10–16 billion per year is spread across 67 million people who never opted in, while the $330 billion is spent by four companies that chose this path. No other industry in American history has imposed involuntary grid costs at this scale and speed.

Metric                          | 2023–2024 | 2027–2028          | Change
PJM capacity price ($/MW-day)   | $28.92    | $333.44            | +1,053%
Data center share of PJM load   | ~8%       | ~18% (est.)        | +10 pts
Avg. household monthly impact   | Baseline  | +$70/mo            | —
PJM reliability surplus         | Positive  | −6,625 MW          | First shortfall ever

Where Bills Are Climbing Now

Western Maryland ratepayers have already seen $18/month increases. Ohio residents are paying $16/month more. Virginia, ground zero for data center concentration, experienced a 13% electricity price surge above the national average, per JLARC (Joint Legislative Audit and Review Commission) analysis. JLARC projects that if data center growth continues on its current trajectory, Virginia residential bills will climb an additional $444 per year by 2040.

Retail electricity prices across the United States rose 42% between 2019 and 2025, outpacing the 29% CPI increase over the same period. Goldman Sachs projects data centers will add 0.1 percentage points to core inflation in both 2026 and 2027. That may sound trivial until you remember the Federal Reserve spent two years fighting to bring inflation down by fractions of a point.

Meta's Hyperion plant in Louisiana illustrates cost-shifting at the project level. Of that $3.2 billion facility, Louisiana ratepayers will shoulder roughly $550 million in grid infrastructure expenses, according to state utility filings. AI training power goes to Meta. Grid upgrades stay on everyone else's bill.

States Are Fighting Back

Virginia, where the backlash is most advanced, introduced SB 253 in the 2026 legislative session. SB 253 would shift distribution and capacity costs from residential ratepayers to data centers consuming 25 MW or more. Virginia's State Corporation Commission estimates the bill would produce a 3.4% residential rate reduction and a 15.8% data center rate increase. A Piedmont Environmental Council poll found 78% of Virginia voters blame data centers for rising bills.

Oregon became the first state to create a dedicated utility rate class for data centers, separating their cost allocation from other commercial and industrial users. Georgia, Maryland, Colorado, and Oklahoma have introduced parallel legislation. At least six states have moved toward data center construction moratoriums. Seven or more are reconsidering the generous tax incentives that lured data centers in the first place.

Political calculus is shifting because beneficiaries and bill-payers are different populations. Data centers create relatively few direct jobs per dollar invested. A $1 billion data center might employ 50 permanent staff. A $1 billion manufacturing plant employs thousands. When the only tangible local impact is a higher electricity bill, political tolerance evaporates quickly.

Strongest Case for Data Centers

As of 2025, data centers generated $12.5 billion in annual Virginia GDP, employed more than 45,000 workers, and contributed billions in property taxes. Capacity price spikes also reflect decades of coal plant retirements and chronic underinvestment in new generation, not exclusively data center demand. At $28.92/MW-day, the 2023–2024 clearing price was historically depressed by temporary oversupply. Some correction was coming regardless.

Companies are also building their own generation. Meta's nuclear power agreements total 6.6 GW. Google is investing in geothermal. Amazon has signed nuclear deals across multiple states. Over time, these projects will add capacity to the grid, potentially reducing prices. If AI infrastructure drives a productivity boom, long-term economic gains could outweigh electricity cost increases several times over.

This argument deserves its full weight. Capacity markets are designed to signal scarcity and attract investment. High prices are supposed to incentivize new power plants. In that narrow sense, the system is working as intended, if painfully.

What This Analysis Doesn't Show

Several limitations deserve explicit acknowledgment. NRDC's $100–163 billion estimate depends on future auction outcomes that have not occurred, and actual costs could land higher or lower. PJM covers 13 states; ERCOT, MISO, and CAISO face different market structures and dynamics that may not produce equivalent impacts. Attribution of 63% to data centers comes from economic modeling by PJM's Independent Market Monitor, not direct metering of cause and effect. Residential bill impacts vary enormously by zone within PJM, with some areas seeing far less than $70/month and others seeing more. Finally, this analysis focuses on costs imposed. Economic benefits data centers bring to host communities through jobs, taxes, and GDP are real but are not quantified against rate increases here.

The Bottom Line

America's AI boom has an electricity bill, and 67 million people are paying it whether they use AI or not. PJM capacity prices have risen more than 1,000% in four years. PJM recorded its first-ever reliability shortfall. States are responding with legislation that would fundamentally restructure how data centers pay for power. This is the first real political backlash against AI infrastructure, and it is not coming from technologists worried about alignment or academics debating automation. It is coming from families opening their electricity bills.
