
2,200 GW of Clean Energy Is Stuck in a Queue. Data Centers Are Building 56 GW of Gas Plants Instead.

The US interconnection queue holds enough solar, wind, and battery projects to nearly double the grid's installed capacity. Only 19% will ever reach commercial operation. The average wait is five years. AI data centers need power in twelve months. That math has a predictable solution, and it runs on natural gas.

[Image: rows of high-voltage transmission towers stretching toward a distant horizon, with construction cranes building a natural gas plant in the foreground]

Two thousand two hundred gigawatts. That is how much generation and storage capacity is currently sitting in US interconnection queues, waiting for permission to connect to the grid. According to Lawrence Berkeley National Laboratory's Queued Up tracker, updated through the end of 2024, the backlog is nearly double the total installed capacity of the American electric grid. Solar, wind, and battery projects make up the overwhelming majority. A clean energy future is designed, financed, and ready to build. It simply cannot get a plug.

Meanwhile, a parallel infrastructure buildout is moving at a different speed entirely. A February 2026 analysis by Cleanview identified 46 data center projects, with a combined capacity of 56 GW, that plan to build their own power generation behind the meter. Ninety percent of those projects were announced in 2025 alone. Roughly 75% of the generation equipment identified at these sites is natural gas: turbines from GE Vernova and Siemens, reciprocating engine clusters from Caterpillar and Doosan. Equipment orders have been placed. Crews are working through the night.

These two numbers, 2,200 GW stuck and 56 GW sprinting, tell a single story. It is a mechanism that converts clean energy ambition into fossil fuel reality.

The Queue by the Numbers

The interconnection process works like this: a developer submits a request to connect a new power plant to the transmission grid. Next, the system operator conducts studies to determine what upgrades are needed to maintain reliability. Finally, the developer pays for those upgrades, signs an interconnection agreement, and begins construction. In theory, this is orderly. In practice, it is a catastrophe.

According to LBNL's data, the average time from initial interconnection request to commercial operation rose to approximately five years by 2024, up from under two years in 2008. Of all projects that entered the queue between 2000 and 2019, only 19% had reached commercial operation by the end of 2024. Most of the rest withdrew, often after interconnection costs ballooned beyond 10% of total project capital expenditure, or remain in limbo.

Metric | Value | Source
Total capacity in US interconnection queues | 2,200 GW | LBNL, end of 2024
Current installed US grid capacity | ~1,300 GW | EIA
Queue completion rate (2000-2019 cohort) | 19% | LBNL
Average request-to-operation time | ~5 years | LBNL
New capacity planned for 2026 | 86 GW | EIA, Feb 2026
Data center load growth target by 2030 | 90 GW | IIR / PowerGen 2026

The bottleneck is structural. Grid operators are understaffed. Study processes are sequential, not parallel. A single project's withdrawal can trigger restudies for every project behind it in the queue. Transmission capacity constraints mean that even approved projects face years of waiting for physical upgrades: new substations, upgraded transformers, rebuilt transmission lines. Each step requires separate permitting. RMI noted in March 2026 that FERC has initiated rulemaking to address load interconnection, but reforming load interconnection alone does nothing if the generation to serve that load cannot connect either.

The Data Center Speed Problem

AI data centers do not operate on five-year timelines. At the PowerGen International 2026 conference, Industrial Info Resources reported that the US has approximately $2.4 trillion in AI data center development underway. More than 70 projects are scoped at 1 GW or more of peak demand. For context, 1 GW of continuous load is enough to power roughly one million homes.

Annual additions of new US electricity load have risen from about 23 GW in 2023 to 42 GW today, with another 32 GW under construction. NERC projects that summer peak demand will surge by 224 GW over the next decade, a 69% increase from its prior 10-year projection. IIR forecasts 90 GW of data center demand alone by 2030.

Time is the critical variable. Hyperscalers want their facilities online in 12 to 24 months. On the grid interconnection side, the queue offers a five-year average with no guarantee of completion. That three-to-four-year gap is not an engineering problem. It is a bureaucratic one. And it has a simple market solution: skip the queue entirely.

The Behind-the-Meter Workaround

Behind-the-meter generation means building your own power plant on the same property as the data center, bypassing the grid interconnection process altogether. Cleanview's analysis of 46 such projects found that this approach has gone from niche to mainstream in less than a year. It started with xAI trucking mobile generators into Memphis. It has scaled to multi-gigawatt permanent gas installations.

What distinguishes this wave from conventional backup power is intent. These are not diesel generators for blackout resilience. They are primary power plants designed to run data centers continuously. Their equipment reflects this: combined-cycle gas turbines, not emergency gensets. And the numbers are staggering: 56 GW of behind-the-meter capacity across the 46 identified projects, representing roughly 30% of all planned US data center capacity.

Equipment procurement tells the real story. Cleanview identified specific equipment deals at approximately two-thirds of tracked projects. Of the equipment that could be identified, 75% was natural gas. Press releases from these same developers often mention "all of the above" strategies with solar and storage. Purchase orders tell a different story.

The Emissions Math Nobody Is Running

Here is a calculation that deserves more attention. If 56 GW of behind-the-meter gas generation operates at a capacity factor of 0.85, which is conservative for a data center running near-continuously, annual generation would be approximately 417 TWh. Natural gas combined-cycle plants emit roughly 0.41 metric tons of CO2 per MWh. But many behind-the-meter installations use simple-cycle turbines or reciprocating engines, which are less efficient. Using a blended rate of 0.50 metric tons per MWh, the annual emissions from these facilities would be approximately 209 million metric tons of CO2.
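That arithmetic can be reproduced in a few lines. A minimal sketch in Python, using only the figures quoted above (56 GW, a 0.85 capacity factor, and a blended 0.50 t CO2/MWh rate, all of which are estimates rather than measurements):

```python
# Back-of-envelope check of the behind-the-meter emissions estimate.
# All inputs are the estimates quoted in the text, not measured values.
capacity_gw = 56             # identified behind-the-meter gas capacity
capacity_factor = 0.85       # near-continuous data center operation
hours_per_year = 8760
tons_co2_per_mwh = 0.50      # blended simple-cycle / reciprocating-engine rate

# GW-hours / 1000 -> TWh of annual generation
generation_twh = capacity_gw * capacity_factor * hours_per_year / 1000

# 1 TWh = 1e6 MWh and 1e6 t = 1 Mt, so TWh * (t/MWh) = million metric tons
annual_co2_mt = generation_twh * tons_co2_per_mwh

print(f"Annual generation: ~{generation_twh:.0f} TWh")
print(f"Annual emissions: ~{annual_co2_mt:.1f} Mt CO2")
```

The exact product is 208.5 million metric tons; the text rounds this to approximately 209.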

Sector | Annual CO2 (million metric tons)
US commercial aviation | ~180
Behind-the-meter data center gas (estimated) | ~209
US cement industry | ~70
US steel industry | ~80

If these projects proceed as planned, behind-the-meter data center gas generation would produce more CO2 annually than the entire US commercial aviation sector. For an industry that has made net-zero commitments a centerpiece of its public messaging, the disconnect between press releases and purchase orders is difficult to square.

This calculation carries important caveats: not all 56 GW will be built, some facilities will use combined-cycle turbines at better heat rates than the blended estimate, and some developers will genuinely deploy renewables alongside gas. But even at half the estimated capacity, roughly 104 million metric tons of CO2 a year would still exceed either the US steel industry or the US cement industry on its own.

The Google Model and Its Limits

Not every data center developer is choosing the pure gas route. Google's 1 GW data center in Van Buren Township, Michigan, announced in March 2026 with utility partner DTE Energy, committed to procuring 2.7 GW of new solar, storage, and demand flexibility resources. DTE filed contracts with the Michigan Public Service Commission and agreed to a contested regulatory process, unlike the expedited path taken for a separate Oracle/OpenAI facility.

The Google-DTE model demonstrates that clean energy procurement for data centers is possible. But it also demonstrates why it is rare. Google is willing to procure 2.7 GW of generation and flexibility resources for 1 GW of load, a 2.7:1 ratio that accounts for solar intermittency and storage round-trip losses. Google has the capital, the long-term planning horizon, and the reputational incentive to absorb that ratio. Most data center developers, particularly those racing to deploy AI capacity for competitive advantage, do not.

Virginia's SB 508, passed in early 2026, offers another workaround: it allows new storage and solar resources to connect using "surplus interconnection service," essentially borrowing unused rights from existing facilities. It is a creative patch. But patches do not fix a system where the fundamental throughput is an order of magnitude below the demand.

The Strongest Counterargument

The most credible defense of behind-the-meter gas is that it functions as a bridge. Data center developers argue that gas plants provide immediate power while their clean energy projects work through the interconnection queue. Google's Michigan deal includes clean resources on a longer timeline. Microsoft has signed nuclear power purchase agreements. Amazon is investing in small modular reactors. The gas plants, the argument goes, will eventually be supplemented or replaced.

This argument deserves serious engagement because it is not wrong in theory. In practice, the evidence points the other way. Combined-cycle gas turbines have useful lives of 30 to 40 years. The equipment being installed, GE Vernova aeroderivative turbines, Caterpillar reciprocating engine clusters, is not temporary infrastructure. It is not containerized or modular in the sense that it can be easily decommissioned. Once bolted down and permitted, it runs. A 75% gas ratio in identified equipment does not match a "bridge" narrative. It matches a primary generation strategy with clean energy branding layered on top.

What Would Actually Fix This

The interconnection queue is a policy failure with identifiable solutions. RMI catalogues several: cluster-based studies that evaluate groups of projects together instead of sequentially, "fast-lane" processes for projects at sites with existing grid capacity, automated study tools (the SUGAR software is already deployed by several transmission providers and has cut study times significantly), and proactive transmission planning that builds grid capacity ahead of demand rather than waiting for a developer to request it.

FERC Order 2023, finalized in 2024, reformed some aspects of the generator interconnection process, introducing penalties for speculative queue entries and shortening study timelines. But its effects are only beginning to filter through, and the order does not address the fundamental mismatch between queue capacity and load growth. As RMI noted, current reforms are "almost certainly insufficient" given projected demand increases.

Simple math: if queue completion rates remain at 19%, the 2,200 GW currently in the queue will yield approximately 418 GW of operational capacity. Data center demand alone will add at least 90 GW of load by 2030, which implies roughly 200-270 GW of generation (after accounting for renewable intermittency and storage losses). That is 48-65% of the queue's total expected yield, leaving the rest of the economy (electrified transport, heat pumps, industrial electrification) competing for the remainder.
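A minimal sketch of that arithmetic; the 200-270 GW generation range is the assumption stated above, and the other inputs are the queue figures cited earlier:

```python
# Expected yield of the current interconnection queue at historical rates.
queue_gw = 2200              # capacity in US interconnection queues (LBNL)
completion_rate = 0.19       # 2000-2019 cohort completion rate (LBNL)
expected_yield_gw = queue_gw * completion_rate

# Generation needed for projected data center load, after renewable
# intermittency and storage losses (range assumed in the text).
gen_needed_low_gw, gen_needed_high_gw = 200, 270

share_low = gen_needed_low_gw / expected_yield_gw
share_high = gen_needed_high_gw / expected_yield_gw

print(f"Expected queue yield: {expected_yield_gw:.0f} GW")
print(f"Data center share of yield: {share_low:.0%} to {share_high:.0%}")
```

This reproduces the 418 GW yield and the 48-65% share quoted above.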


What You Can Do

If you work in utility planning or transmission development, advocate for cluster-based interconnection studies and proactive transmission buildout. The sequential study model was designed for an era when a handful of generators connected per year. It cannot handle the current volume. Automated study tools exist and have demonstrated substantial time savings where deployed.

If you are a data center developer or investor, build the emissions accounting for behind-the-meter gas into your capital planning now. Today's regulatory gap that allows multi-gigawatt gas installations without major environmental review is unlikely to persist. Projects that lock in gas-only generation may face retrofitting costs or carbon pricing exposure within the operational life of the equipment.

If you live near a proposed data center, examine the power source, not just the facility footprint. "Data center" permitting debates tend to focus on land use, water consumption, and noise. Climate impact from behind-the-meter gas generation may be the largest externality, and it is the one least likely to appear in local zoning hearings.

Limitations

This analysis relies on Cleanview's identified subset of behind-the-meter projects. The actual pipeline may be larger or smaller. The emissions calculation assumes all 56 GW reaches operation at a 0.85 capacity factor with a blended emissions rate of 0.50 metric tons CO2 per MWh. Real-world operations will vary by equipment type, utilization rate, and whether developers deploy partial renewable generation alongside gas. Not all 2,200 GW in the interconnection queue is clean energy; some projects are natural gas. The 19% completion rate is historical and could improve under FERC Order 2023 reforms, though early evidence of improvement is limited.

The Bottom Line

The United States is adding 86 GW of new generating capacity in 2026, the largest single-year build in over two decades. Solar accounts for 51% of additions. Battery storage accounts for 28%. On paper, the energy transition is accelerating. In practice, 2,200 GW of clean energy projects are trapped in a queue that kills four out of five of them, while AI data centers are building 56 GW of gas plants on the other side of the fence. The interconnection queue is not a technical problem. It is not a financing problem. It is a paperwork problem, and it is converting the largest clean energy pipeline in American history into a fossil fuel subsidy. Every month the queue remains broken is another month of turbine orders, concrete pours, and 30-year gas commitments that will still be running when the queue finally clears.
