
Energy

One Weird Trick to Build New Data Centers Without Breaking the Grid

Tech companies, developers, and banks are converging behind “flexible loads.”


Electricity prices are up by over 5% so far this year — more than twice the overall rate of inflation — and utilities have proposed $29 billion worth of rate hikes, up from $12 billion last year, according to the electricity policy research group PowerLines. At the same time, new data centers are sprouting up everywhere as tech giants race each other — and their Chinese rivals — to develop ever more advanced (and energy-hungry) artificial intelligence systems, with hundreds of billions of dollars of new investment still in the pipeline.

You see the problem here?

In the PJM Interconnection, America’s largest electricity market, whose 13-state territory includes Virginia’s “data center alley,” some 30 gigawatts of a projected 32 gigawatts of total load growth through 2030 are expected to come from data centers.

“The onrush of demand has created significant upward pricing pressure and has raised future resource adequacy concerns,” David Mills, the chair of PJM’s board of managers, said in a letter last week announcing the beginning of a process to look into the issues raised by large load interconnection — i.e. getting data centers on the grid without exploding costs for other users of the grid or risking blackouts.

Customers in PJM are paying the price already, as increasingly scarce capacity has translated into upward-spiraling payments to generators, which then show up on retail electricity bills. New large loads can raise costs still further by requiring grid upgrades to accommodate the increased demand for power — costs that get passed down to all ratepayers. PJM alone has announced over $10 billion in transmission upgrades, according to research by Johns Hopkins scholar Abraham Silverman. “These new costs are putting significant upward pressure on customer bills,” Silverman wrote in a report with colleagues Suzanne Glatz and Mahala Lahvis, released in June.

“There’s increasing recognition that the path we’re on right now is not long-term sustainable,” Silverman told me when we spoke this week about the report. “Costs are increasing too fast. The amount of infrastructure we need to build is too much. We need to prioritize, and we need to make this data center expansion affordable for consumers. Right now it’s simply not. You can’t have multi-billion-dollar rate increases year over year.”

While it’s not clear precisely what role existing data center construction has played in electricity bill increases on a nationwide scale, rising electricity rates will likely become a political problem wherever and whenever they do hit, with data centers being the most visible manifestation of the pressures on the grid.

Charles Hua, the founder and executive director of PowerLines, called data centers “arguably the most important topic in energy,” but cautioned that outside of specific demonstrable instances (e.g. in PJM), linking them to utility rate increases can be “a very oversimplified narrative.” The business model for vertically integrated utilities can incentivize them to over-invest in local transmission, Hua pointed out. And even without new data center construction, the necessity of replacing and updating an aging grid would remain.

Still, the connection between large new sources of demand and higher prices is easy to draw. Electricity grids are built to accommodate peak demand, and customers’ bills combine the fixed cost of maintaining the grid for everyone with the cost of the energy itself. Higher peak demand means more grid to build and maintain, which means higher bills.

But what if data centers could use the existing transmission and generation system without adding to peak demand? That’s the promise of load flexibility.
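The arithmetic behind that promise can be sketched in a few lines. All numbers below are hypothetical, chosen only to illustrate the mechanism: fixed grid costs scale with peak demand and get spread across every megawatt-hour sold, so a new load that avoids the peak spreads those costs over more sales instead of adding to them.

```python
# Toy model (all numbers hypothetical) of average retail rates.
# Grid capacity is built for peak demand; its fixed cost is recovered
# across all energy sold.

ENERGY_COST_PER_MWH = 40          # hypothetical wholesale energy cost, $/MWh
GRID_COST_PER_MW_PEAK = 100_000   # hypothetical annualized grid cost, $/MW of peak

def avg_retail_rate(peak_mw: float, annual_mwh: float) -> float:
    """Average $/MWh rate: fixed grid costs plus energy costs, per MWh sold."""
    fixed = peak_mw * GRID_COST_PER_MW_PEAK
    energy = annual_mwh * ENERGY_COST_PER_MWH
    return (fixed + energy) / annual_mwh

# A grid with 1,000 MW of peak demand selling 5 million MWh a year.
base = avg_retail_rate(peak_mw=1_000, annual_mwh=5_000_000)

# An inflexible new load adds 100 MW at peak along with its energy use.
inflexible = avg_retail_rate(peak_mw=1_100, annual_mwh=5_500_000)

# A flexible load consumes the same energy but curtails at peak,
# so no new grid capacity is needed: fixed costs are spread over
# more sales, and the average rate falls.
flexible = avg_retail_rate(peak_mw=1_000, annual_mwh=5_500_000)

print(f"base: ${base:.2f}/MWh")
print(f"inflexible new load: ${inflexible:.2f}/MWh")
print(f"flexible new load: ${flexible:.2f}/MWh")
```

In this toy model the flexible load actually pushes the average rate down, which is the same point Norris makes later about downward pressure on rates; in reality, grid cost recovery is far messier than a single per-MW figure.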

If data centers could commit to not requiring power at times of extremely high demand, they could essentially piggyback on existing grid infrastructure. Widely cited research by Tyler Norris, Tim Profeta, Dalia Patino-Echeverri, and Adam Cowie-Haskell of Duke University demonstrated that curtailing large loads by as little as 0.5% of their annual energy consumption (177 hours of curtailment annually on average, with events typically lasting just over two hours) could allow almost 100 gigawatts of new demand to connect to the grid without requiring extensive, costly upgrades.
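Those figures fit together only because curtailment is partial: the load is dialed down during events, not shut off. A back-of-the-envelope check, assuming the 0.5% figure refers to annual energy (as in the Duke study) and using a hypothetical 500-megawatt facility:

```python
# Back-of-the-envelope check on the Duke study's headline numbers.
# The 500 MW facility size is hypothetical; the percentage and hours
# are the averages cited above.

HOURS_PER_YEAR = 8760
curtailed_hours = 177        # average hours under curtailment per year
energy_fraction = 0.005      # 0.5% of annual energy consumption curtailed

facility_mw = 500            # hypothetical large data center

annual_energy_mwh = facility_mw * HOURS_PER_YEAR
curtailed_energy_mwh = annual_energy_mwh * energy_fraction

# Share of the year spent under curtailment at all: about 2%.
hours_fraction = curtailed_hours / HOURS_PER_YEAR

# Implied average depth of curtailment during those hours. Because the
# load is only partially reduced, 177 hours of events add up to just
# 0.5% of annual energy.
avg_depth = curtailed_energy_mwh / (curtailed_hours * facility_mw)

print(f"Annual energy: {annual_energy_mwh:,.0f} MWh")
print(f"Energy curtailed: {curtailed_energy_mwh:,.0f} MWh")
print(f"Time under curtailment: {hours_fraction:.1%} of the year")
print(f"Average curtailment depth during events: {avg_depth:.0%}")
```

The takeaway: a data center that ramps down by roughly a quarter for about 2% of the year’s hours gives up only half a percent of its annual energy — a small operational concession for faster interconnection.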

The groundswell behind flexibility has rapidly gained institutional credibility. Last week, Google announced that it had reached deals with two utilities, Indiana Michigan Power and the Tennessee Valley Authority, to incorporate flexibility into how their data centers run. The Indiana Michigan Power contract will “allow [Google] to reduce or shift electricity demand to carry out non-urgent tasks during hours when the electric grid is under less stress,” the utility said.

Google has long been an innovator in energy procurement — it famously pioneered the power purchase agreement structure that has helped finance many a renewable energy development — and already has its fingers in many pots when it comes to grid flexibility. The company’s chief scientist, Jeff Dean, is an investor in Emerald AI, a software company that promises to help data centers work flexibly, while its urbanism-focused spinout Sidewalk Infrastructure Partners has backed Verrus, a demand-flexible data center developer.

Hyperscale developers aren’t the only big fish excited about data center flexibility. Financiers are, as well.

Goldman Sachs released a splashy report this week that cited Norris extensively (plus Heatmap). Data center flexibility promises to be a win-win-win, according to Goldman (which, of course, would love to finance an AI boom unhindered by higher retail electricity rates or long interconnection queues for new generation). “What if, thanks to curtailment, instead of overwhelming the grid, AI data centers became the shock absorbers that finally unlocked this stranded capacity?” the report asks.

The holy grail of flexibility for developers is not just saving money on electricity, which is a small cost next to procuring the advanced chips needed to train and run AI models. The real win would be building new data centers faster. “Time to market is critical for AI companies,” the Goldman analysts wrote.

But creating a system where data centers can connect to the grid sooner if they promise to be flexible about power consumption would require immense institutional change for states, utilities, regulators, and power markets.

“We really don’t have existing service tiers in place for most jurisdictions that acknowledges and incentivizes flexible loads and plans around them,” Norris told me.

When I talked to Silverman, he told me that integrating flexibility into local decision-making could mean rewriting state utility regulations to allow a special pathway for data centers. It could also involve making local or state tax incentives contingent on flexibility.

Whatever the new structure looks like, the point is to “enshrine a policy that says, ‘data centers are different,’ and we are going to explicitly recognize those differences and tailor rules to data centers,” Silverman said. He pointed specifically to a piece of legislation in New Jersey that he consulted on, which would have utilities and regulators work together to come up with specific rate structures for data centers.

Norris also pointed to a proposal in the Southwest Power Pool, which runs down the spine of the country from the Dakotas to Louisiana, that would allow large loads like data centers to connect to the grid quickly “with the tradeoff of potential curtailment during periods of system stress to protect regional reliability,” the transmission organization said.

And there’s still more legal and regulatory work to be done before hyperscalers can take full advantage of those incentives, Norris told me. Utilities and their data center customers would have to come up with a rate structure that incorporates flexibility and faster interconnection, where more flexibility can allow for quicker timelines.

Speed is of the essence — not just to be able to link up more data centers, but also to avoid a political firestorm around rising electricity rates. There’s already a data center backlash brewing: The city of Tucson earlier this month rejected an Amazon facility in a unanimous city council vote, taken in front of a raucous, cheering crowd. Communities in Indiana, a popular location for data center construction, have rejected several projects.

The drama around PJM may be a test case for the rest of the country. After its 2024 capacity auction jumped to $15 billion, from just over $2 billion the year before, complaints from Pennsylvania Governor Josh Shapiro led to a price cap on future auctions. PJM’s chief executive said in April that he would resign by the end of this year. A few months later, PJM’s next capacity auction hit the price cap.

“You had every major publication writing that AI data centers are causing electricity prices to spike” after the PJM capacity auction, Norris told me. “They lost that public relations battle.”

With more flexibility, there’s a chance for data center developers to tell a more positive story about how they affect the grid.

“It’s not just about avoiding additional costs,” Norris said. “There’s this opportunity that if you can mitigate additional cost, you can put downward cost on rates.” That’s almost putting things generously — data center developers might not have a choice.

Climate Tech

Exclusive: Octopus Energy Launches Battery-Powered Electricity Plan With Lunar

The companies are offering Texas ratepayers a three-year fixed-price contract that comes with participation in a virtual power plant.


Customers get a whole lot of choice in Texas’ deregulated electricity market — which provider to go with, fixed-rate or variable-rate plan, and contract length are all variables to consider. If a customer wants a home battery as well, that’s yet another exercise in complexity, involving coordination with the utility, installers, and contractors.

On Wednesday, residential battery manufacturer and virtual power plant provider Lunar Energy and U.K.-based retail electricity provider Octopus Energy announced a partnership to simplify all this. They plan to offer Texas electricity ratepayers a single package: a three-year fixed-rate contract, a 30-kilowatt-hour battery, and automatic participation in a statewide network of distributed energy resources, better known as a virtual power plant, or VPP.

AM Briefing

Blowing the Whistle

On Trump’s renewables embargo, Project Vault, and perovskite solar


Current conditions: Illinois far outpaces every other state for tornadoes so far this year, clocking 80, with Mississippi in a distant second with 43 • Western North Carolina’s Blue Ridge Mountains face high wildfire risk during the day and frost at night • A magnitude 7.4 earthquake off the coast of Honshu, Japan, has raised the risk of a tsunami.

THE TOP FIVE

1. Whistleblowers allege big problems with corporate carbon standards-setter

The nonprofit that sets the standards against which tens of thousands of companies worldwide measure their greenhouse gas emissions is secretive and ideologically tilted toward industry. That’s the conclusion of a new whistleblower report that Heatmap’s Emily Pontecorvo got her hands on yesterday. The problems at the Greenhouse Gas Protocol “are systemic,” and the nonprofit “seems to be moving further away from its commitment to accountability,” the report said. Danny Cullenward, an economist and lawyer at the University of Pennsylvania’s Kleinman Center for Energy Policy who focuses on scientific integrity in climate science and who authored the report, sits on the Protocol’s Independent Standards Board. Because a restrictive non-disclosure agreement prevents him from discussing what he has witnessed there, he relied instead on publicly available information to illustrate the report. “Not only does the nonprofit community not have a voice on the board,” Cullenward wrote, but the absence of those voices “risks politicizing the work of scientist Board members.” Emily added: “While the Protocol’s official decision-making hierarchy deems scientific integrity as its top priority, in practice, scientists are left to defend the science to the business community.” The report follows a years-long process meant to bolster the group’s scientific credibility. “Critics have long faulted the Protocol for allowing companies to look far better on paper than they do to the atmosphere,” Emily explains. But creating standards that are both scientifically robust and feasible to implement is no easy feat.

Carbon Removal

Leading Climate Standards Group Fraught With Secrecy and Bias, Whistleblowers Say

A new report shared exclusively with Heatmap documents failures of transparency and governance at the Greenhouse Gas Protocol.


It is something of a miracle that tens of thousands of companies around the world voluntarily report their greenhouse gas emissions each year. In 2025, more than 22,100 businesses, together worth more than half the global stock market, disclosed this data. Unfortunately, it’s an open secret that many of their calculations are far off the mark.

This is not exactly their fault. To aid in the tedious process of tallying up carbon and to encourage a basic level of uniformity in how it’s done, companies rely on standards created by a nonprofit called the Greenhouse Gas Protocol. The group’s central challenge is ensuring that its standards are both credible and feasible — two qualities often in tension in greenhouse gas accounting. The method that produces the most accurate emissions inventory may not always be feasible, while the method that’s easy to implement may produce wildly inaccurate results.
