Tech companies, developers, and banks are converging behind “flexible loads.”

Electricity prices are up by over 5% so far this year — more than twice the overall rate of inflation — while utilities have proposed $29 billion worth of rate hikes, up from $12 billion last year, according to electricity policy research group PowerLines. At the same time, new data centers are sprouting up everywhere as tech giants try to outpace each other — and their Chinese rivals — in the race to develop ever more advanced (and energy-hungry) artificial intelligence systems, with hundreds of billions of dollars of new investment still in the pipeline.
You see the problem here?
In the PJM Interconnection — America’s largest electricity market, whose 13-state territory includes Virginia’s “data center alley” — some 30 gigawatts of a projected 32 gigawatts of total load growth through 2030 are expected to come from data centers.
“The onrush of demand has created significant upward pricing pressure and has raised future resource adequacy concerns,” David Mills, the chair of PJM’s board of managers, said in a letter last week announcing the beginning of a process to look into the issues raised by large load interconnection — i.e. getting data centers on the grid without exploding costs for other users of the grid or risking blackouts.
Customers in PJM are paying the price already, as increasingly scarce capacity has translated into upward-spiraling payments to generators, which then show up on retail electricity bills. New large loads can raise costs still further by requiring grid upgrades to accommodate the increased demand for power — costs that get passed down to all ratepayers. PJM alone has announced over $10 billion in transmission upgrades, according to research by Johns Hopkins scholar Abraham Silverman. “These new costs are putting significant upward pressure on customer bills,” Silverman wrote in a report with colleagues Suzanne Glatz and Mahala Lahvis, released in June.
“There’s increasing recognition that the path we’re on right now is not long-term sustainable,” Silverman told me when we spoke this week about the report. “Costs are increasing too fast. The amount of infrastructure we need to build is too much. We need to prioritize, and we need to make this data center expansion affordable for consumers. Right now it’s simply not. You can’t have multi-billion-dollar rate increases year over year.”
While it’s not clear precisely what role existing data center construction has played in electricity bill increases on a nationwide scale, rising electricity rates will likely become a political problem wherever and whenever they do hit, with data centers being the most visible manifestation of the pressures on the grid.
Charles Hua, the founder and executive director of PowerLines, called data centers “arguably the most important topic in energy,” but cautioned that outside of specific demonstrable instances (e.g. in PJM), linking them to utility rate increases can be “a very oversimplified narrative.” The business model for vertically integrated utilities can incentivize them to over-invest in local transmission, Hua pointed out. And even without new data center construction, the necessity of replacing and updating an aging grid would remain.
Still, the connection between large new sources of demand and higher prices is easy to draw. Electricity grids are built to accommodate peak demand, and the bills customers receive combine the fixed cost of maintaining the grid for everyone with the cost of the energy itself. Higher peak demand means more grid to build and maintain — and higher bills.
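The arithmetic behind that logic can be sketched with a toy bill calculation. All the numbers below are hypothetical, chosen only to illustrate the mechanism — they are not actual utility figures:

```python
# Illustrative sketch: how a higher peak raises everyone's bill.
# A grid is sized for its peak; the fixed cost of that infrastructure
# is spread across the whole customer base. (All numbers hypothetical.)

def monthly_bill(energy_kwh, energy_rate, grid_fixed_cost, total_customers):
    """Bill = customer's own energy charge + their share of fixed grid costs."""
    return energy_kwh * energy_rate + grid_fixed_cost / total_customers

# Before a large new load arrives: grid sized for the existing peak.
before = monthly_bill(energy_kwh=900, energy_rate=0.12,
                      grid_fixed_cost=50_000_000, total_customers=1_000_000)

# After: a new data center raises the peak, requiring upgrades that
# grow the fixed cost pool by 20% -- costs passed to all ratepayers.
after = monthly_bill(energy_kwh=900, energy_rate=0.12,
                     grid_fixed_cost=60_000_000, total_customers=1_000_000)

print(f"before: ${before:.2f}, after: ${after:.2f}")
# → before: $158.00, after: $168.00
```

The customer’s own consumption never changed; only the shared fixed-cost pool did — which is exactly how a new large load can raise bills for everyone else.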
But what if data centers could use the existing transmission and generation system and not add to peak generation? That’s the promise of load flexibility.
If data centers could commit to not requiring power at times of extremely high demand, they could essentially piggyback on existing grid infrastructure. Widely cited research by Tyler Norris, Tim Profeta, Dalia Patino-Echeverri, and Adam Cowie-Haskell of Duke University demonstrated that curtailing large loads for as little as 0.5% of their annual uptime (177 hours of curtailment annually on average, with curtailment typically lasting just over two hours) could allow almost 100 gigawatts of new demand to connect to the grid without requiring extensive, costly upgrades.
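Those two figures — 0.5% and 177 hours — can be reconciled with some back-of-the-envelope arithmetic (my illustrative reading of the numbers above, not calculations from the paper itself): 177 hours is about 2% of the year, so if only 0.5% of annual energy is curtailed, loads must typically be turned down only partway during those hours.

```python
# Back-of-the-envelope reconciliation of the Duke study's figures
# (illustrative arithmetic, assuming 0.5% refers to annual energy).
HOURS_PER_YEAR = 8760
curtailed_hours = 177        # average hours per year with some curtailment
energy_curtailed = 0.005     # 0.5% of annual energy

share_of_year = curtailed_hours / HOURS_PER_YEAR
print(f"curtailment touches {share_of_year:.1%} of the year")  # → 2.0%

# If ~2% of hours see curtailment but only 0.5% of energy is lost,
# the average depth of curtailment during those hours is about 25%:
avg_depth = energy_curtailed / share_of_year
print(f"average curtailment depth: {avg_depth:.0%}")  # → 25%
```

In other words, the ask of data centers is modest: partial turn-downs during a couple hundred hours a year, not full shutdowns.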
The groundswell behind flexibility has rapidly gained institutional credibility. Last week, Google announced that it had reached deals with two utilities, Indiana Michigan Power and the Tennessee Valley Authority, to incorporate flexibility into how their data centers run. The Indiana Michigan Power contract will “allow [Google] to reduce or shift electricity demand to carry out non-urgent tasks during hours when the electric grid is under less stress,” the utility said.
Google has long been an innovator in energy procurement — it famously pioneered the power purchase agreement structure that has helped finance many a renewable energy development — and already has its fingers in many pots when it comes to grid flexibility. The company’s chief scientist, Jeff Dean, is an investor in Emerald AI, a software company that promises to help data centers work flexibly, while its urbanism-focused spinout Sidewalk Infrastructure Partners has backed Verrus, a demand-flexible data center developer.
Hyperscale developers aren’t the only big fish excited about data center flexibility. Financiers are, as well.
Goldman Sachs released a splashy report this week that cited Norris extensively (plus Heatmap). Data center flexibility promises to be a win-win-win, according to Goldman (which, of course, would love to finance an AI boom unhindered by higher retail electricity rates or long interconnection queues for new generation). “What if, thanks to curtailment, instead of overwhelming the grid, AI data centers became the shock absorbers that finally unlocked this stranded capacity?” the report asks.
The holy grail for developers and flexibility is not just saving money on electricity, which is a small cost compared to procuring advanced chips to train and run AI models. The real win would be to build new data centers faster. “Time to market is critical for AI companies,” the Goldman analysts wrote.
But creating a system where data centers can connect to the grid sooner if they promise to be flexible about power consumption would require immense institutional change for states, utilities, regulators, and power markets.
“We really don’t have existing service tiers in place for most jurisdictions that acknowledges and incentivizes flexible loads and plans around them,” Norris told me.
When I talked to Silverman, he told me that integrating flexibility into local decision-making could mean rewriting state utility regulations to allow a special pathway for data centers. It could also involve making local or state tax incentives contingent on flexibility.
Whatever the new structure looks like, the point is to “enshrine a policy that says, ‘data centers are different,’ and we are going to explicitly recognize those differences and tailor rules to data centers,” Silverman said. He pointed specifically to a piece of legislation in New Jersey that he consulted on, which would have utilities and regulators work together to come up with specific rate structures for data centers.
Norris also pointed to a proposal in the Southwest Power Pool, which runs down the spine of the country from the Dakotas to Louisiana, that would allow large loads like data centers to connect to the grid quickly “with the tradeoff of potential curtailment during periods of system stress to protect regional reliability,” the transmission organization said.
And there’s still more legal and regulatory work to be done before hyperscalers can take full advantage of those incentives, Norris told me. Utilities and their data center customers would have to come up with a rate structure that incorporates flexibility and faster interconnection, where more flexibility can allow for quicker timelines.
Speed is of the essence — not just to be able to link up more data centers, but also to avoid a political firestorm around rising electricity rates. There’s already a data center backlash brewing: The city of Tucson earlier this month rejected an Amazon facility in a unanimous city council vote, taken in front of a raucous, cheering crowd. Communities in Indiana, a popular location for data center construction, have rejected several projects.
The drama around PJM may be a test case for the rest of the country. After its 2024 capacity auction came in at $15 billion, up from just over $2 billion the year before, complaints from Pennsylvania Governor Josh Shapiro led to a price cap on future auctions. PJM’s chief executive said in April that he would resign by the end of this year. A few months later, PJM’s next capacity auction hit the price cap.
“You had every major publication writing that AI data centers are causing electricity prices to spike” after the PJM capacity auction, Norris told me. “They lost that public relations battle.”
With more flexibility, there’s a chance for data center developers to tell a more positive story about how they affect the grid.
“It’s not just about avoiding additional costs,” Norris said. “There’s this opportunity that if you can mitigate additional cost, you can put downward cost on rates.” That’s almost putting things generously — data center developers might not have a choice.
With more electric heating in the Northeast comes greater strains on the grid.
The electric grid is built for heat. The days when the system is under the most stress are typically humid summer evenings, when air conditioning is still going full blast, appliances are being turned on as commuters return home, and solar generation is fading, stretching the generation and distribution grid to its limits.
But as home heating and transportation go increasingly electric, more of the country — even some of the chilliest areas — may start to struggle with demand that peaks in the winter.
While summer demand peaks are challenging, there’s at least a vision for how to deal with them without generating excessive greenhouse gas emissions — namely battery storage, which essentially holds excess solar power generated in the afternoon in reserve for the evening. In states with lots of renewables on the grid already, like California and Texas, storage has been helping smooth out and avoid reliability issues on peak demand days.
The winter challenge is that you can have long periods of cold weather and little sun, stressing every part of the grid. The natural gas production and distribution systems can struggle in the cold with wellheads freezing up and mechanical failure at processing facilities, just as demand for home heating soars, whether provided by piped gas or electricity generated from gas-fired power plants.
In its most recent seasonal reliability assessment, the North American Electric Reliability Corporation, a standard-setting body for grid operators, found that “much of North America is again at an elevated risk of having insufficient energy supplies” should it encounter “extreme operating conditions,” i.e. “any prolonged, wide-area cold snaps.”
NERC cited growing electricity demand and the difficulty operating generators in the winter, especially those relying on natural gas. In 2021, Winter Storm Uri effectively shut down Texas’ grid for several days as generation and distribution of natural gas literally froze up while demand for electric heating soared. Millions of Texans were left exposed to extreme low temperatures, and at least 246 died as a result.
Some parts of the country already experience winter peaks in energy demand, especially places like North Carolina and Oregon, which “have winters that are chilly enough to require some heating, but not so cold that electric heating is rare,” in the words of North Carolina State University professor Jeremiah Johnson. “Not too many Mainers or Michiganders heat their homes with electricity,” he said.
But that might not be true for long.
New England may be cold and dark in the winter, but it’s liberal all year round. That means the region’s constituent states have adopted aggressive climate change and decarbonization goals that will stretch their available renewable resources, especially during the coldest days, weeks, and months.
The region’s existing energy system already struggles with winter. New England’s natural gas system is limited by insufficient pipeline capacity, so during particularly cold days, power plants end up burning oil as natural gas is diverted from generating electricity to heating homes.
New England’s Independent System Operator projects that winter demand will peak at just above 21 gigawatts this year — its all-time winter peak is 22.8 gigawatts, summer is 28.1 — which ISO-NE says the region is well-prepared for, with 31 gigawatts of available capacity. That includes energy from the Vineyard Wind offshore wind project, which is still facing activist opposition, as well as imported hydropower from Quebec.
But going forward, with Massachusetts aiming to reduce emissions 50% by 2030 (though state lawmakers are trying to undo that goal) and reach net-zero emissions by 2050 — and nearly the entire region envisioning at least 80% emissions reductions by 2050 — that winter peak is expected to soar. The non-carbon-emitting energy generation necessary to meet that demand, meanwhile, is still largely unbuilt.
By the mid-2030s, ISO-NE expects its winter peak to surpass its summer peak, with peak demand perhaps reaching as high as 57 gigawatts, more than double the system’s all-time peak load. Those last few gigawatts will be tricky — and expensive — to serve. ISO-NE estimates that each gigawatt from 51 to 57 would cost $1.5 billion for transmission expansion alone.
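A quick tally of that estimate (my arithmetic on the figures above, reading the range as six one-gigawatt increments):

```python
# Rough tally of ISO-NE's estimate for the hardest-to-serve increments of
# winter peak: each gigawatt from 51 to 57 GW costs ~$1.5B in transmission.
cost_per_gw_billions = 1.5
increment_gw = 57 - 51   # the final six gigawatts of projected peak

transmission_cost = increment_gw * cost_per_gw_billions
print(f"~${transmission_cost:.0f} billion for the final {increment_gw} GW")
# → ~$9 billion for the final 6 GW
```

That’s roughly $9 billion in transmission spending just for the last sliver of winter demand — before counting any generation or storage to actually supply it.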
ISO-NE also found that “the battery fleet may be depleted quickly and then struggle to recharge during the winter months,” which is precisely when “batteries may be needed most to fill supply gaps during periods of high demand due to cold weather, as well as periods of low production from wind and solar resources.” Some 600 megawatts of battery storage capacity has come online in the last decade in ISO-NE, and there are state mandates for at least 7 more gigawatts between 2030 and 2033.
There will also be a “continued need for fuel-secure dispatchable resources” through 2050, ISO-NE has found — that is, something to fill the role that natural gas, oil, and even coal play on the coldest days and longest cold stretches of the year.
This could mean “vast quantities of seasonal storage,” like 100-hour batteries, or alternative fuels like synthetic natural gas (produced with a combination of direct air capture and electrolysis, all powered by carbon-free power), hydrogen, biodiesel, or renewable diesel. And this is all assuming a steady buildout of renewable power — including over a gigawatt per year of offshore wind capacity added through 2050 — that will be difficult if not impossible to accomplish given the current policy and administrative roadblocks.
While planning for the transmission and generation system of 2050 may be slightly fanciful, especially as the climate policy environment — and the literal environment — are changing rapidly, grid operators in cold regions are worried about the far nearer term.
From 2027 to 2032, ISO-NE analyses “indicate an increasing energy shortfall risk profile,” said ISO-NE planning official Stephen George in a 2024 presentation.
“What keeps me up at night is the winter of 2032,” Richard Dewey, chief executive of the neighboring New York Independent System Operator, said at a 2024 conference. “I don’t know what fills that gap in the year 2032.”
The future of the American electric grid is being determined in the docket of the Federal Energy Regulatory Commission.
The Trump administration last month tasked federal energy regulators with coming up with new rules that would allow large loads — i.e., data centers — to connect to the grid faster without ballooning electricity bills. The order has set off a flurry of reactions, as the major players in the electricity system — the data center developers, the power producers, the utilities — jockey to ensure that any new rules don’t impinge upon their business models. The initial public comment period closed last week, meaning FERC must now go through hundreds of comments from industry, government, and advocacy stakeholders, each hoping to shape the rule before it’s released at the end of April.
They’ll have a lot to sift through. Opinions ranged from skeptical to cautiously supportive to fully supportive, with imperfect alignment among trade groups and individual companies.
The Utilities
When the DOE first asked FERC to get to work on a rule, several experts identified a possible conflict with utilities, namely the idea that data centers “should be responsible for 100% of the network upgrades that they are assigned through the interconnection studies.” Utilities typically like to put new transmission into their rate base, where they can earn a regulated rate of return on their investments that’s recouped from payments from all their customers. And lo, utilities were largely skeptical of the exercise.
The Edison Electric Institute, which represents investor-owned utilities, wrote in its comments to FERC that the new rule should require large load customers to pay for their share of the transmission system costs, i.e. not the full cost of network upgrades.
EEI claimed that these network costs can add up to the “tens to hundreds of millions of dollars” that should be assigned in a way that allows utilities “to earn a return of and on the entirety of the transmission network.”
In short, the utilities are defending something like the traditional model, where utilities connect all customers and spread out the costs of doing so among the entire customer base. That model has come under increasing stress thanks to the flood of data center interconnection requests, however. The high costs in some markets, like PJM, have also led some scholars and elected officials to seriously reconsider the nature of utility regulation. Still, that model has been largely good for the utilities — and they show no sign of wanting to give it up.
The Hyperscalers
The biggest technology companies, like Google, Microsoft, and Meta, and their trade groups want to make sure their ability to connect to the grid will not be impeded by new rules.
Ari Peskoe, an energy law professor at Harvard Law School, told me that existing processes for interconnection are likely working out well for the biggest data center developers and they may not be eager to rock the boat with a federal overhaul. “Presumably utilities are lining up to do deals with them because they have so much money,” Peskoe said.
In its letter to FERC, the DOE suggested that the commission could expedite interconnection of large loads “that agree to be curtailable.” That would entail large electricity users ramping down consumption while the grid is under stress, as well as co-locating projects with new sources of generation that could serve the grid as a whole. This approach has picked up steam among researchers and some data center developers, though with some cautions and caveats.
The Clean Energy Buyers Association, which represents many large technology companies, wrote in its comment that such flexibility should be “structured to enable innovation and competition through voluntary pathways rather than mandates,” echoing criticism of a proposal by the electricity market PJM Interconnection that could have forced large loads to be eligible for curtailment.
The Data Center Coalition, another big tech trade group representing many key players in the data center industry, emphasized throughout its comment that any reform to interconnection should still allow data centers to simply connect to the grid, without requiring or unduly favoring “hybrid” or co-location approaches.
“Timely, predictable, and nondiscriminatory access to interconnection service for stand-alone load is… critical… to the continued functioning of the market itself,” the Data Center Coalition wrote.
The hyperscalers themselves largely echoed this message, albeit with some differences in emphasis. They did not want any of their existing arrangements — which have allowed breakneck data center development — to be disrupted or to be forced into operating their data centers in any particular fashion.
Microsoft wrote that it was in favor of “voluntarily curtailable loads,” but cautioned that “most data centers today have limited curtailment capability,” and worried about “operational reliability risks.” In short, don’t railroad us into something our data centers aren’t really set up to do.
OpenAI wrote a short comment, likely its first ever appearance in a FERC docket, where it argued for “an optional curtailable-load pathway” that would allow for faster interconnection, echoing comments it had made in a letter to the White House.
Meta, meanwhile, argued against any binding rule at all, saying instead that FERC “should consider adopting guidance, best practices, and, if appropriate, minimum standards for large load interconnection rather than promulgating a binding, detailed rule.” After all, it’s deploying data centers gigawatts at a time and has been able to reach deals with utilities to secure power.
The Generators
Perhaps the most full-throated support for the broadest version of the DOE’s proposal came from the generators. The Electrical Power Supply Association, an independent power producer trade group, wrote that more standardized, transparent “rules of the road” are needed to allow large loads like data centers “to interconnect to the transmission system efficiently and fairly, and to be able to do so quickly.” It also called on FERC to speed up its reviews of interconnection requests.
Constellation, which operates a 32-gigawatt generation fleet with a large nuclear business, said that it “agrees with the motivations and principles outlined in the [Department of Energy’s proposal] and the need for clear rules to allow the timely interconnection of large loads and their co-location with generators.” It also called for faster implementation of large load interconnection principles in PJM, the nation’s largest electricity market, “where data center development has been stymied by disagreements and uncertainty over who controls the timing and nature of large load interconnections, and over the terms of any ensuing transmission service.” Constellation specifically called out utilities for excessive influence over PJM rulemaking and procedures.
Constellation’s stance shouldn’t be surprising, Peskoe told me. From the perspective of independent power producers, enabling data centers to quickly and directly work with regional transmission organizations and generators to come online is “generally going to be better for the generators,” Peskoe said, while utilities “want to be the gatekeeper.”
In the end, the fight over data center interconnection may not have much to do with data centers — it’s just one battle after another between generators and utilities.
The senator spoke at a Heatmap event in Washington, D.C. last week about the state of U.S. manufacturing.
At Heatmap’s event, “Onshoring the Electric Revolution,” held last week in Washington, D.C., every guest agreed: The U.S. is falling behind in the race to build the technologies of the future.
Senator Catherine Cortez Masto of Nevada, a Democrat who sits on the Senate’s energy and natural resources committee, expressed frustration with the Trump administration rolling back policies in the Inflation Reduction Act and Infrastructure Investment and Jobs Act meant to support critical minerals companies. “If we want to, in this country, lead in 21st century technology, why aren’t we starting with the extraction of the critical minerals that we need for that technology?” she asked.
At the same time, Cortez Masto also seemed hopeful that the Senate would move forward on both permitting and critical minerals legislation. “After we get back from the Thanksgiving holiday, there is going to be a number of bills that we’re looking at marking up and moving through the committee,” Cortez Masto said. That may well include the SPEED Act, a permitting bill with bipartisan support that passed the House Natural Resources Committee late last week.
Friction in the permitting of new energy and transmission projects is one of the key factors slowing down the transition to clean energy — though fossil fuel companies have their own stake in permitting reform.
Thomas Hochman, the Foundation of American Innovation’s director of infrastructure policy, talked about how legislation could protect energy projects of all stripes from executive branch interference.
“The oil and gas industry is really, really interested in seeing tech-neutral language on this front because they’re worried that the same tools that have been uncovered to block wind and solar will then come back and block oil and gas,” Hochman said.
While permitting dominated the conversation, it was not the only topic on panelists’ minds.
“There’s a lot of talk about permitting,” said Michael Tubman, the senior director of federal affairs at Lucid Motors. “It’s not just about permits. There’s a lot more to be done. And one of those important things is those mines have to have the funding available.”
Michael Bruce, a partner at the venture capital firm Emerson Collective, thinks that other government actions, such as supporting domestic demand, would help businesses in the critical minerals space.
“You need to have demand,” he said. “And if you don’t have demand, you don’t have a business.”
Like Cortez Masto, Bruce lamented the decline of U.S. mining in the face of China’s supply chain dominance.
“We do [mining] better than anyone else in the world,” said Bruce. “But we’ve got to give [mining companies] permission to return. We have a few [projects] that have been waiting for permits for upwards of 25 years.”