Just turn them off sometimes, according to new research from Duke University.
Grid planners have entered a new reality. After years of stagnant growth, utilities are forecasting accelerating electricity demand from artificial intelligence and other energy-intensive industries and using it to justify building out more natural gas power plants and keeping old coal plants online. The new administration has declared that the United States is in an “energy emergency,” bemoaning that the country’s generating capacity is “far too inadequate to meet our Nation’s needs.” Or, as President Trump put it at the Republican National Convention, “AI needs tremendous — literally, twice the electricity that’s available now in our country, can you imagine?”
The same logic also works the other way — the projected needs of data centers and manufacturing landed some power producers among the best performing stocks of 2024. And when it looked like artificial intelligence might not be as energy intensive as those producers assumed thanks to the efficiency of DeepSeek’s open source models, shares in companies that own power plants and build gas turbines crashed.
Both industry and policymakers seem convinced that the addition of new, large sources of power demand must be met with more generation and expensive investments to upgrade the grid.
But what if it doesn’t?
That’s the question Tyler Norris, Tim Profeta, Dalia Patino-Echeverri, and Adam Cowie-Haskell of the Nicholas Institute for Energy, Environment & Sustainability at Duke University tried to answer in a paper released Tuesday.
Their core finding: The United States could add 76 gigawatts of new load — about a tenth of peak electricity demand across the whole country — without upgrading the electrical system or adding new generation. There’s just one catch: Those new loads must be “curtailed” (i.e., not powered) for up to one quarter of one percent of their maximum time online. That’s it — that’s the whole catch.
“We were very surprised,” Norris told me, referring to the amount of capacity data centers could free up by curtailing their usage during periods of peak demand.
“It goes against the grain of the current paradigm,” he said, “that we have no headroom, and that we have to make massive expansion of the system to accommodate new load and generation.”
The electricity grid is built to accommodate the peak demand of the system, which often occurs during the hottest days of summer or the coldest days of winter. That means much grid infrastructure is built out solely to accommodate power demand that occurs over just a few days of the year, and even then for only part of those days. It follows that if demand reductions can shave those peaks, the existing grid can accommodate much more new load.
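To make that logic concrete, here is a minimal sketch of a load-duration-curve headroom calculation. Everything in it (the demand profile, the capacity figure, and the energy-based framing of the curtailment cap) is an illustrative assumption on our part, not the Duke team’s actual model or data:

```python
# Sketch: how much flat new load fits under a fixed capacity if that load
# can curtail a small share of its annual energy. All numbers invented.
import numpy as np

rng = np.random.default_rng(0)
HOURS = 8760
capacity_gw = 100.0  # assumed firm system capacity

# Synthetic hourly demand: a seasonal swing around ~70 GW plus noise.
demand_gw = 70 + 10 * np.sin(np.linspace(0, 2 * np.pi, HOURS)) ** 2
demand_gw = np.clip(demand_gw + rng.normal(0, 3, HOURS), 0, capacity_gw)

def max_flexible_load(demand, capacity, max_curtail_energy_frac):
    """Largest flat new load (GW) that fits under `capacity`, given that
    the load may shed up to the stated fraction of its maximum annual
    energy during peak hours (our energy-based reading of the cap)."""
    lo, hi = 0.0, capacity
    for _ in range(50):  # bisect on the size of the new load
        new_load = (lo + hi) / 2
        headroom = capacity - demand                       # GW free each hour
        shortfall = np.maximum(new_load - headroom, 0.0)   # GW shed each hour
        curtail_frac = shortfall.sum() / (new_load * len(demand))
        if curtail_frac <= max_curtail_energy_frac:
            lo = new_load   # fits: try a bigger load
        else:
            hi = new_load   # too big: shrink
    return lo

print(f"no curtailment allowed: {max_flexible_load(demand_gw, capacity_gw, 0.0):.1f} GW")
print(f"0.25% curtailment cap:  {max_flexible_load(demand_gw, capacity_gw, 0.0025):.1f} GW")
```

Even in this toy setup, a tiny curtailment allowance lets a much larger load squeeze under the same capacity ceiling, because the binding constraint is a handful of peak hours.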
This is the logic of longstanding “demand response” programs, whether they involve retail consumers agreeing not to adjust their thermostats outside a certain range or factories shuttering for prescribed time periods in exchange for payments from the grid authority. In very flexible markets, such as Texas’ ERCOT, some data center customers (namely cryptominers) get a substantial portion of their overall revenue by agreeing to curtail their use of electricity during times of grid stress.
While Norris cautioned that readers of the report shouldn’t think this means we won’t need any new grid capacity, he argued that the analysis “can enable more focus of limited resources on the most valuable upgrades to the system.”
Instead of focusing on expensive upgrades needed to accommodate the new demand on the grid, the Duke researchers asked what new sources of demand could do for the grid as a whole. Ask not what the grid can do for you, ask what you can do for the grid.
“By strategically timing or curtailing demand, these flexible loads can minimize their impact on peak periods,” they write. “In doing so, they help existing customers by improving the overall utilization rate — thereby lowering the per-unit cost of electricity — and reduce the likelihood that expensive new peaking plants or network expansions may be needed.” Curtailment of large loads, they argue, can make the grid more efficient by utilizing existing equipment more fully and avoiding expensive upgrades that all users might have to pay for.
They found that when new large loads are curtailed for up to 0.25% of their maximum uptime, the average stretch offline amounts to just over an hour and a half at a time, with 85 hours of load curtailment per year on average.
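A quick back-of-the-envelope check squares those two numbers. On our reading of the paper (an interpretation, not a figure from the report), the 0.25% cap applies to the load’s maximum annual energy, and curtailment events are usually partial, which is why the hours curtailed can exceed 0.25% of the 8,760 hours in a year:

```python
# Reconciling a 0.25% energy cap with 85 hours/year of curtailment.
# Assumption: the cap is on energy, and most events shed only part of the load.
hours_per_year = 8760
energy_cap = 0.0025        # 0.25% of the load's maximum annual energy
curtailed_hours = 85       # average hours/year cited in the report

full_load_hours = energy_cap * hours_per_year   # ~21.9 hours at full depth
avg_depth = full_load_hours / curtailed_hours   # ~26% of the load shed

print(f"{full_load_hours:.1f} full-load-equivalent hours per year")
print(f"average curtailment depth ≈ {avg_depth:.0%}")
```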
“You’re able to add incremental load [if you] accept flexibility in [the] most stressed periods,” Norris said. “Most hours of the year we’re not that close to the maximum peaks.”
In the nation’s largest electricity trading market, PJM Interconnection, this quarter-percent of total uptime curtailment would enable the grid to bring online over 13 gigawatts of new data centers — about the capacity of 13 new, large nuclear reactors — while maintaining PJM’s planners’ desired amount of generation capacity. In other words, that’s up to 13 gigawatts of reactors PJM no longer has to build, as long as that new load can be curtailed for 0.25% of its maximum uptime.
But why would data center developers agree to go offline when demand for electricity rises?
It’s not just because it could help the developers maintain their imperiled sustainability goals. It also presents an opportunity to solve the hardest problem for building out new data centers. One of the key limiting factors to getting data centers online is so-called “time to power,” i.e. how long it takes for the grid to be upgraded, either with new transmission equipment or generation, so that a data center can get up and running. According to estimates from the consulting firm McKinsey, a data center project can be developed in as little as a year and a half — but only if there’s already power available. Otherwise the timeline can run several years.
“There’s a clear value add,” Norris said. There are “very few locations to interconnect multi-hundred megawatt or gigawatt load in near-term fashion. If they accept flexibility for [an] interim period, that allows them to get online more quickly.”
This “time to power” problem has motivated a flowering of unconventional ideas to power data centers, whether it’s large-scale deployment of on-site solar power (with some gas turbines) in the Southwest, renewables adjacent to data centers, co-located natural gas, or buying whole existing nuclear power plants.
But there may be a far simpler answer.
A conversation with Elizabeth McCarthy of the Breakthrough Institute.
This week’s conversation is with Elizabeth McCarthy of the Breakthrough Institute. Elizabeth was one of several researchers involved in a comprehensive review of a decade of energy project litigation – between 2013 and 2022 – under the National Environmental Policy Act. Notably, the review – which Breakthrough released a few weeks ago – found that a lot of energy projects get tied up in NEPA litigation. While she and her colleagues ultimately found fossil fuels are more vulnerable to this problem than renewables, the entire sector has a common enemy: the difficulty of developing on federal lands under NEPA. So I called her up this week to chat about what this research found.
The following conversation was lightly edited for clarity.
So why are you so fixated on NEPA?
Personally and institutionally, [Breakthrough is] curious about all regulatory policy – land use, environmental regulatory policy – and we see NEPA as the thing that connects them all. If we understand how that’s functioning at a high level, we can start to pull at the strings of other players. So, we wanted to understand the barrier that touches the most projects.
What aspects of zero-carbon energy generation are most affected by NEPA?
Anything with a federal nexus that doesn’t include tax credits. Solar and wind that is on federal land is subject to a NEPA review, and anything that is linear infrastructure – transmission often has to go through multiple NEPA reviews. We don’t see a ton of transmission being litigated over on our end, but we think that is a sign NEPA is such a known obstacle that no one even wants to touch a transmission line that’ll go through 14 years of review, so there’s this unknown graveyard of transmission that wasn’t even planned.
In your report, you noted there was a relatively small number of zero-carbon energy projects in your database of NEPA cases. Are solar and wind just being developed more frequently on private land, so there are fewer of these sorts of conflicts?
Precisely. The states that are the most powered by wind or create the most wind energy are Texas and Iowa, and those are bypassing the national federal environmental review process [with private land], in addition to not having their own state requirements, so it’s easier to build projects.
What would you tell a solar or wind developer about your research?
This is confirming a lot of things they may have already instinctually known or believed to be true, which is that NEPA and filling out an environmental impact statement takes a really long time and is likely to be litigated over. If you’re a developer who can’t avoid putting your energy project on federal land, you may just want to avoid moving forward with it – the cost may outweigh whatever revenue you could get from that project because you can’t know how much money you’ll have to pour into it.
Huh. Sounds like everything is working well. I do think your work identifies a clear risk in developing on federal lands, which is baked into the marketplace now given the pause on permits for renewables on federal lands.
Yeah. And if you think about where the best places would be to put these technologies? It is on federal lands. The West has way more federal land than anywhere else in the country. Nevada is a great place to put solar — there’s a lot of sun. But we’re not going to put anything there if we can’t put anything there.
What’s the remedy?
We propose a set of policy suggestions. We think the judicial review process could be sped along or made less burdensome. Our research most obviously points to shortening the statute of limitations under the Administrative Procedure Act from six years to six months, because a great deal of the projects we reviewed made it in that time, so you’d see more cases brought in good faith as opposed to someone waiting six years to challenge it.
We also think engaging stakeholders much earlier in the process would help.
The Bureau of Land Management says it will be heavily scrutinizing transmission lines if they are expressly necessary to bring solar or wind energy to the power grid.
Since the beginning of July, I’ve been reporting out how the Trump administration has all but halted progress for solar and wind projects on federal lands through a series of orders issued by the Interior Department. But last week, I explained it was unclear whether transmission lines that connect to renewable energy projects would be subject to the permitting freeze. I also identified a major transmission line in Nevada – the north branch of NV Energy’s Greenlink project – as a crucial test case for the future of transmission siting in federal rights-of-way under Trump. Greenlink would cross a litany of federal solar leases and has been promoted as “essential to helping Nevada achieve its de-carbonization goals and increased renewable portfolio standard.”
Well, BLM has now told me Greenlink North will still proceed despite a delay made public shortly after permitting was frozen for renewables, and that the agency still expects to publish the record of decision for the line in September.
This is possible because, as BLM told me, transmission projects that bring solar and wind power to the grid will be subject to heightened scrutiny. In an exclusive statement, BLM press secretary Brian Hires told me via e-mail that a secretarial order choking out solar and wind permitting on federal lands will require “enhanced environmental review for transmission lines only when they are a part of, and necessary for, a wind or solar energy project.”
However, if a transmission project is not expressly tied to wind or solar, or is not required for those projects to be constructed, then apparently it can still get a federal green light. Greenlink, for instance, is not explicitly tied to any single generation project; it is more like a transmission highway running alongside many potential future solar projects. So a power line can get approved if it could one day connect to wind or solar, as long as the line’s purpose is not solely to serve a wind or solar project.
This is different from, say, lines explicitly tied to connecting a wind or solar project to an existing transmission network. Known as gen-tie lines, these will definitely face hardships with this federal government. This explains why, for example, BLM has yet to approve the gen-tie line that would connect the Lucky Star wind project in Wyoming to the grid.
At the same time, it appears projects may be given a wider berth if a line has other reasons for existing, like improving resilience on the existing grid, or can be flexibly used by not just renewables but also fossil energy.
So, the lesson to me is that if you’re trying to build transmission infrastructure across federal property under this administration, you might want to be a little more … vague.
Tech companies, developers, and banks are converging behind “flexible loads.”
Electricity prices are up by over 5% this year — more than twice the overall rate of inflation — while utilities have proposed $29 billion worth of rate hikes so far, compared to $12 billion last year, according to the electricity policy research group PowerLines. At the same time, new data centers are sprouting up everywhere as tech giants try to outpace each other — and their Chinese rivals — in the race to develop ever more advanced (and energy-hungry) artificial intelligence systems, with hundreds of billions of dollars of new investments still in the pipeline.
You see the problem here?
In the PJM Interconnection, America’s largest electricity market, which includes Virginia’s “data center alley” in its 13-state territory, some 30 gigawatts of a projected 32 gigawatts of load growth through 2030 are expected to come from data centers.
“The onrush of demand has created significant upward pricing pressure and has raised future resource adequacy concerns,” David Mills, the chair of PJM’s board of managers, said in a letter last week announcing the beginning of a process to look into the issues raised by large load interconnection — i.e. getting data centers on the grid without exploding costs for other users of the grid or risking blackouts.
Customers in PJM are paying the price already, as increasingly scarce capacity has translated into upward-spiraling payments to generators, which then show up on retail electricity bills. New large loads can raise costs still further by requiring grid upgrades to accommodate the increased demand for power — costs that get passed down to all ratepayers. PJM alone has announced over $10 billion in transmission upgrades, according to research by Johns Hopkins scholar Abraham Silverman. “These new costs are putting significant upward pressure on customer bills,” Silverman wrote in a report with colleagues Suzanne Glatz and Mahala Lahvis, released in June.
“There’s increasing recognition that the path we’re on right now is not long-term sustainable,” Silverman told me when we spoke this week about the report. “Costs are increasing too fast. The amount of infrastructure we need to build is too much. We need to prioritize, and we need to make this data center expansion affordable for consumers. Right now it’s simply not. You can’t have multi-billion-dollar rate increases year over year.”
While it’s not clear precisely what role existing data center construction has played in electricity bill increases on a nationwide scale, rising electricity rates will likely become a political problem wherever and whenever they do hit, with data centers being the most visible manifestation of the pressures on the grid.
Charles Hua, the founder and executive director of PowerLines, called data centers “arguably the most important topic in energy,” but cautioned that outside of specific demonstrable instances (e.g. in PJM), linking them to utility rate increases can be “a very oversimplified narrative.” The business model for vertically integrated utilities can incentivize them to over-invest in local transmission, Hua pointed out. And even without new data center construction, the necessity of replacing and updating an aging grid would remain.
Still, the connection between large new sources of demand and higher prices is pretty easy to draw: Electricity grids are built to accommodate peak demand, and the bills customers receive combine the fixed cost of maintaining the grid for everyone with the cost of the energy itself. Higher peak demand means more grid to build and maintain, and therefore higher bills.
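As a toy illustration of that cost logic (every number below is invented for the example, not drawn from any utility’s actual books): growth that raises the peak adds fixed costs that all ratepayers share, while flexible growth spreads the existing fixed costs over more kilowatt-hours.

```python
# Toy model of average retail rates: fixed grid costs spread over sales,
# plus the cost of energy. All figures are invented for illustration.

def retail_rate(fixed_cost, energy_sold_kwh, energy_cost_per_kwh):
    """Average $/kWh rate implied by fixed costs, sales, and energy cost."""
    return fixed_cost / energy_sold_kwh + energy_cost_per_kwh

fixed_grid_cost = 50e9   # $/yr to maintain a grid sized for the peak (assumed)
system_energy = 4_000e9  # kWh/yr sold across all customers (assumed)
energy_price = 0.06      # $/kWh wholesale energy cost (assumed)

base = retail_rate(fixed_grid_cost, system_energy, energy_price)
# Peaky growth: the peak rises, forcing 20% more fixed cost for 2% more sales.
peaky = retail_rate(fixed_grid_cost * 1.20, system_energy * 1.02, energy_price)
# Flexible growth: 10% more sales with no new peak, so no new fixed cost.
flexible = retail_rate(fixed_grid_cost, system_energy * 1.10, energy_price)

print(f"base rate:       {base:.4f} $/kWh")
print(f"peaky growth:    {peaky:.4f} $/kWh")    # rates rise
print(f"flexible growth: {flexible:.4f} $/kWh")  # rates fall
```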
But what if data centers could use the existing transmission and generation system without adding to the system peak? That’s the promise of load flexibility.
If data centers could commit to not requiring power at times of extremely high demand, they could essentially piggyback on existing grid infrastructure. Widely cited research by Tyler Norris, Tim Profeta, Dalia Patino-Echeverri, and Adam Cowie-Haskell of Duke University demonstrated that curtailing large loads for as little as 0.5% of their annual uptime (177 hours of curtailment annually on average, with curtailment typically lasting just over two hours) could allow almost 100 gigawatts of new demand to connect to the grid without requiring extensive, costly upgrades.
The groundswell behind flexibility has rapidly gained institutional credibility. Last week, Google announced that it had reached deals with two utilities, Indiana Michigan Power and the Tennessee Valley Authority, to incorporate flexibility into how their data centers run. The Indiana Michigan Power contract will “allow [Google] to reduce or shift electricity demand to carry out non-urgent tasks during hours when the electric grid is under less stress,” the utility said.
Google has long been an innovator in energy procurement — it famously pioneered the power purchase agreement structure that has helped finance many a renewable energy development — and already has its fingers in many pots when it comes to grid flexibility. The company’s chief scientist, Jeff Dean, is an investor in Emerald AI, a software company that promises to help data centers work flexibly, while its urbanism-focused spinout Sidewalk Infrastructure Partners has backed Verrus, a demand-flexible data center developer.
Hyperscale developers aren’t the only big fish excited about data center flexibility. Financiers are, as well.
Goldman Sachs released a splashy report this week that cited Norris extensively (plus Heatmap). Data center flexibility promises to be a win-win-win, according to Goldman (which, of course, would love to finance an AI boom unhindered by higher retail electricity rates or long interconnection queues for new generation). “What if, thanks to curtailment, instead of overwhelming the grid, AI data centers became the shock absorbers that finally unlocked this stranded capacity?” the report asks.
For developers, the holy grail of flexibility is not just saving money on electricity, which is a small cost compared to procuring the advanced chips needed to train and run AI models. The real win would be building new data centers faster. “Time to market is critical for AI companies,” the Goldman analysts wrote.
But creating a system where data centers can connect to the grid sooner if they promise to be flexible about power consumption would require immense institutional change for states, utilities, regulators, and power markets.
“We really don’t have existing service tiers in place for most jurisdictions that acknowledges and incentivizes flexible loads and plans around them,” Norris told me.
When I talked to Silverman, he told me that integrating flexibility into local decision-making could mean rewriting state utility regulations to allow a special pathway for data centers. It could also involve making local or state tax incentives contingent on flexibility.
Whatever the new structure looks like, the point is to “enshrine a policy that says, ‘data centers are different,’ and we are going to explicitly recognize those differences and tailor rules to data centers,” Silverman said. He pointed specifically to a piece of legislation in New Jersey that he consulted on, which would have utilities and regulators work together to come up with specific rate structures for data centers.
Norris also pointed to a proposal in the Southwest Power Pool, which runs down the spine of the country from the Dakotas to Louisiana, that would allow large loads like data centers to connect to the grid quickly “with the tradeoff of potential curtailment during periods of system stress to protect regional reliability,” the transmission organization said.
And there’s still more legal and regulatory work to be done before hyperscalers can take full advantage of those incentives, Norris told me. Utilities and their data center customers would have to come up with a rate structure that trades flexibility for faster interconnection, with more flexibility buying a quicker timeline.
Speed is of the essence — not just to be able to link up more data centers, but also to avoid a political firestorm around rising electricity rates. There’s already a data center backlash brewing: The city of Tucson earlier this month rejected an Amazon facility in a unanimous city council vote, taken in front of a raucous, cheering crowd. Communities in Indiana, a popular location for data center construction, have rejected several projects.
The drama around PJM may be a test case for the rest of the country. After its 2024 capacity auction came in at $15 billion, up from just over $2 billion the year before, complaints from Pennsylvania Governor Josh Shapiro led to a price cap on future auctions. PJM’s chief executive said in April that he would resign by the end of this year. A few months later, PJM’s next capacity auction hit the price cap.
“You had every major publication writing that AI data centers are causing electricity prices to spike” after the PJM capacity auction, Norris told me. “They lost that public relations battle.”
With more flexibility, there’s a chance for data center developers to tell a more positive story about how they affect the grid.
“It’s not just about avoiding additional costs,” Norris said. “There’s this opportunity that if you can mitigate additional cost, you can put downward [pressure] on rates.” That’s almost putting things generously — data center developers might not have a choice.