Energy

One Weird Trick for Getting More Data Centers on the Grid

Just turn them off sometimes, according to new research from Duke University.

Wires and power lines.
Heatmap Illustration/Getty Images

Grid planners have entered a new reality. After years of stagnant growth, utilities are forecasting accelerating electricity demand from artificial intelligence and other energy-intensive industries and using it to justify building out more natural gas power plants and keeping old coal plants online. The new administration has declared that the United States is in an “energy emergency,” bemoaning that the country’s generating capacity is “far too inadequate to meet our Nation’s needs.” Or, as President Trump put it at the Republican National Convention, “AI needs tremendous — literally, twice the electricity that’s available now in our country, can you imagine?”

The same logic also works the other way — the projected needs of data centers and manufacturing landed some power producers among the best performing stocks of 2024. And when it looked like artificial intelligence might not be as energy intensive as those producers assumed, thanks to the efficiency of DeepSeek’s open-source models, shares in companies that own power plants and build gas turbines crashed.

Both industry and policymakers seem convinced that the addition of new, large sources of power demand must be met with more generation and expensive investments to upgrade the grid.

But what if it doesn’t?

That’s the question Tyler Norris, Tim Profeta, Dalia Patino-Echeverri, and Adam Cowie-Haskell of the Nicholas Institute for Energy, Environment & Sustainability at Duke University tried to answer in a paper released Tuesday.

Their core finding: The United States could add 76 gigawatts of new load — about a tenth of the peak electricity demand across the whole country — without upgrading the electrical system or adding new generation. There’s just one catch: Those new loads must be “curtailed” (i.e., not powered) for up to one quarter of one percent of their maximum time online. That’s it — that’s the whole catch.

“We were very surprised,” Norris told me, referring to the amount of capacity data centers could free up by curtailing their usage during periods of high demand.

“It goes against the grain of the current paradigm,” he said, “that we have no headroom, and that we have to make massive expansion of the system to accommodate new load and generation.”

The electricity grid is built to accommodate the peak demand of the system, which often occurs during the hottest days of summer or the coldest days of winter. That means much grid infrastructure is built out solely to accommodate power demand that occurs over just a few days of the year, and even then for only part of those days. It follows that if those peaks can be shaved by reducing demand, the existing grid can accommodate much more new load.

This is the logic of longstanding “demand response” programs, whether they involve retail consumers agreeing not to adjust their thermostats outside a certain range or factories shuttering for prescribed time periods in exchange for payments from the grid authority. In very flexible markets, such as Texas’ ERCOT, some data center customers (namely cryptominers) get a substantial portion of their overall revenue by agreeing to curtail their use of electricity during times of grid stress.

While Norris cautioned that readers of the report shouldn’t think this means we won’t need any new grid capacity, he argued that the analysis “can enable more focus of limited resources on the most valuable upgrades to the system.”

Instead of focusing on expensive upgrades needed to accommodate the new demand on the grid, the Duke researchers asked what new sources of demand could do for the grid as a whole. Ask not what the grid can do for you, ask what you can do for the grid.

“By strategically timing or curtailing demand, these flexible loads can minimize their impact on peak periods,” they write. “In doing so, they help existing customers by improving the overall utilization rate — thereby lowering the per-unit cost of electricity — and reduce the likelihood that expensive new peaking plants or network expansions may be needed.” Curtailment of large loads, they argue, can make the grid more efficient by utilizing existing equipment more fully and avoiding expensive upgrades that all users might have to pay for.

They found that when new large loads are curtailed for up to 0.25% of their maximum uptime, the average curtailment lasts just over an hour and a half, adding up to 85 hours of load curtailment per year on average.
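The arithmetic behind this kind of finding can be sketched in a few lines. The sketch below is a hypothetical illustration, not the paper’s methodology: it uses a made-up synthetic hourly load profile (not PJM data), assumes installed capacity sits 2% above the annual peak, and bisects on the size of a flat new load whose annual energy may be curtailed by at most 0.25%.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic hourly system load for one year, in gigawatts (illustrative
# shape only, not real grid data): seasonal peaks, a daily cycle, noise.
hours = np.arange(8760)
load = (
    100
    + 15 * np.cos(2 * np.pi * hours / 8760 * 2)   # summer/winter peaks
    + 10 * np.sin(2 * np.pi * (hours % 24) / 24)  # daily cycle
    + rng.normal(0, 3, 8760)                      # weather noise
)

# Assumed: installed capacity sits 2% above the observed annual peak.
capacity = load.max() * 1.02

def max_new_load(load, capacity, max_curtailment_rate=0.0025):
    """Largest flat new load (GW) that fits under `capacity` when up to
    `max_curtailment_rate` of its annual energy may be curtailed."""
    lo, hi = 0.0, float(capacity)
    for _ in range(60):  # bisection on the new load's size
        mid = (lo + hi) / 2
        headroom = capacity - load          # hourly room for the new load
        served = np.clip(headroom, 0, mid)  # energy actually delivered
        curtailed = mid * len(load) - served.sum()
        if curtailed / (mid * len(load)) <= max_curtailment_rate:
            lo = mid  # feasible: try a bigger load
        else:
            hi = mid
    return lo

firm = capacity - load.max()        # addable with zero curtailment
flexible = max_new_load(load, capacity)
print(f"Addable with no curtailment: {firm:.2f} GW")
print(f"Addable at 0.25% curtailment: {flexible:.2f} GW")
```

The point of the exercise is the gap between the two numbers: because a load profile spends very few hours near its peak, tolerating a sliver of curtailed energy buys disproportionately more interconnectable load than a strict never-curtail rule allows.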

“You’re able to add incremental load [if it can] accept flexibility in [the] most stressed periods,” Norris said. “Most hours of the year we’re not that close to the maximum peaks.”

In the nation’s largest electricity market, PJM Interconnection, curtailment for that quarter of a percent of total uptime would enable the grid to bring online over 13 gigawatts of new data centers — about the capacity of 13 new, large nuclear reactors — while maintaining the generation reserves PJM’s planners target. In other words, that’s up to 13 gigawatts of new generation PJM no longer has to build, as long as that new load can be curtailed for 0.25% of its maximum uptime.

But why would data center developers agree to go offline when demand for electricity rises?

It’s not just because it could help the developers maintain their imperiled sustainability goals. It also presents an opportunity to solve the hardest problem for building out new data centers. One of the key limiting factors to getting data centers online is so-called “time to power,” i.e. how long it takes for the grid to be upgraded, either with new transmission equipment or generation, so that a data center can get up and running. According to estimates from the consulting firm McKinsey, a data center project can be developed in as little as a year and a half — but only if there’s already power available. Otherwise the timeline can run several years.

“There’s a clear value add,” Norris said. There are “very few locations to interconnect multi-hundred-megawatt or gigawatt load in near-term fashion. If they accept flexibility for [an] interim period, that allows them to get online more quickly.”

This “time to power” problem has motivated a flowering of unconventional ideas to power data centers, whether it’s large-scale deployment of on-site solar power (with some gas turbines) in the Southwest, renewables adjacent to data centers, co-located natural gas, or buying whole existing nuclear power plants.

But there may be a far simpler answer.
