Just turn them off sometimes, according to new research from Duke University.

Grid planners have entered a new reality. After years of stagnant growth, utilities are forecasting accelerating electricity demand from artificial intelligence and other energy-intensive industries, and using those forecasts to justify building more natural gas power plants and keeping old coal plants online. The new administration has declared that the United States is in an “energy emergency,” bemoaning that the country’s generating capacity is “far too inadequate to meet our Nation’s needs.” Or, as President Trump put it at the Republican National Convention, “AI needs tremendous — literally, twice the electricity that’s available now in our country, can you imagine?”
The same logic also works the other way — the projected needs of data centers and manufacturing landed some power producers among the best-performing stocks of 2024. And when it looked like artificial intelligence might not be as energy intensive as those producers assumed, thanks to the efficiency of DeepSeek’s open-source models, shares in companies that own power plants and build gas turbines crashed.
Both industry and policymakers seem convinced that the addition of new, large sources of power demand must be met with more generation and expensive investments to upgrade the grid.
But what if it doesn’t?
That’s the question Tyler Norris, Tim Profeta, Dalia Patiño-Echeverri, and Adam Cowie-Haskell of the Nicholas Institute for Energy, Environment & Sustainability at Duke University tried to answer in a paper released Tuesday.
Their core finding: The United States could add 76 gigawatts of new load — about a tenth of peak electricity demand across the whole country — without upgrading the electrical system or adding new generation. There’s just one catch: Those new loads must be “curtailed” (i.e., not powered) for up to one quarter of one percent of their maximum time online. That’s it — that’s the whole catch.
“We were very surprised,” Norris told me, referring to the amount of capacity freed up if data centers could curtail their usage at times of high demand.
“It goes against the grain of the current paradigm,” he said, “that we have no headroom, and that we have to make massive expansion of the system to accommodate new load and generation.”
The electricity grid is built to accommodate the peak demand of the system, which often occurs during the hottest days of summer or the coldest days of winter. That means much grid infrastructure exists solely to serve demand that materializes on just a few days of the year, and even then for only part of those days. It follows that if demand can be reduced to shave those peaks, the existing grid can absorb much more new load.
This is the logic of longstanding “demand response” programs, whether they involve retail consumers agreeing not to adjust their thermostats outside a certain range or factories shuttering for prescribed time periods in exchange for payments from the grid authority. In very flexible markets, such as Texas’ ERCOT, some data center customers (namely cryptominers) get a substantial portion of their overall revenue by agreeing to curtail their use of electricity during times of grid stress.
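The arithmetic behind this peak-shaving logic is easy to sketch. The toy calculation below is not the Duke team’s model, which works from real utility load data; it simply illustrates the principle on a synthetic hourly demand curve, assuming a flat new load that curtails whenever the system would otherwise exceed its old peak.

```python
import numpy as np

# Toy illustration of the headroom idea (NOT the Duke paper's model):
# how much constant new load fits under the existing system peak if the
# new load is willing to curtail during a small share of hours?

rng = np.random.default_rng(0)
t = np.arange(8760)  # one year of hours

# Synthetic hourly demand with seasonal and daily cycles plus noise
# (arbitrary units; real analyses use utility-reported load data).
demand = (
    100
    + 15 * np.sin(2 * np.pi * t / 8760)   # seasonal swing
    + 10 * np.sin(2 * np.pi * t / 24)     # daily swing
    + rng.normal(0, 3, t.size)            # weather-like noise
)

peak = demand.max()

# A flat new load equal to the gap between the annual peak and the
# 99th-percentile hour pushes the system past its old peak in only
# about 1% of hours; those are the hours it would curtail.
headroom = peak - np.quantile(demand, 0.99)
curtail_hours = int(np.sum(demand + headroom > peak))

print(f"peak: {peak:.1f}  addable flexible load: {headroom:.1f} "
      f"({headroom / peak:.0%} of peak)  curtailment hours: {curtail_hours}")
```

Because demand sits well below its annual peak in the vast majority of hours, even a tiny curtailment budget buys a disproportionate amount of headroom, which is the qualitative result the Duke paper quantifies with historical utility data.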
While Norris cautioned that readers of the report shouldn’t think this means we won’t need any new grid capacity, he argued that the analysis “can enable more focus of limited resources on the most valuable upgrades to the system.”
Instead of focusing on expensive upgrades needed to accommodate the new demand on the grid, the Duke researchers asked what new sources of demand could do for the grid as a whole. Ask not what the grid can do for you, ask what you can do for the grid.
“By strategically timing or curtailing demand, these flexible loads can minimize their impact on peak periods,” they write. “In doing so, they help existing customers by improving the overall utilization rate — thereby lowering the per-unit cost of electricity — and reduce the likelihood that expensive new peaking plants or network expansions may be needed.” Curtailment of large loads, they argue, can make the grid more efficient by using existing equipment more fully and avoiding expensive upgrades that all users might have to pay for.
They found that when new large loads are curtailed for up to 0.25% of their maximum uptime, the average stretch offline lasts just over an hour and a half, with 85 hours of load curtailment per year on average.
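At first glance those figures don’t add up: 0.25% of the 8,760 hours in a year is only about 22 hours, not 85. They reconcile once you notice that curtailments can be partial throttling rather than full shutdowns. A back-of-the-envelope check, assuming (as the paper’s framing suggests) that the 0.25% is a share of the load’s maximum possible annual energy rather than a strict cap on hours offline:

```python
# Back-of-the-envelope reconciliation of the paper's headline numbers,
# assuming the 0.25% figure is a share of maximum possible annual
# energy (so partial curtailments stretch it across more clock hours).

HOURS_PER_YEAR = 8760
curtailment_rate = 0.0025   # 0.25% of maximum annual consumption
curtailed_hours = 85        # average hours/year spent curtailing
event_length = 1.6          # approx. hours per event ("just over an
                            # hour and a half")

# If the load shut down completely, the 0.25% budget would cover only
# about 22 hours per year...
full_outage_equiv = curtailment_rate * HOURS_PER_YEAR
print(f"full-outage equivalent: {full_outage_equiv:.1f} h/yr")  # ~21.9

# ...so spreading it across 85 hours implies throttling to an average
# depth of roughly a quarter of the load's maximum draw.
avg_depth = full_outage_equiv / curtailed_hours
print(f"implied average curtailment depth: {avg_depth:.0%}")    # ~26%

# And 85 hours at ~1.6 hours per event is on the order of 50 events/yr.
print(f"approx. events per year: {curtailed_hours / event_length:.0f}")
```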
“You’re able to add incremental load to accept flexibility in most stressed periods,” Norris said. “Most hours of the year we’re not that close to the maximum peaks.”
In the nation’s largest electricity market, PJM Interconnection, this quarter-of-a-percent curtailment would enable the grid to bring online over 13 gigawatts of new data centers — about the capacity of 13 new, large nuclear reactors — while maintaining the amount of generation capacity PJM’s planners say they need. In other words, that’s up to 13 gigawatts of new generating capacity that no longer has to be built, as long as the new load can be curtailed for 0.25% of its maximum uptime.
But why would data center developers agree to go offline when demand for electricity rises?
It’s not just because it could help the developers maintain their imperiled sustainability goals. It also presents an opportunity to solve the hardest problem for building out new data centers. One of the key limiting factors to getting data centers online is so-called “time to power,” i.e. how long it takes for the grid to be upgraded, either with new transmission equipment or generation, so that a data center can get up and running. According to estimates from the consulting firm McKinsey, a data center project can be developed in as little as a year and a half — but only if there’s already power available. Otherwise the timeline can run several years.
“There’s a clear value add,” Norris said. There are “very few locations to interconnect multi-hundred megawatt or gigawatt load in near-term fashion. If they accept flexibility for [the] interim period, that allows them to get online more quickly.”
This “time to power” problem has motivated a flowering of unconventional ideas to power data centers, whether it’s large-scale deployment of on-site solar power (with some gas turbines) in the Southwest, renewables adjacent to data centers, co-located natural gas, or buying whole existing nuclear power plants.
But there may be a far simpler answer.
The fourth-generation gas-cooled reactor company ZettaJoule is setting up shop at an unnamed university.
The appeal of next-generation nuclear technology is simple. Unlike the vast majority of existing reactors, which are cooled with water, so-called fourth-generation units use coolants that can withstand intense heat, such as molten salt, liquid metal, or gases like helium. That allows the machines to reach and maintain the high temperatures necessary to decarbonize industrial processes, temperatures that currently only fossil fuels can deliver.
But the execution requirements of these advanced reactors are complex, making skepticism easy to understand. While the U.S., Germany, and other countries experimented with fourth-generation reactors in earlier decades, there is only one commercial unit in operation today. That’s in China, arguably the leader in advanced nuclear, which hooked up a demonstration model of a high-temperature gas-cooled reactor to its grid two years ago, and just approved building another project in September.
Then there’s Japan, which has been operating its own high-temperature gas-cooled reactor for 27 years at a government research site in Ibaraki Prefecture, about 90 minutes north of Tokyo by train. Unlike China’s design, it’s not a commercial power reactor. Also unlike China’s design, it’s coming to America.
Heatmap has learned that ZettaJoule, an American-Japanese startup led by engineers who worked on that reactor, is now coming out of stealth and laying plans to build its first plant in Texas.
For months, the company has quietly staffed up its team of American and Japanese executives, including a former U.S. Nuclear Regulatory Commission official and a high-ranking ex-administrator from the industrial giant Mitsubishi. It’s now preparing to decamp from its initial home base in Rockville, Maryland, to the Lone Star State, where it will announce its debut project at an as-yet-unnamed university in Texas.
“We haven’t built a nuclear reactor in many, many decades, so you have only a handful of people who experienced the full cycle from design to operations,” Mitsuo Shimofuji, ZettaJoule’s chief executive, told me. “We need to complete this before they retire.”
That’s where the company sees its advantage over rivals in the race to build the West’s first commercial high-temperature gas reactor, such as Amazon-backed X-energy or Canada’s StarCore Nuclear. ZettaJoule’s chief nuclear officer, Kazuhiko Kunitomi, oversaw the construction of Japan’s research reactor in the 1990s. He’s considered Japan’s leading expert in high-temperature gas reactors.
“Our chief nuclear officer and some of our engineers are the only people in the Western world who have experience of the whole cycle from design to construction to operation of a high temperature gas reactor,” Shimofuji said.
Like X-energy’s reactor, ZettaJoule’s design is a small modular reactor. With a capacity of 30 megawatts of thermal output and 12 megawatts of electricity, the ZettaJoule reactor qualifies as a microreactor, a subcategory of SMR that includes anything producing 20 megawatts of electricity or less. Both companies’ reactors will also run on TRISO, a special kind of enriched uranium fuel with protective coatings on each particle that make it safer and more efficient at higher temperatures.
While X-energy’s debut project that Amazon is financing in Washington State is a nearly 1-gigawatt power station made up of at least a dozen of the American startup’s 80-megawatt reactors, ZettaJoule isn’t looking to generate electricity.
The first new reactor in Texas will be a research reactor, but the company’s focus is on producing heat. The reactor already operating in Japan demonstrates that the design can reach 950 degrees Celsius, roughly 25% higher than the operating temperature of China’s reactor.
The potential for use in industrial applications has begun to attract corporate partners. In a letter sent Monday to Ted Garrish, the U.S. assistant secretary of energy in charge of nuclear power — a copy of which I obtained — the U.S. subsidiary of the Saudi Arabian oil goliath Aramco urged the Trump administration to support ZettaJoule, and said that it would “consider their application to our operations” as the technology matures. ZettaJoule is in talks with at least two other multinational corporations.
The first new reactor ZettaJoule builds won’t be identical to the unit in Japan, Shimofuji said.
“We are going to modernize this reactor together with the Japanese and U.S. engineering partners,” he said. “The research reactor is robust and solid, but it’s over-engineered. What we want to do is use the safety basis but to make it more economic and competitive.”
Once ZettaJoule proves its ability to build and operate a new unit in Texas, the company will start exporting the technology back to Japan. The microreactor will be its first product line.
“But in the future, we can scale up to 20 times bigger,” Shimofuji said. “We can do 600 megawatts thermal and 300 megawatts electric.”
Another benefit ZettaJoule can tap into is the sweeping deal President Donald Trump brokered with Japanese Prime Minister Sanae Takaichi in October, which included hundreds of billions of dollars for new reactors of varying sizes, from the large-scale Westinghouse AP1000 to GE Vernova Hitachi Nuclear Energy’s 300-megawatt BWRX-300, one of the West’s leading third-generation SMRs, which uses a traditional water-cooled design.
Unlike that unit, however, ZettaJoule’s microreactor is not a first-of-a-kind technology, said Chris Gadomski, the lead nuclear analyst at the consultancy BloombergNEF.
“It’s operated in Japan for a long, long time,” he told me. “So that second-of-a-kind is an attractive feature. Some of these companies have never operated a reactor. This one has done that.”
A similar dynamic almost played out with large-scale reactors more than two decades ago. In the late 1990s, Japanese developers built four of GE and Hitachi’s ABWR reactors, large-scale units with some of the key safety features that make the AP1000 stand out compared with its first- and second-generation predecessors. In the mid-2000s, the U.S. certified the design and planned to build a pair in South Texas. But the project never materialized, and America instead put its resources into Westinghouse’s design.
But the market is different today. Electricity demand is surging in the near term from data centers and in the long term from electrification of cars and industry. The need to curb fossil fuel consumption in the face of worsening climate change is more widely accepted than ever. And China’s growing dominance over nuclear energy has rattled officials from Tokyo to Washington.
“We need to deploy this as soon as possible to not lose the experienced people in Japan and the U.S.,” Shimofuji said. “In two or three years time, we will get a construction permit ideally. We are targeting the early 2030s.”
If every company publicly holding itself to that timeline is successful, the nuclear industry will be a crowded field. But as history shows, those with the experience to actually take a reactor from paper to concrete may have an advantage.
It’s now clear that 2026 will be big for American energy, but it’s going to be incredibly tense.
Over the past 365 days, we at The Fight have closely monitored numerous conflicts over siting and permitting for renewable energy and battery storage projects. As we’ve done so, the data center boom has come into full view, igniting a tinderbox of resentment over land use, local governance and, well, lots more. The future of the U.S. economy and the energy grid may well ride on the outcomes of the very same city council and board of commissioners meetings I’ve been reporting on every day. It’s a scary yet exciting prospect.
To bring us into the new year, I wanted to try something a little different. Readers ask me all the time for advice, with questions like: What should I be thinking about right now? How do I get this community to support my project? Or my favorite: When will people finally just shut up and let us build things? To try to answer these questions and more, here are the top five trends in energy development (and data centers) I’ll be watching next year.
The best thing going for American renewable energy right now is the AI data center boom. But the backlash against developing these projects is spreading incredibly fast.
Do you remember last week when I told you about a national environmental group calling for data center moratoria across the country? On Wednesday, Senator Bernie Sanders called for a nationwide halt to data center construction until regulations are put in place. The next day, the Working Families Party – a progressive third party that fields candidates all over the country for all levels of government – called for its candidates to run in opposition to new data center construction.
On the other end of the political spectrum, major figures in the American right wing have become AI skeptics critical of the nascent data center buildout, including Florida Governor Ron DeSantis, Missouri Senator Josh Hawley, and former Trump adviser Steve Bannon. These figures are clearly following the signals amidst the noise; I have watched in recent months as anti-data center fervor has spread across Facebook, with local community pages and groups once focused on solar and wind projects pivoting instead to focus on data centers in development near them.
In other words, as I predicted just one month ago, an anti-data center political movement is forming across the country and quickly gaining steam (ironically aided by the internet and algorithms powered by server farms).
I often hear from the clean energy sector that the data center boom will be a boon for new projects. Renewable energy is the fastest to scale and construct, the thinking goes, and therefore will be the quickest, easiest, and most cost effective way to meet the projected spike in energy demand.
I’m not convinced yet that this line of thinking is correct. But I am sure that, no matter the fuel type, we can expect a lot more transmission development, and nothing sparks a land use fight more easily than new wires.
Past is prologue here. One need look no further than the years-long fight over the Piedmont Reliability Project, a proposed line that would connect a nuclear power plant in Pennsylvania to data centers in Virginia by crossing a large swath of Maryland agricultural land. I’ve been covering it closely since we put the project in our inaugural list of the most at-risk projects, and the conflict is now a clear blueprint.
In Wisconsin, a billion-dollar transmission project is proving this thesis true. I highly recommend readers pay close attention to Port Washington, where the release of fresh transmission line routes for a massive new data center this week has aided an effort to recall the city’s mayor for supporting the project. And this isn’t even an interstate project like Piedmont.
While I may not be sure of the renewable energy sector’s longer-term benefits from data center development, I’m far more confident that this Big Tech land use backlash is hitting projects right now.
The short-term issue for renewables developers is that opponents of data centers use arguments and tactics similar to those deployed by anti-solar and anti-wind advocates. Everyone fighting data centers is talking about ending development on farmland, avoiding changes to property values, stopping excess noise and water use, and halting irreparable changes to their ways of life.
Only one factor distinguishes data center fights from renewable energy fights: building the former potentially raises energy bills, while the latter will lower energy costs.
I do fear that as data center fights intensify nationwide, communities will not ban or hyper-regulate the server farms in particular, but rather will pass general bans that also block the energy projects that could potentially power them. Rural counties are already enacting moratoria on solar and wind in tandem with data centers – this is not new. But the problem will worsen as conflicts spread, and it will be incumbent upon the myriad environmentalists boosting data center opponents to not accidentally aid those fighting zero-carbon energy.
This week, the Bureau of Land Management approved its first solar project in months: the Libra facility in Nevada. When this happened, I received a flood of enthusiastic and optimistic emails and texts from sources.
We do not yet know whether the Libra approval is a signal of a thaw inside the Trump administration. The Interior Department’s freeze on renewables permitting decisions continues mostly unabated, and I have seen nothing to indicate that more decisions like this are coming down the pike. What we do know is that ahead of a difficult midterm election, the Trump administration faces outsized pressure to do more to address “affordability,” Democrats plan to go after Republicans for effectively repealing the Inflation Reduction Act and halting permits for solar and wind projects, and there’s a grand bargain to be made in Congress over permitting reform that rides on an end to the permitting freeze.
I anticipate that ahead of the election and further permitting talks in Congress, the Trump administration will mildly ease its chokehold on solar and wind permits because that is the most logical option in front of them. I do not think this will change the circumstances for more than a small handful of projects sited on federal lands that were already deep in the permitting process when Trump took power.
It’s impossible to conclude a conversation about next year’s project fights without ending on the theme that defined 2025: battery fire fears are ablaze, and they’ll only intensify as data centers demand ever more energy storage capacity.
The Moss Landing fire in January was a defining moment for an energy sector struggling to grapple with the effects of the internet age. Despite bearing little resemblance to the litany of battery energy storage system (BESS) proposals across the country, that one hunk of burning battery wreckage in California inspired countless communities nationwide to ban new battery storage outright.
There is no sign this trend will end any time soon. I expect data centers to only accelerate these concerns, as these facilities can also catch fire in ways that are challenging to address.
Plus a resolution for Vineyard Wind and more of the week’s big renewables fights.
1. Hopkins County, Texas – A Dallas-area data center fight pitting the power company Vistra against Texas Attorney General Ken Paxton has exploded into a full-blown political controversy, as Vistra now argues the project’s developer had an improper romance with a city official in the host community.
2. La Plata County, Colorado – This county has just voted to extend its moratorium on battery energy storage facilities over fire fears.
3. Dane County, Wisconsin – The city of Madison appears poised to ban data centers for at least a year.
4. Goodhue County, Minnesota – The Minnesota Center for Environmental Advocacy, a large environmentalist organization in the state, is suing to block a data center project in the small city of Pine Island.
5. Hall County, Georgia – A data center has been stopped down South, at least for now.
6. Dukes County, Massachusetts – The fight between Vineyard Wind and the town of Nantucket seems to be over.