If it turns out to be a bubble, billions of dollars of energy assets will be on the line.

The data center investment boom has already transformed the American economy. It is now poised to transform the American energy system.
Hyperscalers — including tech giants such as Microsoft and Meta, as well as leaders in artificial intelligence like OpenAI and CoreWeave — are investing eye-watering amounts of capital into developing new energy resources to feed their power-hungry data infrastructure. Those data centers are already straining the existing energy grid, prompting widespread political anxiety over an energy supply crisis and a ratepayer affordability shock. Nothing in recent memory has thrown policymakers’ decades-long underinvestment in the health of our energy grid into such stark relief. The commercial potential of next-generation energy technologies such as advanced nuclear, batteries, and grid-enhancing applications now hinges on the speed and scale of the AI buildout.
But what happens if the AI boom buffers and data center investment collapses? It is not idle speculation to say that the AI boom rests on unstable financial foundations. Worse, however, is the fact that as of this year, the tech sector’s breakneck investment into data centers is the only tailwind to U.S. economic growth. If there is a market correction, there is no other growth sector that could pick up the slack.
Not only would a sudden reversal in investor sentiment turn the data centers themselves into stranded assets, losing value as their lease revenue disappears; it would also threaten to strand all the energy projects and efficiency innovations that data center demand might have called forth.
If the AI boom does not deliver, we need a backup plan for energy policy.
An analysis of the capital structure of the AI boom suggests that policymakers should be more concerned about the financial fundamentals of data centers and their tenants — the tech companies that are buoying the economy. My recent report for the Center for Public Enterprise, Bubble or Nothing, maps out how the various market actors in the AI sector interact, connecting the market structure of the AI inference sector to the economics of Nvidia’s graphics processing units (the chips known as GPUs that power AI software) and to the data center real estate debt market. Spelling out the core financial relationships illuminates where the vulnerabilities lie.

First and foremost: The business model remains unprofitable. The leading AI companies ― mostly the leading tech companies, as well as some AI-specific firms such as OpenAI and Anthropic ― are all competing with each other to dominate the market for AI inference services such as large language models. None of them is returning a profit on its investments. Back-of-the-envelope math suggests that Meta, Google, Microsoft, and Amazon invested over $560 billion into AI technology and data centers through 2024 and 2025, and have reported revenues of just $35 billion.
To be sure, many new technology companies remain unprofitable for years ― including now-ubiquitous firms like Uber and Amazon. Profits are not the AI sector’s immediate goal; the sector’s high valuations reflect investors’ assumptions about future earnings potential. But while the losses pile up, the market leaders are all vying to maximize the market share of their virtually identical services ― a prisoner’s dilemma of sorts that forces down prices even as the cost of providing inference services continues to rise. Rising costs, suppressed revenues, and fuzzy measurements of real user demand are, when combined, a toxic cocktail and a reflection of the sector’s inherent uncertainty.
Second: AI companies have a capital investment problem. These are not pure software companies; to provide their inference services, AI companies must all invest in or find ways to access GPUs. In mature industries, capital assets have predictable valuations that their owners can borrow against and use as collateral to invest further in their businesses. Not here: The market value of a GPU is incredibly uncertain and, at least currently, remains suppressed due to the sector’s competitive market structure, the physical deterioration of GPUs at high utilization rates, the unclear trajectory of demand, and the value destruction that comes from Nvidia’s now-yearly release of new high-end GPU models.
The tech industry’s rush to invest in new GPUs means existing GPUs lose market value much faster. Some companies, particularly the vulnerable and debt-saddled “neocloud” companies that buy GPUs to rent their compute capacity to retail and hyperscaler consumers, are taking out tens of billions of dollars of loans to buy new GPUs backed by the value of their older GPU stock; the danger of this strategy is obvious. Others including OpenAI and xAI, having realized that GPUs are not safe to hold on one’s balance sheet, are instead renting them from Oracle and Nvidia, respectively.
To paper over the valuation uncertainty of the GPUs they do own, all the hyperscalers have changed their accounting standards for GPU valuations over the past few years to minimize their annual reported depreciation expenses. Some financial analysts don’t buy it: Last year, Barclays analysts judged GPU depreciation as risky enough to merit marking down the earnings estimates of Google (in this case its parent company, Alphabet), Microsoft, and Meta by as much as 10%, arguing that consensus modeling was severely underestimating the earnings write-offs required.
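The mechanics here are simple straight-line accounting. A minimal sketch, using a hypothetical $100 billion GPU fleet rather than any company’s reported figures, shows how stretching the assumed useful life shrinks the annual depreciation expense that flows through reported earnings:

```python
def annual_depreciation(cost: float, useful_life_years: int) -> float:
    """Straight-line depreciation: spread the asset's cost evenly over its life."""
    return cost / useful_life_years

gpu_fleet_cost = 100e9  # hypothetical $100 billion GPU fleet, not a reported figure

for life_years in (3, 5, 6):
    expense = annual_depreciation(gpu_fleet_cost, life_years)
    print(f"{life_years}-year useful life: ${expense / 1e9:.1f}B expensed per year")
```

Doubling the assumed life from three years to six halves the annual expense, which is why analysts scrutinize whether the longer schedules match how quickly GPUs actually lose value.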
Under these market dynamics, the booming demand for high-end chips looks less like a reflection of healthy growth for the tech sector and more like a scramble for high-value collateral to maintain market position among a set of firms with limited product differentiation. If high demand projections for AI technologies come true, collateral ostensibly depreciates at a manageable pace as older GPUs retain their marketable value over their useful life — but otherwise, this combination of structurally compressed profits and rapidly depreciating collateral is evidence of a snake eating its own tail.
All of these hyperscalers are tenants within data centers. Their lack of cash flow or good collateral should have their landlords worried about “tenant churn,” given the risk that many data center tenants will have to undertake multiple cycles of expensive capital expenditure on GPUs and network infrastructure within a single lease term. Data center developers take out construction (or “mini-perm”) loans of four to six years and refinance them into longer-term permanent loans, which can then be packaged into asset-backed and commercial mortgage-backed securities to sell to a wider pool of institutional investors and banks. Broken leases and tenant vacancies threaten the long-term solvency of the leading data center developers ― companies like Equinix and Digital Realty ― as well as the livelihoods of the construction contractors and electricians they hire to build their facilities and manage their energy resources.
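The timing mismatch is easy to quantify. A back-of-the-envelope sketch, using the four-to-six-year loan window above, a 15-year lease, and an assumed four-year GPU refresh cycle (the refresh interval is an assumption for illustration, not a figure from the report):

```python
lease_term_years = 15               # long data center lease, as in the deals below
miniperm_low, miniperm_high = 4, 6  # construction loan window before refinancing
gpu_refresh_years = 4               # assumed tenant GPU/network capex cycle

cycles_per_lease = lease_term_years // gpu_refresh_years
print(f"A {lease_term_years}-year lease spans roughly {cycles_per_lease} tenant capex cycles.")
print(f"The landlord must refinance after {miniperm_low}-{miniperm_high} years, "
      f"inside the tenant's first one or two refresh cycles.")
```

In other words, the lender underwriting the permanent loan is betting that a tenant with no profits today will fund several more rounds of heavy capex before the lease runs out.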
Much ink has already been spilled on how the hyperscalers are “roundabouting” each other, or engaging in circular financing: They are making billions of dollars of long-term purchase commitments, equity investments, and project co-development agreements with one another. OpenAI, Oracle, CoreWeave, and Nvidia are at the center of this web. Nvidia has invested $100 billion in OpenAI, to be repaid over time through OpenAI’s lease of Nvidia GPUs. Oracle is spending $40 billion on Nvidia GPUs to power a data center it has leased for 15 years to support OpenAI, for which OpenAI is paying Oracle $300 billion over the next five years. OpenAI is paying CoreWeave over the next five years to rent its Nvidia GPUs; the contract is valued at $11.9 billion, and OpenAI has committed to spending at least $4 billion through April 2029. OpenAI already has a $350 million equity stake in CoreWeave. Nvidia has committed to buying CoreWeave’s unsold cloud computing capacity by 2032 for $6.3 billion, after it already took a 7% stake in CoreWeave when the latter went public. If you’re feeling dizzy, count yourself lucky: These deals represent only a fraction of the available examples of circular financing.
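One way to see the circularity is to treat these deals as a directed graph of commitments and look for loops. A sketch using the dollar figures above (in billions), simplified to only the named deals:

```python
# Payer -> (payee, $B committed), from the deals described above (simplified).
commitments = {
    "Nvidia":  [("OpenAI", 100.0),     # investment repaid via OpenAI's GPU leases
                ("CoreWeave", 6.3)],   # backstop purchase of unsold cloud capacity
    "OpenAI":  [("Oracle", 300.0),     # data center lease payments over five years
                ("CoreWeave", 11.9)],  # GPU rental contract
    "Oracle":  [("Nvidia", 40.0)],     # GPU purchases for the OpenAI-leased site
}

def find_cycle(graph, start, node=None, path=None):
    """Depth-first search for a chain of commitments that loops back to start."""
    node = start if node is None else node
    path = [start] if path is None else path
    for neighbor, _amount in graph.get(node, []):
        if neighbor == start:
            return path + [start]
        if neighbor not in path:
            result = find_cycle(graph, start, neighbor, path + [neighbor])
            if result:
                return result
    return None

print(" -> ".join(find_cycle(commitments, "Nvidia")))  # Nvidia -> OpenAI -> Oracle -> Nvidia
```

Each firm’s projected revenue is another firm’s commitment; remove any node from the loop and the cash flows around it evaporate together.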
These companies are all betting on each other’s growth; their growth projections and purchase commitments are all dependent on their peers’ growth projections and purchase commitments. Optimistically, this roundabouting represents a kind of “risk mutualism,” which, at least for now, ends up supporting greater capital expenditures. Pessimistically, roundabouting is a way for these companies to pay each other for goods and services in any way except cash — shares, warrants, purchase commitments, token reservations, backstop commitments, and accounts receivable, but not U.S. dollars. The second any one of these companies decides it wants cash rather than a commitment is when the music stops. Chances are, that company needs cash to pay a commitment of its own, likely involving a lender.
Lenders are the final piece of the puzzle. Contrary to the notion that cash-rich hyperscalers can finance their own data center buildout, there has been a record volume of debt issuance this year from companies such as Oracle and CoreWeave, as well as private credit giants like Blue Owl and Apollo, which are lending into the boom. The debt may not go directly onto hyperscalers’ balance sheets, but their purchase commitments are the collateral against which data center developers, neocloud companies like CoreWeave, and private credit firms raise capital. While debt is not inherently something to shy away from ― it’s how infrastructure gets built ― it’s worth raising eyebrows at the role private credit firms are playing at the center of this revenue-free investment boom. They are exposed to GPU financing and to data center financing, although not to the GPU producers themselves. They have capped upside and unlimited downside. If they stop lending, the risks everywhere else in the sector look far less manageable.

A market correction starts when any one of the AI companies can’t scrounge up the cash to meet its liabilities and can no longer keep borrowing money to delay paying for its leases and its debts. A sudden stop in lending to any of these companies would be a big deal ― it would force AI companies to sell their assets, particularly GPUs, into a potentially adverse market in order to meet refinancing deadlines. A fire sale of GPUs hurts not just the long-term earnings potential of the AI companies themselves, but also producers such as Nvidia and AMD, since even they would be selling their GPUs into a soft market.
For the tech industry, the likely outcome of a market correction is consolidation. Any widespread defaults among AI-related businesses and special purpose vehicles will leave capital assets like GPUs and energy technologies like supercapacitors stranded, losing their market value in the absence of demand ― the perfect targets for a rollup. Indeed, it stands to reason that the tech giants’ dominance over the cloud and web services sectors, not to mention advertising, will allow them to continue leading the market. They can regain monopolistic control over the remaining consumer demand in the AI services sector; their access to more certain cash flows eases their leverage constraints over the longer term as the economy recovers.
A market correction, then, is hardly the end of the tech industry ― but it still leaves a lot of data center investments stranded. What does that mean for the energy buildout that data centers are directly and indirectly financing?
A market correction would likely compel vertically integrated utilities to cancel plans to develop new combined-cycle gas turbines and expensive clean firm resources such as nuclear energy. Developers on wholesale markets have it worse: It’s not clear how new and expensive firm resources compete if demand shrinks. Grid managers would have to call up more expensive units less frequently. Doing so would constrain the revenue-generating potential of those generators relative to the resources that can meet marginal load more cheaply — namely solar, storage, peaker gas, and demand-response systems. Combined-cycle gas turbines co-located with data centers might be stranded; at the very least, they wouldn’t be used very often. (Peaker gas plants, used to manage load fluctuation, might still get built over the medium term.) And the flight to quality and flexibility would consign coal power back to its own ash heaps. Ultimately, a market correction does not change the broader trend toward electrification.
A market correction that stabilizes the data center investment trajectory would make it easier for utilities to conduct integrated resource planning. But it would not necessarily make interconnection queues easier to manage — when phantom projects drop out of the queue, grid planners have to redo all their studies. Regardless of the health of the investment boom, we still need to reform our grid interconnection processes.
The biggest risk is that ratepayers will be on the hook for assets that sit underutilized in the absence of tech companies’ large load requirements, especially ratepayers served by utilities that may be building generation ahead of committed contracts with large-load customers like data center developers. The energy assets they build might remain useful for grid stability and could still participate in capacity markets. But generation assets built close to data center sites to serve those sites cheaply might not be able to provision the broader energy grid cost-efficiently, given the higher grid transport costs incurred when serving more distant sources of load.
These energy projects need not be albatrosses.
Many of the data centers now being planned are in the process of securing permits and grid interconnection rights. Those interconnection rights are scarce and valuable; if a data center gets stranded, policymakers should consider purchasing those rights and incentivizing new businesses or manufacturing industries to build on that land and take advantage of them. Doing so would provide offtake for nearby energy assets and avoid displacing their costs onto other ratepayers. That said, new users of that land may not be able to pay anywhere near as much as hyperscalers could for interconnection or for power. Policymakers seeking to capture value from stranded interconnection points must ensure that new projects pencil out at a lower price point.
Policymakers should also consider backstopping the development of critical and innovative energy projects and the firms contracted to build them. I mean this in the most expansive way possible: Policymakers should not just backstop the completion of the solar and storage assets built to serve new load, but also provide exigent purchase guarantees to the firms that are prototyping the flow batteries, supercapacitors, cooling systems, and uninterruptible power systems that data center developers are increasingly interested in. Without these interventions, a market correction would destroy the value of many of those projects and the earnings potential of their developers, to say nothing of arresting progress on incredibly promising and commercializable technologies.
Policymakers can capture long-term value for the taxpayer by making investments in these distressed projects and developers. This is already what the New York Power Authority has done by taking ownership and backstopping the development of over 7 gigawatts of energy projects ― most of which were at risk of being abandoned by a private sponsor.
The market might not immediately welcome risky bets like these. It is unclear, for instance, what industries could use the interconnection or energy provided to a stranded gigawatt-scale data center. Some of the more promising options ― take aluminum or green steel ― do not have a viable domestic market. Policy uncertainty, tariffs, and tax credit changes in the One Big Beautiful Bill Act have all suppressed the growth of clean manufacturing and metals refining industries like these. The rest of the economy is also deteriorating. The fact that the data center boom is threatened by, at its core, a lack of consumer demand and the resulting unstable investment pathways is itself an ironic miniature of the U.S. economy as a whole.
As analysts at Employ America put it, “The losses in a [tech sector] bust will simply be too large and swift to be neatly offset by an imminent and symmetric boom elsewhere. Even as housing and consumer durables ultimately did well following the bust of the 90s tech boom, there was a one- to two-year lag, as it took time for long-term rates to fall and investors to shift their focus.” This is the issue with having only one growth sector in the economy. And without a more holistic industrial policy, we cannot spur any others.
Questions like these ― questions about what comes next ― suggest that the messy details of data center project finance should not be the sole purview of investors. After all, our exposure to the sector only grows more concentrated by the day. More precisely mapping out how capital flows through the sector should help financial policymakers and industrial policy thinkers understand the risks of a market correction. Political leaders should be prepared to tackle the downside distributional challenges raised by the instability of this data center boom ― challenges to consumer wealth, public budgets, and our energy system.
This sparkling sector is no replacement for industrial policy and macroeconomic investment conditions that create broad-based sources of demand growth and prosperity. But in their absence, policymakers can still treat the challenge of a market correction as an opportunity to think ahead about the nation’s industrial future.
Noon Energy just completed a successful demonstration of its reversible solid-oxide fuel cell.
Whatever you think of as the most important topic in energy right now — whether it’s electricity affordability, grid resilience, or deep decarbonization — long-duration energy storage will be essential to addressing it. While standard lithium-ion batteries are great for smoothing out the ups and downs of wind and solar generation over shorter periods, we’ll need systems that can store energy for days or even weeks to bridge prolonged shifts and fluctuations in weather patterns.
That’s why Form Energy made such a big splash. In 2021, the startup announced its plans to commercialize a 100-plus-hour iron-air battery that charges and discharges by converting iron into rust and back again. The company’s CEO, Mateo Jaramillo, told The Wall Street Journal at the time that this was the “kind of battery you need to fully retire thermal assets like coal and natural gas power plants.” Form went on to raise a $240 million Series D that same year, and is now deploying its very first commercial batteries in Minnesota.
But it’s not the only player in the rarefied space of ultra-long-duration energy storage. While competitor Noon Energy has so far gotten less attention and less funding, it was also raising money four years ago — a more humble $3 million seed round, followed by a $28 million Series A in early 2023. Like Form, it’s targeting a cost of $20 per kilowatt-hour of storage capacity, often considered the threshold at which this type of storage becomes economically viable and materially valuable for the grid.
Last week, Noon announced that it had completed a successful demonstration of its 100-plus-hour carbon-oxygen battery, which charges by breaking down CO2 and discharges by recombining it using a technology known as a reversible solid-oxide fuel cell. The demonstration was partially funded with a grant from the California Energy Commission. The system has three main components: a power block that contains the fuel cell stack, a charge tank, and a discharge tank. During charging, clean electricity flows through the power block, converting carbon dioxide from the discharge tank into solid carbon that gets stored in the charge tank. During discharge, the system recombines stored carbon with oxygen from the air to generate electricity and reform carbon dioxide.
Importantly, Noon’s system is designed to scale up cost-effectively. That’s baked into its architecture, which separates the energy storage tanks from the power generating unit. That makes it simple to increase the total amount of electricity stored independent of the power output, i.e. the rate at which that energy is delivered.
Most other batteries, including lithium-ion and Form’s iron-air system, store energy inside the battery cells themselves. Those same cells also deliver power; thus, increasing the energy capacity of the system requires adding more battery cells, which increases power whether it’s needed or not. Because lithium-ion cells are costly, this makes scaling these systems for multi-day energy storage completely uneconomical.
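A rough sketch makes the scaling argument concrete. The per-unit costs below are placeholder assumptions for illustration only, not vendor pricing:

```python
LI_ION_COST_PER_KWH = 150.0      # assumed all-in cell cost: energy and power come bundled
POWER_BLOCK_COST_PER_KW = 900.0  # assumed reversible fuel cell power block cost
TANK_COST_PER_KWH = 15.0         # assumed carbon/CO2 tank cost per stored kWh

power_kw = 100.0  # hold power output fixed and vary discharge duration

for hours in (4, 24, 100):
    energy_kwh = power_kw * hours
    cell_based = LI_ION_COST_PER_KWH * energy_kwh  # every extra kWh needs more cells
    decoupled = POWER_BLOCK_COST_PER_KW * power_kw + TANK_COST_PER_KWH * energy_kwh
    print(f"{hours:>3} hours: cell-based ${cell_based:>9,.0f} vs. decoupled ${decoupled:>9,.0f}")
```

At four hours the cell-based system is competitive; at 100 hours its cost grows linearly with energy, while the decoupled design pays for its power block once and adds only cheap tank capacity.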
In concept, Noon’s ability to independently scale energy capacity is “similar to pumped hydro storage or a flow battery,” Chris Graves, the startup’s CEO, told me. “But in our case, many times higher energy density than those — 50 times higher than a flow battery, even more so than pumped hydro.” It’s also significantly more energy dense than Form’s battery, he said, likely making it cheaper to ship and install (although the dirt-cheap cost of Form’s materials could offset this advantage).
Noon’s system would be the first grid-scale deployment of reversible solid-oxide fuel cells specifically for long-duration energy storage. While the technology is well understood, historically reversible fuel cells have struggled to operate consistently and reliably, suffering from low round-trip efficiency — meaning that much of the energy used to charge the battery is lost before it’s used — and high overall costs. Graves said Noon has implemented a “really unique twist” on this tech that has allowed it to overcome these barriers and move toward commercialization, but that was as much as he would reveal.
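Round-trip efficiency is simply energy out over energy in. A minimal sketch with illustrative numbers (they are not Noon’s figures):

```python
def round_trip_efficiency(energy_in_kwh: float, energy_out_kwh: float) -> float:
    """Fraction of the charging energy recovered on discharge."""
    return energy_out_kwh / energy_in_kwh

# Illustrative only: charge with 100 kWh, get 45 kWh back on discharge.
rte = round_trip_efficiency(100.0, 45.0)
print(f"Round-trip efficiency: {rte:.0%}")  # 45% -- the other 55 kWh is lost

# A low round trip inflates the effective cost of every delivered kWh:
charging_cost_per_kwh = 0.03  # assumed cheap off-peak or renewable power
print(f"Effective cost of delivered energy: ${charging_cost_per_kwh / rte:.3f}/kWh")
```

This is why low efficiency and high cost have historically kept reversible fuel cells off the grid: every point lost on the round trip makes each delivered kilowatt-hour more expensive.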
Last week’s demonstration, however, is a big step toward validating this approach. “They’re one of the first ones to get to this stage,” Alexander Hogeveen Rutter, a manager at the climate tech accelerator Third Derivative, told me. “There’s certainly many other companies that are working on variants of this,” he said, referring to reversible fuel cell systems overall. But none have done as much to show that the technology can be viable for long-duration storage.
One of Noon’s initial target markets is — surprise, surprise — data centers, where Graves said its system will complement lithium-ion batteries. “Lithium ion is very good for peak hours and fast response times, and our system is complementary in that it handles the bulk of the energy capacity,” Graves explained, saying that Noon could provide up to 98% of a system’s total energy storage needs, with lithium-ion delivering shorter bursts of high power.
Graves expects that initial commercial deployments — projected to come online as soon as next year — will be behind-the-meter, meaning data centers or other large loads will draw power directly from Noon’s batteries rather than the grid. That stands in contrast to Form, which is building projects in tandem with utilities such as Great River Energy in Minnesota and PG&E in California.
Hogeveen Rutter, of Third Derivative, called Noon’s strategy “super logical” given the lengthy grid interconnection queue as well as the recent order from the Federal Energy Regulatory Commission intended to make it easier for data centers to co-locate with power plants. Essentially, he told me, FERC demanded a loosening of the reins. “If you’re a data center or any large load, you can go build whatever you want, and if you just don’t connect to the grid, that’s fine,” Hogeveen Rutter said. “Just don’t bother us, and we won’t bother you.”
Building behind-the-meter also solves a key challenge for ultra-long-duration storage — the fact that in most regions, renewables comprise too small a share of the grid to make long-duration energy storage critical for the system’s resilience. Because fossil fuels still meet the majority of the U.S.’s electricity needs, grids can typically handle a few days without sun or wind. In a world where renewables play a larger role, long-duration storage would be critical to bridging those gaps — we’re just not there yet. But when a battery is paired with an off-grid wind or solar plant, that effectively creates a microgrid with 100% renewables penetration, providing a raison d’être for the long-duration storage system.
“Utility costs are going up often because of transmission and distribution costs — mainly distribution — and there’s a crossover point where it becomes cheaper to just tell the utility to go pound sand and build your power plant,” Richard Swanson, the founder of SunPower and an independent board observer at Noon, told me. Data centers in some geographies might have already reached that juncture. “So I think you’re simply going to see it slowly become cost effective to self generate bigger and bigger sizes in more and more applications and in more and more locations over time.”
As renewables penetration on the grid rises and long-duration storage becomes an increasing necessity, Swanson expects we’ll see more batteries like Noon’s getting grid connected, where they’ll help to increase the grid’s capacity factor without the need to build more poles and wires. “We’re really talking about something that’s going to happen over the next century,” he told me.
Noon’s initial demo has been operational for months, cycling for thousands of hours and achieving discharge durations of over 200 hours. The company is now fundraising for its Series B round, while a larger demo, already built and backed by another California Energy Commission grant, is set to come online soon.
While Graves would not reveal the size of the pilot that’s wrapping up now, this subsequent demo is set to deliver up to 100 kilowatts of power at once while storing 10 megawatt-hours of energy, enough to operate at full power for 100 hours. Noon’s full-scale commercial system is designed to deliver the same 100-hour discharge duration while increasing the power output to 300 kilowatts and the energy storage capacity to 30 megawatt-hours.
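The duration figures follow directly from the specs: hours of discharge equal stored energy divided by power output. A quick check using the numbers above:

```python
def duration_hours(energy_mwh: float, power_mw: float) -> float:
    """Discharge duration at full power: stored energy divided by power output."""
    return energy_mwh / power_mw

print(duration_hours(energy_mwh=10, power_mw=0.100))  # upcoming demo: 100.0 hours
print(duration_hours(energy_mwh=30, power_mw=0.300))  # commercial unit: 100.0 hours
```

Tripling power and energy together leaves the 100-hour duration unchanged; only the ratio matters.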
The standard commercial-scale unit will be shipping container-sized, making it simple to add capacity by deploying additional modules. Noon says it already has a large customer pipeline, though these agreements have yet to be announced. Those deals should come to light soon, as Swanson says this technology represents the “missing link” for achieving full decarbonization of the electricity sector.
Or as Hogeveen Rutter put it, “When people talk about, I’m gonna get rid of all my fossil fuels by 2030 or 2035 — like the United Kingdom and California — well this is what you need to do that.”
On aluminum smelting, Korean nuclear, and a geoengineering database
Current conditions: Winter Storm Fern may have caused up to $115 billion in economic losses and triggered the longest stretch of subzero temperatures in New York City’s history • Temperatures across the American South plunged up to 30 degrees Fahrenheit below historical averages • South Africa’s Northern Cape is roasting in temperatures as high as 104 degrees.

President Donald Trump has been on quite a shopping spree since taking an equity stake in MP Materials, the only active rare earths miner in the U.S., in a deal Heatmap’s Matthew Zeitlin noted made former Biden administration officials “jealous.” The latest stake the administration has taken for the American taxpayer is in USA Rare Earth, a would-be miner that has focused its attention on establishing a domestic manufacturing base for the rare earth-based magnets China dominates. On Monday, the Department of Commerce announced a deal to inject $1.6 billion into the company in exchange for shares. “USA Rare Earth’s heavy critical minerals project is essential to restoring U.S. critical mineral independence,” Secretary of Commerce Howard Lutnick said in a statement. “This investment ensures our supply chains are resilient and no longer reliant on foreign nations.” In a call with analysts Monday, USA Rare Earth CEO Barbara Humpton called the deal “a watershed moment in our work to secure and grow a resilient and independent rare earth value chain based in this country.”
After two years of searching for a site to build the United States’ first new aluminum smelter in half a century, Century Aluminum has abandoned its original plan and opted instead to go into business with a Dubai-based rival developing a plant in Oklahoma. Emirates Global Aluminum announced plans last year to construct a smelter near Tulsa. Under the new plan, Century Aluminum would take a 40% stake in the venture, with Emirates Global Aluminum holding the other 60%. At peak capacity, the smelter would produce 750,000 tons of aluminum per year, a volume The Wall Street Journal noted would make it the largest smelter in the U.S. Emirates Global Aluminum has not yet announced a long-term contract to power the facility. Century Aluminum’s original plan was to source 100% of the smelter’s power from renewables or nuclear, Canary Media reported, and the company received $500 million from the Biden administration to support the project.
The federal Mine Safety and Health Administration has stopped publishing data tied to inspections of sites with repeated violations, E&E News reported. At a hearing before the House Education & the Workforce Subcommittee on Workforce Protections last week, Wayne Palmer, the assistant secretary of labor for mine safety and health, said the data would no longer be made public. “To the best of my knowledge, we do not publish those under the current administration,” Palmer said. He said the decision to not make public results of “targeted inspections” predated his time at the agency. The move comes as the Trump administration is pushing to ramp up mining in the U.S. to compete with China’s near-monopoly over key metals such as rare earths and lithium. As Heatmap’s Katie Brigham wrote in September, “everybody wants to invest in critical minerals.”
South Korea’s center-left Democratic Party has historically been staunchly anti-nuclear. So when the country’s nuclear regulator licensed a new plant earlier this month — its first under a new Democratic president — I counted it as a win for the industry. Now President Lee Jae-myung’s administration is going all in on atomic energy. On Monday, NucNet reported that the state-owned Korea Hydro & Nuclear Power plans to open bidding for sites for two new large reactors. The site selection is set to take up to six months. The country then plans to begin construction in the early 2030s and bring the reactors online in 2037 and 2038. Kim Sung-whan, the country’s climate minister, said the Lee administration would stick to the nuclear buildout plan authored in February 2025 under former President Yoon Suk Yeol, a right-wing leader who strongly supported the atomic power industry before being ousted from power after attempting to declare martial law.
Reflective, a nonprofit group that bills itself as “aiming to radically accelerate the pace of sunlight reflection research,” launched its Uncertainty Database on Monday, with the aim of providing scientists, funders, and policymakers with “an initial foundation to create a transparent, prioritized, stage-gated” roadmap of different technologies to spray aerosols in the atmosphere to artificially cool the planet. “SAI research is currently fragmented and underpowered, with no shared view of which uncertainties actually matter for real-world decisions,” Dakota Gruener, the chief executive of Reflective, said in a statement. “We need a shared, strategic view of what we know, what we don’t, and where research can make the biggest difference. The Uncertainty Database helps the field prioritize the uncertainties and research that matter most for informed decisions about SAI.” The database comes as the push to research geoengineering technologies goes mainstream. As Heatmap’s Robinson Meyer reported in October, Stardust Solutions, a U.S. firm run by former Israeli government physicists, has already raised $60 million in private capital to commercialize technology that many climate activists and scientists still see as taboo to even study.
Often we hear of the carbon-absorbing potential of towering forest trees or fast-growing algae. But nary a word on the humble shrub. New research out of China suggests the bush deserves another look. An experiment in planting shrubs along the edges of western China’s Taklamakan Desert over the past four decades has not only kept desertification at bay, it’s made a dent in carbon emissions from the area. “This is not a rainforest,” King-Fai Li, a physicist at the University of California at Riverside, said in a statement. “It’s a shrubland like Southern California’s chaparral. But the fact that it’s drawing down CO2 at all, and doing it consistently, is something positive we can measure and verify from space.” The study provides a rare, long-term case study of desert greening, since this effort has endured for decades whereas one launched in the Sahara Desert by the United Nations crumbled.
With historic lows projected for the next two weeks — and more snow potentially on the way — the big strain may be yet to come.
Winter Storm Fern made the final stand of its 2,300-mile arc across the United States on Monday as it finished dumping 17 inches of “light, fluffy” snow over parts of Maine. In its wake, the storm has left hundreds of thousands without power, killed more than a dozen people, and driven temperatures to historic lows.
The grid largely held up over the weekend, but the bigger challenge may still be to come. That’s because prolonged low temperatures are forecast across much of the country this week and next, piling strain onto heating and electricity systems already operating at or close to their limits.
What issues there have been were largely due to damage in the transmission and distribution system, i.e. power lines freezing or being brought down by errant branches.
The outages or blackouts that have occurred have been the result of operational issues with plants, scheduled maintenance, or snow specifically affecting the distribution system. As yet there’s been no need for rolling blackouts to relieve grid congestion and preserve the system as a whole. Speaking about the country’s largest electrical grid, Jon Gordon, a director at Advanced Energy United, told Heatmap: “So far, so good.”
But this is all assuming we just get more cold weather. We could be in for another storm. Since late last week, the forecasting model maintained by the European Centre for Medium-Range Weather Forecasts — one of the two primary computer forecasting models, and generally considered more accurate than its American counterpart — has suggested there could be another major winter storm headed toward the Eastern U.S. next weekend. It’s still too early to say with any confidence whether it will hit the Eastern Seaboard, clip it, or stay offshore.
Should that storm hit, here’s what it’ll be barreling into.
Temperatures will likely remain below 0 degrees Fahrenheit across swaths of PJM Interconnection — the country’s largest regional transmission organization, covering the Mid-Atlantic through portions of the Midwest — with parts of Pennsylvania and Ohio not expected to see a day above freezing for the next two weeks.
Put simply, cold temperatures stress the grid. That’s because cold can affect the performance of electricity generators as well as the distribution and production of natural gas, the most commonly used grid fuel. And the longer the grid has to operate under these difficult conditions, the more fragile it gets. And this is all happening while demand for electricity and natural gas is rising.
Forced outages — which happen when power is pulled offline due to some kind of unexpected event or emergency — peaked on Sunday in PJM at just over 17,000 megawatts, while total outages were over 22,000 megawatts on Monday, according to Grid Status’s Tim Ennis, who said some of them may have been due to “ice accumulation across Virginia.”
The market has also been serving more than its own 13-state territory. Already on Saturday — after the fierce cold had set in across its territory but before snow arrived — PJM noted to the Department of Energy that it had been asked to provide up to 3,000 megawatts to neighboring grids, and that it had already seen outages of around 20,000 megawatts — enough to serve 16 million people.
Kentucky, Virginia, and West Virginia reported the highest number of customers without power in the PJM region as of Monday afternoon, largely due to ice and snow that brought down tree branches on power lines or toppled utility poles.
Meanwhile, snow was still falling across New England on Monday afternoon, where parts of Massachusetts have received up to 20 inches. Another 8 inches could still accumulate on the Atlantic coast due to the ongoing lake effect, a common winter pattern in which cold Canadian air picks up moisture over the warmer Great Lakes, resulting in heavy snow downwind.
Though there were minimal blackouts in New England’s electricity market as of Monday morning, natural gas has fallen to just 30% of the grid’s fuel supply, from more than half at the same time a week earlier, with nearly 40% of its electricity output coming from oil-fired plants, Reuters reports. Solar generation peaked at less than a gigawatt on Sunday due to cloud cover, compared to over 4 gigawatts on Saturday and over 3 gigawatts on Friday. During the summer, ISO-NE’s combined behind-the-meter and utility-scale solar production can get as high as 8 gigawatts.
The Department of Energy granted ISO New England emergency permission to operate generators at maximum capacity, regardless of air quality and environmental standards. (It also granted the same dispensation to PJM and Texas’ grid operator, ERCOT.)
The most widespread outages in the country were concentrated in Tennessee, with some 230,000 customers in Nashville Electric Service’s area without power at one point. The disruptions were largely caused not by grid demands, but rather by nearly 100 broken utility poles and more than 70 distribution circuits taken down by the snow and ice, Utility Dive reported.
Mississippi and Louisiana also had outages, with around 4% of Entergy customers in Louisiana offline and around 10% of Entergy customers in Mississippi affected by blackouts, according to Jefferies data. By contrast, Jefferies data shows, less than 1% of Texas electricity customers were offline.
Typically, cold weather means higher natural gas prices, as the demand for home heating goes up alongside demand for electricity. The 44.2 billion cubic feet of natural gas forecast to be burned today would be the fifth-highest January burn of all time in the U.S., Matthew Palmer, executive director at S&P Global Energy, said in an email. The extended cold weather is expected to push natural gas stockpiles to their lowest level since the winter of 2021 to 2022, according to S&P data.
Benchmark natural gas prices have shot up to $6.50 per million British thermal units, up from $5.28 on Friday. Crude oil prices by contrast were down slightly today, while heating oil prices were up around 5%.
High natural gas prices mean that power markets are also expecting higher prices. Day-ahead average wholesale prices in Texas for 9 a.m. were almost $1,500 per megawatt-hour, compared to just $100 in the real-time market. In PJM, average real-time prices were around $270 at 9 a.m., compared to $482 in the day-ahead market.
“The worst is over, but we are expecting bitterly cold temperatures throughout the week. Please continue to avoid unnecessary travel and be vigilant about ice,” New Jersey Governor Mikie Sherrill, who had made electricity prices the centerpiece of her election campaign as well as of her early days in office, said in a statement.
“While the worst of the snow is over, prolonged cold is still expected,” Jefferies analyst Julien Dumoulin-Smith wrote in a note to clients Monday. That can lead to “resource adequacy events,” i.e. blackouts, “as fuel supplies get strained and plants face operational strains from more significant run-time.”
There’s particular pressure and attention during this cold snap on ERCOT, the Texas grid operator, after 2021’s Winter Storm Uri, which brought ice, snow, and below-zero temperatures to much of the state. Natural gas wellheads froze up as much of the system for pumping and distributing natural gas lost power. Power plants were “unprepared for cold weather,” a report from the Federal Energy Regulatory Commission found, “and thus failed in large numbers.” ERCOT had to order power plants to shut down for several days in order to protect the system as a whole from falling perilously out of frequency, which would have risked a complete blackout. Around 60% of the state’s households rely on electricity for heating, and the long freeze-out left 4 million homes and businesses without power. More than 200 people died.
In the intervening years, Texas has introduced new capacity and reforms meant to prevent a similar tragedy. While ERCOT “does not anticipate any reliability issues on the statewide electric grid,” per a spokesperson, the operator flagged for the DOE that low temperatures in the week ahead could raise demand to an “extreme level” that poses “significant risk of emergency conditions that could jeopardize electric reliability and public safety.” So far, though, it’s been holding up, with peak demand expected Monday morning and outages mostly limited to East Texas due to downed power lines.
The Tennessee Valley Authority, which operates a vertically integrated grid centered in Tennessee and spanning several neighboring states, warned of “extreme cold” in the coming days, but said that its generation fleet — which includes coal, natural gas, and nuclear power plants — was “positioned to meet rising demand.” As of Monday morning, TVA said that 12 of the 153 power companies it serves had “distribution issues” related to the storm.
One Mississippi power company in the TVA system said that it had “suffered catastrophic damage” to its distribution system, specifically a 161 kilovolt transmission line operated by the TVA. The cold weather has dealt a double blow to the system, with TVA officials reporting ice on transmission and distribution lines as well as icy conditions making it difficult to service lines in need of repair.
Currently, TVA is forecasting that demand will peak Tuesday at just over 33,000 megawatts, according to EIA data. The system’s all-time winter peak is 35,430 megawatts.
PJM also expects several more days of tight conditions on the grid thanks to forecasted cold weather. The grid operator issued a “maximum generation emergency/load management alert” on Monday morning through at least the end of the day Tuesday, indicating that it needed to maintain high levels of generation throughout the system. It also asked generators for specifics on when any scheduled maintenance would be over in order to more carefully schedule operations to maintain reliability.
Over the weekend, PJM told the Energy Department that peak demand could exceed 130,000 megawatts “for seven straight days, a winter streak that PJM has never experienced.” The grid operator projects peak demand of over 147,000 megawatts on Tuesday, exceeding the previous record of 143,700 megawatts set last January. Demand peaked at 135,000 megawatts on Saturday and 129,000 megawatts on Sunday.