If it turns out to be a bubble, billions of dollars of energy assets will be on the line.

The data center investment boom has already transformed the American economy. It is now poised to transform the American energy system.
Hyperscalers — including tech giants such as Microsoft and Meta, as well as leaders in artificial intelligence like OpenAI and CoreWeave — are investing eye-watering amounts of capital into developing new energy resources to feed their power-hungry data infrastructure. Those data centers are already straining the existing energy grid, prompting widespread political anxiety over an energy supply crisis and a ratepayer affordability shock. Nothing in recent memory has thrown policymakers’ decades-long underinvestment in the health of our energy grid into such stark relief. The commercial potential of next-generation energy technologies such as advanced nuclear, batteries, and grid-enhancing applications now hinges on the speed and scale of the AI buildout.
But what happens if the AI boom buffers and data center investment collapses? It is not idle speculation to say that the AI boom rests on unstable financial foundations. Worse, however, is the fact that as of this year, the tech sector’s breakneck investment into data centers is the only tailwind to U.S. economic growth. If there is a market correction, there is no other growth sector that could pick up the slack.
A sudden reversal in investor sentiment would not only make stranded assets of the data centers themselves, which would lose value as their lease revenue disappeared; it would also threaten to strand all the energy projects and efficiency innovations that data center demand might have called forth.
If the AI boom does not deliver, we need a backup plan for energy policy.
An analysis of the capital structure of the AI boom suggests that policymakers should be more concerned about the financial fundamentals of data centers and their tenants — the tech companies that are buoying the economy. My recent report for the Center for Public Enterprise, Bubble or Nothing, maps out how the various market actors in the AI sector interact, connecting the market structure of the AI inference sector to the economics of Nvidia’s graphics processing units, the chips known as GPUs that power AI software, and to the data center real estate debt market. Spelling out the core financial relationships illuminates where the vulnerabilities lie.

First and foremost: The business model remains unprofitable. The leading AI companies ― mostly the leading tech companies, as well as some AI-specific firms such as OpenAI and Anthropic ― are all competing with each other to dominate the market for AI inference services such as large language models. None of them is returning a profit on its investments. Back-of-the-envelope math suggests that Meta, Google, Microsoft, and Amazon invested over $560 billion into AI technology and data centers through 2024 and 2025, and have reported revenues of just $35 billion.
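Restated as a two-line calculation, those aggregates work out to only about six cents of reported revenue per dollar invested. A trivial sketch, using the sector-wide totals above rather than any single company’s accounting:

```python
# Back-of-the-envelope restatement of the sector-wide totals cited above.
# These are aggregate estimates, not any single company's reported figures.
capex = 560e9    # estimated AI and data center investment through 2024-2025
revenue = 35e9   # reported AI revenues over the same period

print(f"Revenue per dollar invested: ${revenue / capex:.2f}")  # roughly $0.06
```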
To be sure, many new technology companies remain unprofitable for years ― including now-ubiquitous firms like Uber and Amazon. Profits are not the AI sector’s immediate goal; the sector’s high valuations reflect investors’ assumptions about future earnings potential. But while the losses pile up, the market leaders are all vying to maximize the market share of their virtually identical services ― a prisoner’s dilemma of sorts that forces down prices even as the cost of providing inference services continues to rise. Rising costs, suppressed revenues, and fuzzy measurements of real user demand are, when combined, a toxic cocktail and a reflection of the sector’s inherent uncertainty.
Second: AI companies have a capital investment problem. These are not pure software companies; to provide their inference services, AI companies must all invest in or find ways to access GPUs. In mature industries, capital assets have predictable valuations that their owners can borrow against and use as collateral to invest further in their businesses. Not here: The market value of a GPU is incredibly uncertain and, at least currently, remains suppressed due to the sector’s competitive market structure, the physical deterioration of GPUs at high utilization rates, the unclear trajectory of demand, and the value destruction that comes from Nvidia’s now-yearly release of new high-end GPU models.
The tech industry’s rush to invest in new GPUs means existing GPUs lose market value much faster. Some companies, particularly the vulnerable and debt-saddled “neocloud” companies that buy GPUs to rent out their compute capacity to retail and hyperscaler customers, are taking out tens of billions of dollars of loans to buy new GPUs, backed by the value of their older GPU stock; the danger of this strategy is obvious. Others, including OpenAI and xAI, having realized that GPUs are not safe assets to hold on their own balance sheets, are instead renting them from Oracle and Nvidia, respectively.
To paper over the valuation uncertainty of the GPUs they do own, all the hyperscalers have changed their accounting standards for GPU valuations over the past few years to minimize their annual reported depreciation expenses. Some financial analysts don’t buy it: Last year, Barclays analysts judged GPU depreciation risky enough to merit marking down the earnings estimates of Google (in this case its parent company, Alphabet), Microsoft, and Meta by as much as 10%, arguing that consensus modeling was severely underestimating the earnings write-offs required.
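The accounting lever here is simple: under straight-line depreciation, annual expense is just asset cost divided by assumed useful life, so stretching the assumed life of a GPU fleet directly shrinks the expense hitting each year’s income statement. A minimal sketch, with hypothetical numbers rather than any company’s actual schedule:

```python
# Why useful-life assumptions matter: under straight-line depreciation,
# annual expense = asset cost / assumed useful life (zero salvage value).
# All figures are hypothetical, not any hyperscaler's actual accounting.

def annual_depreciation(cost: float, useful_life_years: int) -> float:
    """Straight-line annual depreciation expense."""
    return cost / useful_life_years

gpu_fleet_cost = 40e9  # hypothetical $40 billion GPU fleet

for life_years in (3, 6):
    expense = annual_depreciation(gpu_fleet_cost, life_years)
    print(f"{life_years}-year assumed life: ${expense / 1e9:.1f}B annual expense")
# Doubling the assumed life from 3 to 6 years halves the reported expense,
# flattering earnings by the same amount each year.
```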
Under these market dynamics, the booming demand for high-end chips looks less like a reflection of healthy growth for the tech sector and more like a scramble for high-value collateral to maintain market position among a set of firms with limited product differentiation. If high demand projections for AI technologies come true, collateral ostensibly depreciates at a manageable pace as older GPUs retain their marketable value over their useful life — but otherwise, this combination of structurally compressed profits and rapidly depreciating collateral is evidence of a snake eating its own tail.
All of these hyperscalers are tenants within data centers. Their lack of cash flow or good collateral should have their landlords worried about “tenant churn,” given the risk that many data center tenants will have to undertake multiple cycles of expensive capital expenditure on GPUs and network infrastructure within a single lease term. Data center developers take out construction (or “mini-perm”) loans of four to six years and refinance them into longer-term permanent loans, which can then be packaged into asset-backed and commercial mortgage-backed securities to sell to a wider pool of institutional investors and banks. The prospect of broken leases and tenant vacancies threatens the long-term solvency of the leading data center developers ― companies like Equinix and Digital Realty ― as well as the livelihoods of the construction contractors and electricians they hire to build their facilities and manage their energy resources.
Much ink has already been spilled on how the hyperscalers are “roundabouting” each other, or engaging in circular financing: They are making billions of dollars of long-term purchase commitments, equity investments, and project co-development agreements with one another. OpenAI, Oracle, CoreWeave, and Nvidia are at the center of this web. Nvidia has invested $100 billion in OpenAI, to be repaid over time through OpenAI’s lease of Nvidia GPUs. Oracle is spending $40 billion on Nvidia GPUs to power a data center it has leased for 15 years to support OpenAI, for which OpenAI is paying Oracle $300 billion over the next five years. OpenAI is paying CoreWeave over the next five years to rent its Nvidia GPUs; the contract is valued at $11.9 billion, and OpenAI has committed to spending at least $4 billion through April 2029. OpenAI already has a $350 million equity stake in CoreWeave. Nvidia has committed to buying CoreWeave’s unsold cloud computing capacity by 2032 for $6.3 billion, after it already took a 7% stake in CoreWeave when the latter went public. If you’re feeling dizzy, count yourself lucky: These deals represent only a fraction of the available examples of circular financing.
These companies are all betting on each other’s growth; their growth projections and purchase commitments are all dependent on their peers’ growth projections and purchase commitments. Optimistically, this roundabouting represents a kind of “risk mutualism,” which, at least for now, ends up supporting greater capital expenditures. Pessimistically, roundabouting is a way for these companies to pay each other for goods and services in any way except cash — shares, warrants, purchase commitments, token reservations, backstop commitments, and accounts receivable, but not U.S. dollars. The music stops the second any one of these companies decides it wants cash rather than a commitment. Chances are, that company needs cash to pay a commitment of its own, likely involving a lender.
Lenders are the final piece of the puzzle. Contrary to the notion that cash-rich hyperscalers can finance their own data center buildout, there has been a record volume of debt issuance this year from companies such as Oracle and CoreWeave, as well as from private credit giants like Blue Owl and Apollo, which are lending into the boom. The debt may not go directly onto hyperscalers’ balance sheets, but their purchase commitments are the collateral against which data center developers, neocloud companies like CoreWeave, and private credit firms raise capital. While debt is not inherently something to shy away from ― it’s how infrastructure gets built ― it’s worth raising eyebrows at the role private credit firms are playing at the center of this revenue-free investment boom. They are exposed to both GPU financing and data center financing, though not to the GPU producers themselves. They have capped upside and unlimited downside. If they stop lending, the rest of the sector’s risks look far more acute.

A market correction starts when any one of the AI companies can’t scrounge up the cash to meet its liabilities and can no longer keep borrowing money to delay paying for its leases and its debts. A sudden stop in lending to any of these companies would be a big deal ― it would force AI companies to sell their assets, particularly GPUs, into a potentially adverse market in order to meet refinancing deadlines. A fire sale of GPUs hurts not just the long-term earnings potential of the AI companies themselves, but also producers such as Nvidia and AMD, since even they would be selling their GPUs into a soft market.
For the tech industry, the likely outcome of a market correction is consolidation. Any widespread defaults among AI-related businesses and special purpose vehicles will leave capital assets like GPUs and energy technologies like supercapacitors stranded, losing their market value in the absence of demand ― the perfect targets for a rollup. Indeed, it stands to reason that the tech giants’ dominance over the cloud and web services sectors, not to mention advertising, will allow them to continue leading the market. They can regain monopolistic control over the remaining consumer demand in the AI services sector; their access to more certain cash flows eases their leverage constraints over the longer term as the economy recovers.
A market correction, then, is hardly the end of the tech industry ― but it still leaves a lot of data center investments stranded. What does that mean for the energy buildout that data centers are directly and indirectly financing?
A market correction would likely compel vertically integrated utilities to cancel plans to develop new combined-cycle gas turbines and expensive clean firm resources such as nuclear energy. Developers on wholesale markets have it worse: It’s not clear how new and expensive firm resources compete if demand shrinks. Grid managers would have to call up more expensive units less frequently. Doing so would constrain the revenue-generating potential of those generators relative to the resources that can meet marginal load more cheaply — namely solar, storage, peaker gas, and demand-response systems. Combined-cycle gas turbines co-located with data centers might be stranded; at the very least, they wouldn’t be used very often. (Peaker gas plants, used to manage load fluctuation, might still get built over the medium term.) And the flight to quality and flexibility would consign coal power back to its own ash heaps. Ultimately, a market correction does not change the broader trend toward electrification.
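The mechanism squeezing expensive firm resources is merit-order dispatch: grid operators call on the cheapest available units first, so when load shrinks, the costliest units are the first to sit idle. A stylized sketch, with hypothetical costs and capacities:

```python
# Stylized merit-order dispatch: run the cheapest units first until load
# is met. Costs ($/MWh) and capacities (MW) below are hypothetical.

PLANTS = [
    ("solar + storage", 5, 400),
    ("combined-cycle gas", 45, 300),
    ("peaker gas", 60, 200),
    ("coal", 70, 300),
]

def dispatch(load_mw: float) -> list[tuple[str, float]]:
    """Dispatch plants in ascending cost order until load is served."""
    served = []
    for name, cost, capacity in sorted(PLANTS, key=lambda p: p[1]):
        if load_mw <= 0:
            break
        used = min(capacity, load_mw)
        served.append((name, used))
        load_mw -= used
    return served

print(dispatch(1100))  # data-center-scale load: even coal gets called
print(dispatch(700))   # smaller load: peakers and coal earn nothing
```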
A market correction that stabilizes the data center investment trajectory would make it easier for utilities to conduct integrated resource planning. But it would not necessarily simplify grid planners’ work on their interconnection queues — when phantom projects drop out of the queue, planners have to redo all their studies. Regardless of the health of the investment boom, we still need to reform our grid interconnection processes.
The biggest risk is that ratepayers will be on the hook for assets that sit underutilized in the absence of tech companies’ large load requirements, especially where utilities are building new generation ahead of committed contracts with large-load customers like data center developers. The energy assets they build might remain useful for grid stability and could still participate in capacity markets. But generation assets built close to data center sites to serve those sites cheaply might not be able to provision the broader energy grid cost-efficiently, due to the higher grid transport costs incurred when serving more distant sources of load.
These energy projects need not be albatrosses.
Many of the data centers now being planned are in the process of securing permits and grid interconnection rights. Those interconnection rights are scarce and valuable; if a data center gets stranded, policymakers should consider purchasing those rights and incentivizing new businesses or manufacturing industries to build on that land and take advantage of them. Doing so would provide offtake for nearby energy assets and avoid displacing their costs onto other ratepayers. That being said, new users of that land may not be able to pay anywhere near as much as hyperscalers could for interconnection or for power. Policymakers seeking to capture value from stranded interconnection points must ensure that new projects pencil out at a lower price point.
Policymakers should also consider backstopping the development of critical and innovative energy projects and the firms contracted to build them. I mean this in the most expansive way possible: Policymakers should not just backstop the completion of the solar and storage assets built to serve new load, but also provide exigent purchase guarantees to the firms that are prototyping the flow batteries, supercapacitors, cooling systems, and uninterruptible power systems that data center developers are increasingly interested in. Without these interventions, a market correction would otherwise destroy the value of many of those projects and the earnings potential of their developers, to say nothing of arresting progress on incredibly promising and commercializable technologies.
Policymakers can capture long-term value for the taxpayer by making investments in these distressed projects and developers. This is already what the New York Power Authority has done by taking ownership and backstopping the development of over 7 gigawatts of energy projects ― most of which were at risk of being abandoned by a private sponsor.
The market might not immediately welcome risky bets like these. It is unclear, for instance, what industries could use the interconnection or energy provided to a stranded gigawatt-scale data center. Some of the more promising options ― take aluminum or green steel ― do not have a viable domestic market. Policy uncertainty, tariffs, and tax credit changes in the One Big Beautiful Bill Act have all suppressed the growth of clean manufacturing and metals refining industries like these. The rest of the economy is also deteriorating. The fact that the data center boom is threatened by, at its core, a lack of consumer demand and the resulting unstable investment pathways is itself an ironic miniature of the U.S. economy as a whole.
As analysts at Employ America put it, “The losses in a [tech sector] bust will simply be too large and swift to be neatly offset by an imminent and symmetric boom elsewhere. Even as housing and consumer durables ultimately did well following the bust of the 90s tech boom, there was a one- to two-year lag, as it took time for long-term rates to fall and investors to shift their focus.” This is the issue with having only one growth sector in the economy. And without a more holistic industrial policy, we cannot spur any others.
Questions like these ― questions about what comes next ― suggest that the messy details of data center project finance should not be the sole purview of investors. After all, our exposure to the sector only grows more concentrated by the day. More precisely mapping out how capital flows through the sector should help financial policymakers and industrial policy thinkers understand the risks of a market correction. Political leaders should be prepared to tackle the downside distributional challenges raised by the instability of this data center boom ― challenges to consumer wealth, public budgets, and our energy system.
This sparkling sector is no replacement for industrial policy and macroeconomic investment conditions that create broad-based sources of demand growth and prosperity. But in their absence, policymakers can still treat the challenge of a market correction as an opportunity to think ahead about the nation’s industrial future.
Giving up on hourly matching by 2030 doesn’t mean giving up on climate ambition — necessarily.
Microsoft celebrated a “milestone achievement” earlier this year, when it announced that it had successfully matched 100% of its 2025 electricity usage with renewable energy. This past week, however, Bloomberg reported that the company was considering delaying or abandoning its next clean energy target set for 2030.
What comes after achieving 100% renewable energy, you might ask? What Microsoft did in 2025 was tally its annual energy consumption and purchase an equal amount of solar and wind power. By 2030, the company aspired to match every kilowatt it consumes with carbon-free electricity hour by hour. That means finding clean power for all the hours when the sun isn’t shining and the wind isn’t blowing.
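The difference between the two accounting standards is easy to state precisely: annual matching compares yearly totals, while hourly matching requires clean supply to cover consumption in every individual hour. A toy illustration with made-up numbers:

```python
# Annual vs. hourly matching over four sample hours (made-up numbers).
consumption = [100, 100, 100, 100]  # MWh consumed each hour
solar =       [  0, 220, 220,   0]  # MWh of clean power procured each hour

# Annual matching: do the yearly totals line up? (440 >= 400: yes)
annual_match = sum(solar) >= sum(consumption)

# Hourly matching: is every single hour covered? (hours 1 and 4: no)
hourly_match = all(s >= c for s, c in zip(solar, consumption))

print(f"Annually matched: {annual_match}, hourly matched: {hourly_match}")
```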
The news that Microsoft is revisiting this goal could be read as the beginning of the end of corporate climate ambition. Microsoft has long been a pioneer on that front, setting increasingly difficult goals and then doing the groundwork to help others follow in its footsteps. Now it appears to be accepting defeat. The news comes just weeks after my colleague Robinson Meyer broke the news that the company is also pausing its industry-leading carbon removal purchasing program.
Delaying or abandoning the clean energy target — the two options presented in the Bloomberg story — represent quite different scenarios, however.
“There’s going to be a big difference between them saying, We’re going to keep trying as hard as we can to go as far as we can, but acknowledge we may not hit it, versus saying, Well, we can’t hit this extremely ambitious goal we set for ourselves, therefore we’re just giving up on the overall mission,” Wilson Ricks, a manager in Clean Air Task Force’s electricity program, told me.
The goal was always going to be difficult, if not impossible, for Microsoft to hit, Ricks said. Yes, it’s gotten tougher as Microsoft’s electricity usage has surged with the rise of artificial intelligence, and because Congress killed subsidies for clean energy as the Trump administration has done its best to stall wind and solar development. But some of the technologies likely needed to achieve the goal, such as advanced nuclear and geothermal power plants, have yet to achieve commercial deployment, let alone reach meaningful scale, and probably won’t by 2030 — especially not across all the regions that Microsoft operates in.
Nonetheless, some clean energy advocates (including Ricks) argue that keeping hourly matching as a north star is paramount because it helps put the world on the path to fully decarbonized electric grids.
Google was the first to introduce a 24/7 carbon-free energy strategy in 2020, and for a moment, it seemed that the rest of the corporate world would follow. A handful of companies joined a coalition to support the goal, but to date, I’m aware of just two — Microsoft and the data storage company Iron Mountain — that have followed Google in committing to achieving it.
Most companies approach their clean energy claims with considerably less precision. The norm is to purchase “unbundled” renewable energy certificates, tradeable vouchers that say a certain amount of renewable energy has been generated somewhere, at some point, and that the certificate owner can lay claim to it. Many simply buy enough of these RECs to cover their annual electricity usage and call themselves “powered by 100% renewable energy.”
There’s a spectrum of quality in the RECs available for purchase, but the market is flooded with cheap, relatively meaningless certificates. A company that operates in a coal-heavy region like Indiana can buy RECs from a wind farm in Texas that was built a decade ago, which won’t do anything to change the makeup of the grid in either place.
Today, the gold standard for companies with capital to throw around is instead to seek out power purchase agreements, long-term contracts signed directly with wind and solar developers. That doesn’t mean the wind and solar farms send power to the companies directly. But these types of contracts are more likely to bring new projects onto the grid by providing guaranteed future revenues, helping developers secure the financing they need to build.
Microsoft started buying unbundled RECs more than a decade ago, and in 2014, it reported it had matched all of its global electricity usage. In 2016, the company began setting goals for direct procurement of renewable energy. In 2020, it pledged to achieve 100% renewable energy this way by 2025 — but it wasn’t going to sign just any wind or solar agreements. It aimed to pursue contracts with projects that were in the same regions as the company’s operations and that wouldn’t have been built without the company’s support. “Where and how you buy matters,” it wrote in its 2020 sustainability report. “The closer the new wind or solar farm is to your data center, the more likely it is those zero carbon electrons are powering it.”
In 2021, Microsoft upped the ante again by establishing its 2030 hourly matching target, which it referred to as “100/100/0” — 100% of electrons, 100% of the time, zero-carbon energy.
Microsoft has never publicly reported its progress toward the 2030 goal. The company’s enthusiasm for the target has also appeared to wane. In 2020, before Microsoft even made the 100/100/0 commitment, it touted a solution it developed to track and match renewable energy generation and consumption on an hourly basis. In the years since, it has led its peers in investments in round-the-clock nuclear power, even signing a 20-year power purchase agreement with Constellation Energy to bring the shuttered Three Mile Island nuclear plant in Pennsylvania back online.
But Microsoft has stopped publicizing the goal in blog posts and press releases. It went unmentioned in the recent announcement about the 2025 renewable energy achievement, for instance. And a section in the company’s annual sustainability report listing its climate targets that had previously advertised the 2030 goal as “Replacing with 100/100/0 carbon-free energy” was re-written in 2025 as “Expanding carbon-free electricity,” fuzzier rhetoric that now reads as a harbinger of a softer approach.
Microsoft did not respond to questions about its progress toward the 2030 target. In an emailed statement, a spokesperson emphasized the company’s commitment to maintaining its annual matching goal — the one achieved in 2025. No doubt that will take a lot more investment in the years to come now that the company is gobbling up a lot more electricity for data centers — some of it directly from natural gas plants.
Microsoft also shared a statement from Melanie Nakagawa, Microsoft’s chief sustainability officer, emphasizing the company’s commitment to become carbon negative. “At times we may make adjustments to our approach toward our sustainability goals,” she said. “Any adjustments we make are part of our disciplined approach—not a change in our long-term ambition.”
Even if Microsoft axes its hourly matching target, the company might have to start reporting its clean electricity usage on an hourly basis anyway. The Greenhouse Gas Protocol, a nonprofit that sets standards for how companies should calculate their emissions, is currently considering adopting an hourly accounting requirement. While the protocol’s standards are voluntary, companies almost uniformly follow them, and they will soon become mandatory in much of the world, as governments in California and Europe plan to integrate them into corporate disclosure rules.
The accounting rule change is highly controversial, with many companies arguing that it will deter them from investing in clean energy altogether, since their purchases won’t look as good on paper. “I don’t think anybody is debating having rules and guidelines around how you do more narrow matching, we should have that,” Michael Leggett, the co-founder and chief product officer for Ever.Green, a company that sells high-impact RECs, told me. “I think the debate has largely been around, is that required?”
Leggett said he could see how Microsoft’s pullback could be twisted to support either side. Proponents of the hourly accounting method will say, “Aha! See? This is why we have to require it.” Opponents will say, “See, even Microsoft can’t do it, so how are you going to require all these other companies to do it?”
I spoke to Alex Piper, the head of U.S. policy and markets at EnergyTag, a nonprofit that advocates for reforms to enable 24/7 clean energy, who saw the news as vindicating.
“What we’re seeing right now is many of the hyperscale technology companies look to the fastest path to power, and whether it is or not, some of them are turning to gas as that solution,” he told me. Piper argued that companies are choosing natural gas in part because they can get away with clean energy claims under the protocol’s existing rules. “The proposed rules for the greenhouse gas protocol would require those companies to at least be transparent.”
But Microsoft walking back its hourly matching goal does not have to mean that it’s walking back its climate ambition. It’s possible for companies to achieve significant emissions reductions by focusing their clean energy purchases on the places where wind and solar will do the most to displace fossil fuels, rather than worrying about matching every hour. For a company that operates in California, for example, supporting the addition of solar power to a coal-heavy grid — even if it’s in a different part of the country or the world — will do more, faster, than helping to build solar locally or waiting for around-the-clock resources such as geothermal power to come online.
Critics of hourly accounting argue that it doesn’t give companies credit for this kind of approach. “What I would love to have happen is anything to incentivize, recognize, and reward companies signing 20-year contracts that enable new projects coming online,” Leggett said of the Greenhouse Gas Protocol’s forthcoming rule change.
Ricks, of Clean Air Task Force, rejects the idea that an hourly accounting requirement would deter these kinds of deals. “That doesn’t mean that they can’t report any other set of numbers they want to,” he said. “Many companies do report things that aren’t currently recognized in the Greenhouse Gas Protocol.”
Microsoft is a prime example. The company includes two measures of its renewable energy usage in its annual reports: “percentage of renewable electricity,” which includes the unbundled RECs Microsoft has continued to buy over the years, and “percentage of direct renewable electricity,” which tracks power purchase agreements and the renewable portion of the grid mix where its facilities are located. The former uses the Greenhouse Gas Protocol’s current accounting method, under which Microsoft says it has hit 100% every year since 2014. But the latter is the company’s own bespoke calculation.
The company’s 2025 feat was based on this made-up methodology, and it represents the first time Microsoft has announced to the world that it used 100% renewable energy. It never previously made such claims about its REC purchases, as far as I can tell. In other words, Microsoft’s standards for what it publicizes are far more rigorous than what the Greenhouse Gas Protocol requires.
Regardless of what the protocol decides, it will determine only what companies must report. It won’t prevent them from offering up their own, additional metrics of success.
PJM Interconnection has some ideas, as does the state of New Jersey.
We’ve already talked this week about Pennsylvania asking whether the modern “regulatory compact,” which grants utilities monopoly geographical franchises and regulated returns from their capital investments, is still suitable in this era of rising prices and data-center-driven load growth.
Now America’s biggest electricity market and another of that market’s biggest states are considering far-reaching, fundamental reforms that could alter how electricity infrastructure is planned and paid for on behalf of over 65 million Americans.
New Jersey Governor Mikie Sherrill anchored her 2025 campaign on electricity prices, and for good reason — in the past four years, electricity prices in the state have gone up 48%, according to Heatmap and MIT’s Electricity Price Hub, while average bills have risen from $83 per month to $130. On her first day in office, Sherrill issued two executive orders acting on that promise, directing the state to make funds available to freeze rates and declaring a state of emergency to ease the way to building more generation.
Included in that first order was a review of utility business models to be carried out by state regulators. What that review will entail is now coming into focus.
On Wednesday, the New Jersey Board of Public Utilities issued a statement announcing that it will look specifically at “whether New Jersey’s century-old utility business model — one that rewards electric distribution companies (EDCs) for capital spending even when cheaper alternatives exist — should be replaced with a framework tied to performance, affordability, and long-term cost stability.” In case anyone was still unsure as to what the outcome of said study might be, the board added that it is “expected to drive the most significant restructuring of utility regulation in New Jersey in decades.”
The current system, the board’s president Christine Guhl-Savoy said at a hearing Thursday, “creates a structural incentive to favor capital intensive solutions, even when lower costs, non-wires or demand side alternatives may be available.”
This structure, she said, could help explain why “over the past decade, electric delivery charges in New Jersey have risen steadily.” Within the service territory of PSEG, one of the four major New Jersey utilities, distribution charges alone have risen from $19.24 per month in January 2020 (as far back as the Heatmap-MIT data goes) to $21.84 as of April, while transmission charges have risen from around $20 to just over $29 per month. Many critics of the utility business model point to high levels of local grid spending on distribution as a way that utilities pad their earnings with returns harvested from ratepayers.
In the system regulators explored at the hearing, new projects would get a more skeptical look, and utility payouts would be partially determined by whether utilities hit pre-defined service goals. NJBPU executive director Bob Brabston also indicated that the review process would take a close look at utilities’ regulated returns on equity — echoing his neighbor across the Delaware River, Pennsylvania Governor Josh Shapiro, who wrote in a letter to his state’s utilities earlier this week that these returns must be “transparent” and “justifiable,” and no longer be based on “educated guesses.”
“We want to make sure that the actual cost of equity and the returns on equity are close,” Brabston said Thursday. “We don’t want there to be a significant gap between the cost of equity that you all experience and the returns that the agency awards.”
Meanwhile, in Valley Forge, Pennsylvania, the framework within which New Jersey’s utilities exist is coming in for its own examination.
PJM Interconnection — the nation’s largest electricity market, which covers not just Pennsylvania and New Jersey but also part or all of 11 other states — released an almost 70-page paper Wednesday, in which the organization’s president David Mills wrote that “the current situation is not tenable.”
PJM has been the poster child for a host of issues plaguing the electricity markets across the country, including fast-rising prices, a failure to quickly bring on new generation, and an inability to assure the market’s preferred level of reserve reliability. This set of challenges, Mills said in the paper’s introduction, “reflects something more fundamental than a design that needs recalibration.” Instead, PJM must consider “whether the foundational assumptions of the market remain valid – and if not, what a valid set of assumptions would require.”
The problem with the electricity market, he argued, can be solved by more markets. Right now, when prices shoot up, governments intervene with price caps, suppressing the market signal necessary to bring on sufficient generation that would bring down prices.
To replace that system, the paper proposes three possible models. The first, which it calls “Stabilized Markets,” would allow capacity to be procured for several years at a time outside of the current auction system, so that utilities could make sure their basic needs were covered before they go into the annual auctions. This would provide long-term security for new investment.
The second path would be a more fundamental reform. This “Differential Reliability” approach would do away with the “shared reliability compact,” under which all loads must be served by the system at all times. Instead, PJM would “develop the operational and commercial framework to explicitly differentiate reliability,” incentivizing approaches like bring-your-own-generation or curtailing power to new large sources of demand.
The third path is an “Energy Market Transition,” which might also be called the “Texas option.” Following this path, the capacity market would shrink as a portion of revenues earned by generators, and more revenue would come from real-time or near-real-time electricity sales.
While this path isn’t “full Texas” (ERCOT doesn’t have a capacity market at all), it would mean allowing for higher prices for energy in real time, a.k.a. “scarcity pricing,” which is arguably the defining feature of the ERCOT system (though even that was scaled back when prices got too high).
“The choices embedded in these paths involve genuine trade-offs, and those trade-offs affect different stakeholders uniquely,” the paper says. If PJM has learned anything in the past few years, it’s that it doesn’t get to make decisions on its own. Those stakeholders will get their say, one way or another.
Big fundraises for Nyobolt and Skeleton Technologies, plus more of the week’s biggest money moves.
Following a quiet week for new deals, the industry is back at it, with capital flowing into some of its most active areas. My colleague Alexander C. Kaufman already told you about one of the more buzzworthy announcements from data center-land in Wednesday’s AM newsletter: Wave energy startup Panthalassa raised $140 million in a round led by Peter Thiel to “perform AI inference computing at sea” using nodes powered by the ocean’s waves.
This week also saw fresh funding for more conventional data center infrastructure, as Nyobolt and Skeleton Technologies both announced later-stage rounds for data center backup power solutions. Meanwhile, it turns out Redwood Materials is not the only company bringing in significant capital for second-life EV battery systems — Moment Energy just raised $40 million to pursue a similar approach. Elsewhere, investors backed an effort to rebuild domestic magnesium production, and, in a glimmer of hope for a sector on the outs, gave a boost to green cement startup Terra CO2.
Cambridge-based startup Nyobolt has become the latest battery company to reach a $1 billion valuation, with its expansion into the data center market helping fuel excitement around its tech. Spun out of University of Cambridge research in 2019, the company develops ultra-fast-charging batteries based on a modified lithium-ion chemistry. Its core innovation is an anode made from niobium tungsten oxide, which Nyobolt says enables its batteries to charge to 80% in less than five minutes, with a cycle life that’s 10 times longer than conventional lithium-ion, all without the risk of fire.
The company has now raised a $60 million Series C, following what it describes as a period of “rapid commercial momentum,” with revenue increasing five-fold year-over-year as customers in the robotics and data center industries piled in. Symbotic, an autonomous robotics company and existing customer, led the latest round. While Symbotic previously relied on supercapacitors to power its robots, Nyobolt says its batteries provide six times more energy capacity in a lighter package, allowing Symbotic’s warehouse robots to work for retailers like Walgreens, Target, and Kroger around the clock.
Now the startup is targeting data center customers too, positioning its tech as a fast-acting fix for the sudden power surges common to large-scale artificial intelligence workloads, as well as a temporary backup power solution for outages. While it has no confirmed domestic data center customers to date, it does have a nonbinding agreement with the Indian state of Rajasthan to deploy over 100 megawatts of off-grid AI data center and power management infrastructure, part of a broader push to expand its presence across the country.
Notably, the press release made no mention of plans to sell its tech to electric vehicle automakers, though this appears to have been a central focus previously. As recently as last summer, executive vice president Ramesh Narasimhan told the BBC that he hoped Nyobolt’s batteries would “transform the experience of owning an EV.” But while its tech does enable extremely fast charging, its underlying chemistry is not optimized for long-range driving. A sports car built to test the company’s batteries had just a 155-mile range. So like many of its climate tech peers, the company appears to be betting that data centers now represent a more reliable opportunity.
This week brought additional news from another European player aiming to smooth out data center power surges. Estonia-based supercapacitor startup Skeleton Technologies raised $39 million in what it describes as the first close of a pre-IPO funding round, with a U.S. listing planned for next year. Its core tech is built around a “curved graphene” structure, which the company likens to a crumpled sheet of paper with a high surface area. The graphene’s many exposed surfaces and edges allow it to hold more electric charge, which Skeleton says delivers a 72% improvement in energy density.
Like Nyobolt, Skeleton says its tech offers faster response times and longer cycle life. But supercapacitors are a fundamentally different technology than Nyobolt’s modified lithium-ion solution. Though they offer near-instantaneous response times, they store very little energy — just enough to smooth out microsecond power spikes in GPU workloads. Nyobolt’s batteries, by contrast, aim not only to smooth out data center power spikes, but also to deliver about 90 seconds of backup power in the case of an outage, before a generator or other backup source kicks in.
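The scale of that 90-second bridge is straightforward arithmetic (energy is power times duration), though the facility size in the sketch below is hypothetical:

```python
# Rough bridging-energy arithmetic: energy = power x duration.
# The facility size is hypothetical; the 90-second window is Nyobolt's claim.
facility_mw = 100           # hypothetical data center load in megawatts
bridge_seconds = 90         # backup window before a generator takes over

mwh_needed = facility_mw * bridge_seconds / 3600  # convert seconds to hours
print(f"{mwh_needed:.1f} MWh to bridge {bridge_seconds} s at {facility_mw} MW")  # 2.5 MWh
# Supercapacitors smooth microsecond spikes but cannot store this much energy.
```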
Skeleton is already mass-producing supercapacitors in Germany and delivering to unnamed “major U.S. hyperscalers for AI infrastructure.” It’s also making moves to expand its U.S. footprint ahead of its pending IPO, opening an engineering facility in Houston and aiming to begin domestic manufacturing of AI data center solutions in the first half of this year.
Last year brought a wave of new climate tech coalitions, one of the most ambitious being the All Aboard Coalition. This group of venture firms is targeting the investment gap known as the missing middle, which falls between early-stage venture rounds and infrastructure funding. The model is relatively mechanical: When three or more member firms participate in a later-stage round for a company, the coalition automatically coinvests out of its own fund, matching the members’ combined contribution.
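Mechanical enough, in fact, that the trigger can be written down in a few lines; the rule below is as described publicly, while the firm names and check sizes are hypothetical:

```python
# The coalition's coinvestment trigger as described: when three or more
# member firms join a round, the coalition matches their combined checks.
# Firm names and amounts are hypothetical.

def coalition_match(member_checks: dict[str, float]) -> float:
    """Return the coalition's automatic coinvestment for one round."""
    if len(member_checks) >= 3:
        return sum(member_checks.values())
    return 0.0

round_checks = {"Firm A": 4e6, "Firm B": 3e6, "Firm C": 4e6}
print(f"Coalition coinvests: ${coalition_match(round_checks):,.0f}")  # $11,000,000
```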
The group made its first investment in January, supporting the AI-powered geothermal exploration and development company Zanskar’s Series C round. This week, it announced its second: a $22 million commitment to low-carbon cement startup Terra CO2, bringing the company’s Series B total to $147 million. Cement production accounts for roughly 8% of global emissions, a figure Terra aims to shrink by making so-called “supplementary cementitious materials” — which can partially displace traditional cement in concrete mixes — from abundant silicate rocks. By grinding and thermally processing these rocks into a glassy powder, Terra’s product mimics the properties of conventional cement. The company says it can replace up to 50% of the cement in typical concrete mixes, lowering associated emissions by as much as 70%.
The new funding will help Terra build its first commercial-scale plant in Texas, exactly the type of first-of-a-kind project that the coalition was designed to support. But the scale of this challenge remains clear. As noted in ImpactAlpha’s coverage, the coalition has raised just $100 million toward its target of a $300 million fund — already a relatively modest sum considering the capital intensity of novel infrastructure projects. Bloomberg previously reported that the group aimed to secure the full amount by the end of October 2025, raising questions about the willingness of LPs to bet on projects at this crucial but capital-intensive juncture.
When I think about repurposing used electric vehicle batteries for stationary storage, I think of battery recycling giant Redwood Materials, which raised a $425 million Series E in January after moving aggressively into this promising market. But while Redwood’s well-established recycling business certainly provides it with the largest pipeline of used batteries, it’s far from the only company pursuing this business model. A smaller player with a largely similar approach underscored that this week, when it announced a $40 million Series B to scale its gigafactory in Texas and expand its facilities in British Columbia.
That’s Moment Energy, which focuses on using second-life EV batteries to power commercial and industrial sites such as data centers, hospitals, and factories. Like Redwood, it relies on proprietary software to aggregate battery packs with myriad chemistries and design specs into coordinated grid-scale systems. What the company sees as its critical differentiator, however, is its safety standards. Moment has achieved UL certification, a key safety benchmark that it says others in the industry have yet to meet.
In a shot at its competitors, the company described itself in a press release as the “only provider proven capable of deploying second-life battery storage systems in the built environment without special dispensations or regulatory loopholes.” While Moment never names names, Redwood’s first commercial-scale system sits on its own private land in an open-air setting, where certification is arguably unnecessary. “What most other second life [battery] companies are now trying to say is, let’s just lobby to make second life UL certification easier, because it is impossible to get UL certification, as it stands,” the company’s CEO, Edward Chiang, told TechCrunch. “But at Moment, we say that’s not true. We got it.”
As I wrote last September, it’s a good time to be a critical minerals startup, because as you may have heard, “critical minerals are the new oil.” These materials sit at the center of modern energy infrastructure — batteries, magnets, photovoltaic cells, and electrical wiring, to name just a few uses — plus their supply is concentrated in geopolitically tense regions and subject to extreme price volatility. It also certainly doesn’t hurt that the Trump administration loves them and wants to mine and refine way more of them in the U.S.
The latest beneficiary of this enthusiasm is Magrathea, which this week raised a $24 million Series A to build what it says will be the only new magnesium smelter in the U.S., in Arkansas. The company has now raised over $100 million in total, including a $28 million grant from the Department of Defense. Its approach relies on an electrolysis-based process that’s able to extract pure magnesium from seawater and brines, which it positions as a cleaner, cheaper alternative to the high-heat, emission-intensive method that China uses to produce most of the world’s magnesium today.
The U.S. military has taken note of this potential new domestic supply. Magrathea’s 2022 seed round coincided with Russia’s invasion of Ukraine, as the military looked to scale domestic defense tech supply chains. Magnesium alloys are often used to reduce weight in EV components, a benefit equally applicable to military helicopters, drones, and next-generation fighter jets. So while these defense applications represent something of a pivot from the startup’s initial focus, a greener fighter jet is still better than a dirty fighter jet.