“I am increasingly becoming irrelevant in the public conversation,” says Kate Marvel, a climate scientist who until recently worked at NASA’s Goddard Institute for Space Studies. “And I love it.”
For years, such an exalted state was denied to Marvel. Every week, it seemed, someone — a high-profile politician, maybe, or a CEO — would say something idiotic about climate science. Journalists would dutifully call her to get a rebuttal: Yes, climate change is real, she would say, yes, we’re really certain. The media would print the story. Rinse, repeat.
A few years ago, she told a panel, half as a joke, that her highest professional ambition was not fame or a Nobel Prize but total irrelevance — a moment when climate scientists would no longer have anything useful to tell the public.
That 2020 dream is now her 2023 reality. “It’s incredible,” she told me last week. “Science is no longer even a dominant part of the climate story anymore, and I think that’s great. I think that represents just shattering progress.”
We were talking about a question, a private heresy, I’ve been musing about for some time. Because it’s not just the scientists who have faded into the background — over the past few years, the role of climate science itself has shifted. Gradually, then suddenly, a field once defined by urgent questions and dire warnings has become practical and specialized. So for the past few weeks, I’ve started to ask researchers my big question: Have we reached the end of climate science?
“Science is never done,” Michael Oppenheimer, a professor of geosciences and international affairs at Princeton, told me. “There’s always things that we thought we knew that we didn’t.”
“Your title is provocative, but not without basis,” Katharine Hayhoe, a climate scientist at Texas Tech University and one of the lead authors of the National Climate Assessment, said.
Not necessarily no, then. My question, I always clarified, had a few layers.
Since it first took shape, climate science has sought to answer a handful of big questions: Why does Earth’s temperature change so much across millennia? What role do specific gases play in regulating that temperature? If we keep burning fossil fuels, how bad could it be — and how hot could it get?
The field has now answered those questions to a useful degree. But what’s more, scientists have advocated for, and won widespread acceptance of, the idea that inevitably follows from those answers, which is that humanity must decarbonize its economy as fast as it reasonably can. Climate science, in other words, didn’t just end. It reached its end — its ultimate state, its Really Big Important Point.
In the past few years, the world has begun to accept that Really Big Important Point. Since 2020, the world’s three largest climate polluters — China, the United States, and the European Union — have adopted more aggressive climate policies. Last year, the global clean-energy market cracked $1 trillion in annual investment for the first time; one of every seven new cars sold worldwide is now an electric vehicle. In other words, serious decarbonization — the end of climate science — has begun.
At the same time, climate science has resolved some of its niggling mysteries. When I became a climate reporter in 2015, questions still lingered about just how bad climate change would be. Researchers struggled to understand how clouds or melting permafrost fed back into the climate system; in 2016, a major paper argued that some Antarctic glaciers could collapse by the end of the century, leading to hyper-accelerated sea-level rise within my lifetime.
Today, not all of those questions have been completely put aside. But scientists now have a better grasp of how clouds work, and some of the most catastrophic Antarctic scenarios have been pushed into the next century. In 2020, researchers even made progress on one of the oldest mysteries in climate science — a variable called “climate sensitivity” — for the first time in 41 years.
Does the field have any mysteries left? “I wouldn’t go quite so far as angels dancing on the head of a pin” to describe them, Hayhoe told me. “But in order to act, we already know what we need.”
“I think at the macro level, what we discover [next] is not necessarily going to change policymakers’ decisions, but you could argue that’s been true since the late 90s,” Zeke Hausfather, a climate scientist at Berkeley Earth, agreed.
“Physics didn’t end when we figured out how to do engineering, and now they are both incredibly important,” Marvel said.
Yet across the discipline, you can see researchers switching their focus from learning to building — from physics, as it were, to engineering. Marvel herself left NASA last year to join Project Drawdown, a nonprofit that focuses on emissions reduction. Hausfather now works at Frontier, a tech-industry consortium that studies carbon-removal technology. Even Hayhoe — who trained as a climate scientist — joined a political-science department a decade ago. “I concluded that the biggest barriers to action were not more science,” she said this week.
To fully understand whether climate science has ended, it might help to go back to the very beginning of the field.
By the late 19th century, scientists knew that Earth was incredibly ancient. They also knew that over long enough timescales, the weather in one place changed dramatically. (Even the ancient Greeks and Chinese had noticed misplaced seashores or fossilized bamboo and figured out what they meant.) But only slowly did questions from chemistry, physics, and meteorology congeal into a new field of study.
The first climate scientist, we now know, was Eunice Newton Foote, an amateur inventor and feminist. In 1856, she observed that glass jars filled with carbon dioxide or water vapor trapped more of the sun’s heat than a jar containing dry air. “An atmosphere of that gas,” she wrote of CO₂, “would give to our earth a high temperature.”
But due to her gender and nationality, her work was lost. So the field began instead with the contributions of two Europeans: John Tyndall, an Irish physicist who in 1859 first identified which gases cause the greenhouse effect; and Svante Arrhenius, a Swedish chemist who in 1896 first described Earth’s climate sensitivity, perhaps the discipline’s most important number.
Arrhenius asked: If the amount of CO₂ in the atmosphere were to double, how much would the planet warm? Somewhere from five to six degrees Celsius, he concluded. Although he knew that humanity’s coal consumption was causing carbon pollution, his calculation was a purely academic exercise: We would not double atmospheric CO₂ for another 3,000 years.
In fact, it might take only two centuries. Atmospheric carbon-dioxide levels are now 50 percent higher than they were when the Industrial Revolution began — we are halfway to doubling.
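That “halfway” claim is easy to check with back-of-envelope arithmetic, taking the commonly cited pre-industrial baseline of roughly 280 parts per million (the figures below are round numbers for illustration, not precise measurements):

```python
# Back-of-envelope check: is a 50% rise in CO2 "halfway to doubling"?
# 280 ppm is the commonly cited approximate pre-industrial level.
PREINDUSTRIAL_PPM = 280
current = PREINDUSTRIAL_PPM * 1.5   # 50 percent higher -> 420 ppm
doubled = PREINDUSTRIAL_PPM * 2     # Arrhenius's doubling scenario -> 560 ppm

# Fraction of the added CO2 needed for a doubling that is already in the air
progress = (current - PREINDUSTRIAL_PPM) / (doubled - PREINDUSTRIAL_PPM)
print(f"{current:.0f} ppm is {progress:.0%} of the way to doubling")  # 50%
```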
Not until after World War II did climate science become an urgent field, as nuclear war, the space race, and the birth of environmentalism forced scientists to think about the whole Earth system for the first time — and computers made such a daring thing possible. In the late 1950s and 1960s, the physicists Syukuro Manabe and Richard Wetherald produced the first computer models of the atmosphere, confirming that climate sensitivity was real. (In 2021, Manabe won the Nobel Prize in Physics for that work.) Half a hemisphere away, the oceanographer Charles Keeling used data collected from Hawaii’s Mauna Loa Observatory to show that fossil-fuel use was rapidly increasing the atmosphere’s carbon concentration.
Suddenly, the greenhouse effect — and climate sensitivity — were no longer theoretical. “If the human race survives into the 21st century,” Keeling warned, “the people living then … may also face the threat of climatic change brought about by an uncontrolled increase in atmospheric CO₂ from fossil fuels.”
Faced with a near-term threat, climate science took shape. An ever-growing group of scientists sketched what human-caused climate change might mean for droughts, storms, floods, glaciers, and sea levels. Even oil companies opened climate-research divisions — although they would later hide this fact and fund efforts to discredit the science. In 1979, the MIT meteorologist Jule Charney led a national report concluding that global warming was essentially inevitable. He also estimated climate sensitivity at 1.5 to 4 degrees Celsius, a range that would stand for the next four decades.
“In one sense, we’ve already known enough for over 50 years to do what we have to do,” Hayhoe, the Texas Tech professor, told me. “Some parts of climate science have been simply crossing the T’s and dotting the I’s since then.”
Crossing the T’s and dotting the I’s — such an idea would have made sense to the historian Thomas Kuhn. In his book, The Structure of Scientific Revolutions, he argued that science doesn’t progress in a dependable and linear way, but through spasmodic “paradigm shifts,” when a new theory supplants an older one and casts everything that scientists once knew into doubt. These revolutions are followed by happy doldrums that he called “normal science,” where researchers work to fit their observations of the world into the moment’s dominant paradigm.
By 1988, climate science had advanced to the degree that James Hansen, the head of NASA’s Goddard Institute, could confidently warn the Senate that global warming had begun. A few months later, the United Nations convened the first Intergovernmental Panel on Climate Change, an expert body of scientists asked to report on current scientific consensus.
Yet core scientific questions remained. In the 1990s, the federal scientist Ben Santer and his colleagues provided the first evidence of climate change’s “fingerprint” in the atmosphere — key observations that showed the lower atmosphere was warming in such a way as to implicate carbon dioxide.
By this point, any major scientific questions about climate change were effectively resolved. Paul N. Edwards, a Stanford historian and IPCC author, remembers musing in the early 2000s about whether the IPCC’s physical-science team should pack it up: They had done the job and shown that climate change was real.
Yet climate science had not yet won politically. Santer was harassed over his research; fossil-fuel companies continued to seed lies and doubt about the science for years. Across the West, only some politicians acted as if climate change was real; even the new U.S. president, Barack Obama, could not get a climate law through a liberal Congress in 2010.
It took one final slog for climate science to win. Through the 2010s, scientists ironed out remaining questions around clouds, glaciers, and other runaway feedbacks. “It’s become harder in the last decade to make a publicly skeptical case against mainstream climate science,” Hausfather said. “Part of that is climate science advancing one funeral at a time. But it’s also become so clear and self-evident — and so much of the scientific community supports it — that it’s harder to argue against with any credibility.”
Three years ago, a team of more than two dozen researchers — including Hausfather and Marvel — finally made progress on solving climate science’s biggest outstanding mystery, cutting our uncertainty around climate sensitivity in half. Since 1979, Charney’s estimate had remained essentially unchanged; it was quoted nearly verbatim in the 2013 IPCC report. Now, scientists know that if atmospheric CO₂ were to double, Earth’s temperature would rise 2.6 to 3.9 degrees Celsius.
That’s about as much specificity as we’ll ever need, Hayhoe told me. Now, “we know that climate sensitivity is either bad, really bad, or catastrophic.”
So isn’t climate science over, then? It’s resolved the big uncertainties; it’s even cleared up climate sensitivity. Not quite, Marvel said. She and other researchers described a few areas where science is still vital.
The first — and perhaps most important — is the object that covers two-thirds of Earth’s surface area: the ocean, Edwards told me. Since the 1990s, it has absorbed more than 90% of the excess heat trapped by greenhouse gases, but we still don’t fully understand how it moves that heat around, much less how it will change over the next century.
Researchers also know some theories need to be revisited. “Antarctica is melting way faster than in the models,” Marvel said, which could change the climate much more quickly than previously imagined. And though the runaway collapse of Antarctica now seems less likely, we could be wrong, Oppenheimer reminded me. “The money that we put into understanding Antarctica is a pittance compared to what you would need to truly understand such a big object,” he said.
And these, mind you, are the known unknowns. There’s still the chance that we discover some huge new climatic process out there — at the bottom of the Mariana Trench, perhaps, or at the base of an Antarctic glacier — that has so far eluded us.
Yet in the wildfires of the old climate science, a new field is being born. The scientists I spoke with see three big projects.
First, in the past decade, researchers have gotten much better at attributing individual weather events to climate change. They now know that the Lower 48 states are three times more likely to see a warm February than they would without human-caused climate change, for instance, or that Oregon and Washington’s record-breaking 2021 heat wave was “virtually impossible” without warming. This work will keep improving, Marvel said, and it will help us understand where climate models fail to predict the actual experience of climate change.
Second, scientists want to make the tools of climate science more useful to people at the scales where they live, work, and play. “We just don’t yet have the ability to understand in a detailed way and at a small-enough scale” what climate impacts will look like, Oppenheimer told me. Cities should be able to predict how drought or sea-level rise will affect their bridges or infrastructure. Members of Congress should know what a once-in-a-decade heat wave will look like in their district five, 10, or 20 years hence.
“It’s not so much that we don’t need science anymore; it’s that we need science focused on the questions that are going to save lives,” Oppenheimer said. The task before climate science is to steward humanity through the “treacherous next decades where we are likely to warm through the danger zone of 1.5 degrees.”
That brings us to the third project: that climatologists must create a “smoother interface between physical science and social science,” he said. The Yale economist William Nordhaus won the 2018 Nobel Prize for linking climate science with economics, “but other aspects of the human system are still totally undone.” Edwards wanted to get beyond economics altogether: “We need an anthropology and sociology of climate adaptation,” he said. Marvel, meanwhile, wanted to zoom the lens beyond just people. “We don’t really understand ... what the hell plants do,” she told me. Plants and plankton have absorbed half of all carbon pollution, but it’s unclear whether they’ll keep doing so, or how all that extra carbon has changed the way they might respond to warming.
Economics, sociology, botany, politics — you can begin to see a new field taking shape here, a kind of climate post-science. Rooted in climatology’s theories and ideas, it stretches to embrace the breadth of the Earth system. The climate is everything, after all, and in order to survive an era when human desire has altered the planet’s geology, this new field of study must encompass humanity itself — and all the rest of the Earthly mess.
Nearly a century ago, the philosopher Alexandre Kojève concluded, first, that it was possible for political philosophy to gain a level of absolute knowledge about the world and, second, that it had done so. In the wake of the French Revolution, some fusion of socialism and capitalism would win the day, he concluded, meaning that much of the remaining “work to do” in society lay not in large-scale philosophizing about human nature, but in essentially bureaucratic questions of economic and social governance. So he became a technocrat, and helped design the market entity that later became the European Union.
Is this climate science’s Kojève era? It just may be — but it won’t last forever, Oppenheimer reminded me.
“Generations in the future will still be dealing with this problem,” he said. “Even if we get off fossil fuels, some future idiot genius will invent some other climate-altering substance. We can never put climate aside — it’s part of the responsibility we inherited when we started being clever enough to invent problems like this in the future.”
Giving up on hourly matching by 2030 doesn’t mean giving up on climate ambition — necessarily.
Microsoft celebrated a “milestone achievement” earlier this year, when it announced that it had successfully matched 100% of its 2025 electricity usage with renewable energy. This past week, however, Bloomberg reported that the company was considering delaying or abandoning its next clean energy target set for 2030.
What comes after achieving 100% renewable energy, you might ask? What Microsoft did in 2025 was tally its annual energy consumption and purchase an equal amount of solar and wind power. By 2030, the company aspired to match every kilowatt it consumes with carbon-free electricity hour by hour. That means finding clean power for all the hours when the sun isn’t shining and the wind isn’t blowing.
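The gap between annual and hourly matching can be sketched with toy numbers. The load and generation profiles below are invented for illustration — four “hours” of steady data-center load against solar-heavy purchases:

```python
# Toy illustration of annual vs. hourly clean-energy matching.
# Profiles are invented for illustration, not real Microsoft data.
load      = [10, 10, 10, 10]  # MWh consumed in each hour (constant)
clean_gen = [ 0, 25, 15,  0]  # MWh of clean power bought each hour (sunny midday)

# Annual matching: only the yearly totals have to line up.
annual_matched = min(sum(clean_gen), sum(load)) / sum(load)

# Hourly matching: clean power must cover load in each individual hour;
# a midday surplus cannot paper over a nighttime deficit.
hourly_matched = sum(min(g, l) for g, l in zip(clean_gen, load)) / sum(load)

print(f"annual: {annual_matched:.0%}, hourly: {hourly_matched:.0%}")
```

On these numbers the purchases add up to 100% on an annual basis, yet cover only half the hours — which is exactly the shortfall an hourly target forces a buyer to confront.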
The news that Microsoft is revisiting this goal could be read as the beginning of the end of corporate climate ambition. Microsoft has long been a pioneer on that front, setting increasingly difficult goals and then doing the groundwork to help others follow in its footsteps. Now it appears to be accepting defeat. The news comes just weeks after my colleague Robinson Meyer broke the news that the company is also pausing its industry-leading carbon removal purchasing program.
Delaying or abandoning the clean energy target — the two options presented in the Bloomberg story — represent quite different scenarios, however.
“There’s going to be a big difference between them saying, We’re going to keep trying as hard as we can to go as far as we can, but acknowledge we may not hit it, versus saying, Well, we can’t hit this extremely ambitious goal we set for ourselves, therefore we’re just giving up on the overall mission,” Wilson Ricks, a manager in Clean Air Task Force’s electricity program, told me.
The goal was always going to be difficult, if not impossible, for Microsoft to hit, Ricks said. Yes, it’s gotten tougher as Microsoft’s electricity usage has surged with the rise of artificial intelligence, and because Congress killed subsidies for clean energy as the Trump administration has done its best to stall wind and solar development. But some of the technologies likely needed to achieve the goal, such as advanced nuclear and geothermal power plants, have yet to achieve commercial deployment, let alone reach meaningful scale, and probably won’t by 2030 — especially not across all the regions that Microsoft operates in.
Nonetheless, some clean energy advocates (including Ricks) argue that keeping hourly matching as a north star is paramount because it helps put the world on the path to fully decarbonized electric grids.
Google was the first to introduce a 24/7 carbon-free energy strategy in 2020, and for a moment, it seemed that the rest of the corporate world would follow. A handful of companies joined a coalition to support the goal, but to date, I’m aware of just two — Microsoft and the data storage company Iron Mountain — that have followed Google in committing to achieving it.
Most companies approach their clean energy claims with considerably less precision. The norm is to purchase “unbundled” renewable energy certificates, tradeable vouchers that say a certain amount of renewable energy has been generated somewhere, at some point, and that the certificate owner can lay claim to it. Many simply buy enough of these RECs to cover their annual electricity usage and call themselves “powered by 100% renewable energy.”
There’s a spectrum of quality in the RECs available for purchase, but the market is flooded with cheap, relatively meaningless certificates. A company that operates in a coal-heavy region like Indiana can buy RECs from a wind farm in Texas that was built a decade ago, which won’t do anything to change the makeup of the grid in either place.
Today, the gold standard for companies with capital to throw around is instead to seek out long-term contracts directly with wind and solar developers, known as power purchase agreements. That doesn’t mean the wind and solar farms send power to the companies directly. But these types of contracts are more likely to bring new projects onto the grid by providing guaranteed future revenues, helping developers secure the financing they need to build.
Microsoft started buying unbundled RECs more than a decade ago, and in 2014, it reported it had matched all of its global electricity usage. In 2016, the company began setting goals for direct procurement of renewable energy. In 2020, it pledged to achieve 100% renewable this way by 2025 — but it wasn’t going to sign just any wind or solar agreements. It aimed to pursue contracts with projects that were in the same regions as the company’s operations and that wouldn’t have been built without the company’s support. “Where and how you buy matters,” it wrote in its 2020 sustainability report. “The closer the new wind or solar farm is to your data center, the more likely it is those zero carbon electrons are powering it.”
In 2021, Microsoft upped the ante again by establishing its 2030 hourly matching target, which it referred to as “100/100/0” — 100% of electrons, 100% of the time, zero-carbon energy.
Microsoft has never publicly reported its progress toward the 2030 goal. The company’s enthusiasm for the target has also appeared to wane. In 2020, before Microsoft even made the 100/100/0 commitment, it touted a solution it developed to track and match renewable energy generation and consumption on an hourly basis. In the years since, it has led its peers in investments in round-the-clock nuclear power, even signing a 20-year power purchase agreement with Constellation Energy to bring the shuttered Three Mile Island nuclear plant in Pennsylvania back online.
But Microsoft has stopped publicizing the goal in blog posts and press releases. It went unmentioned in the recent announcement about the 2025 renewable energy achievement, for instance. And a section in the company’s annual sustainability report listing its climate targets that had previously advertised the 2030 goal as “Replacing with 100/100/0 carbon-free energy” was re-written in 2025 as “Expanding carbon-free electricity,” fuzzier rhetoric that now reads as a harbinger of a softer approach.
Microsoft did not respond to questions about its progress toward the 2030 target. In an emailed statement, a spokesperson emphasized the company’s commitment to maintaining its annual matching goal — the one achieved in 2025. No doubt that will take a lot more investment in the years to come now that the company is gobbling up a lot more electricity for data centers — some of it directly from natural gas plants.
Microsoft also shared a statement from Melanie Nakagawa, Microsoft’s chief sustainability officer, emphasizing the company’s commitment to become carbon negative. “At times we may make adjustments to our approach toward our sustainability goals,” she said. “Any adjustments we make are part of our disciplined approach—not a change in our long-term ambition.”
Even if Microsoft axes its hourly matching target, the company might have to start reporting its clean electricity usage on an hourly basis anyway. The Greenhouse Gas Protocol, a nonprofit that sets standards for how companies should calculate their emissions, is currently considering adopting an hourly accounting requirement. While the protocol’s standards are voluntary, companies almost uniformly follow them, and they will soon become mandatory in much of the world, as governments in California and Europe plan to integrate them into corporate disclosure rules.
The accounting rule change is highly controversial, with many companies arguing that it will deter them from investing in clean energy altogether, since their purchases won’t look as good on paper. “I don’t think anybody is debating having rules and guidelines around how you do more narrow matching, we should have that,” Michael Leggett, the co-founder and chief product officer for Ever.Green, a company that sells high-impact RECs, told me. “I think the debate has largely been around, is that required?”
Leggett said he could see how Microsoft’s pullback could be twisted to support either side. Proponents of the hourly accounting method will say, “Aha! See? This is why we have to require it.” Opponents will say, “See, even Microsoft can’t do it, so how are you going to require all these other companies to do it?”
I spoke to Alex Piper, the head of U.S. policy and markets at EnergyTag, a nonprofit that advocates for reforms to enable 24/7 clean energy, who saw the news as vindicating.
“What we’re seeing right now is many of the hyperscale technology companies look to the fastest path to power, and whether it is or not, some of them are turning to gas as that solution,” he told me. Piper argued that companies are choosing natural gas in part because they can get away with clean energy claims under the protocol’s existing rules. “The proposed rules for the greenhouse gas protocol would require those companies to at least be transparent.”
But Microsoft walking back its hourly matching goal does not have to mean that it’s walking back its climate ambition. It’s possible for companies to achieve significant emissions reductions by focusing their clean energy purchases on the places where wind and solar will do the most to displace fossil fuels, rather than worrying about matching every hour. For a company that operates in California, for example, supporting the addition of solar power to a coal-heavy grid — even if it’s in a different part of the country or the world — will do more, faster, than helping to build solar locally or waiting for around-the-clock resources such as geothermal power to come online.
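That trade-off can be made concrete with a rough sketch. The displacement rates below are hypothetical stand-ins, not measured grid data:

```python
# Rough sketch of the displacement argument: the same MWh of new clean
# power avoids more CO2 on a dirty grid than on an already-clean one.
# Both rates are hypothetical, chosen only to illustrate the gap.
MWH_PURCHASED = 100_000  # annual clean energy bought (assumed)

marginal_tons_per_mwh = {
    "coal_heavy_grid": 0.9,   # tons CO2 displaced per MWh (assumed)
    "clean_local_grid": 0.2,  # tons CO2 displaced per MWh (assumed)
}

avoided = {grid: MWH_PURCHASED * rate
           for grid, rate in marginal_tons_per_mwh.items()}

for grid, tons in avoided.items():
    print(f"{grid}: ~{tons:,.0f} tons CO2 avoided")
```

Under these assumed rates, the identical purchase avoids several times more emissions on the coal-heavy grid — the kind of impact an hour-by-hour ledger does not capture.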
Critics of hourly accounting argue that it doesn’t give companies credit for this kind of approach. “What I would love to have happen is anything to incentivize, recognize, and reward companies signing 20-year contracts that enable new projects coming online,” Leggett said of the Greenhouse Gas Protocol’s forthcoming rule change.
Ricks, of Clean Air Task Force, rejects the idea that an hourly accounting requirement would deter these kinds of deals. “That doesn’t mean that they can’t report any other set of numbers they want to,” he said. “Many companies do report things that aren’t currently recognized in the Greenhouse Gas Protocol.”
Microsoft is a prime example. The company includes two measures of its renewable energy usage in its annual reports: “percentage of renewable electricity,” which includes the unbundled RECs Microsoft has continued to buy over the years, and “percentage of direct renewable electricity,” which tracks power purchase agreements and the renewable portion of the grid mix where its facilities are located. The former uses the Greenhouse Gas Protocol’s current accounting method, under which Microsoft says it has hit 100% every year since 2014. But the latter is the company’s own bespoke calculation.
The company’s 2025 feat was based on this bespoke methodology, and it represents the first time Microsoft has announced to the world that it used 100% renewable energy. It never previously made such claims about its REC purchases, as far as I can tell. In other words, Microsoft’s standards for what it publicizes are far more rigorous than what the Greenhouse Gas Protocol requires.
Regardless of what the protocol decides, it will determine only what companies must report. It won’t prevent them from offering up their own, additional metrics of success.
PJM Interconnection has some ideas, as does the state of New Jersey.
We’ve already talked this week about Pennsylvania asking whether the modern “regulatory compact,” which grants utilities monopoly geographical franchises and regulated returns from their capital investments, is still suitable in this era of rising prices and data-center-driven load growth.
Now America’s biggest electricity market and another of that market’s biggest states are considering far-reaching, fundamental reforms that could alter how electricity infrastructure is planned and paid for on behalf of more than 65 million Americans.
New Jersey Governor Mikie Sherrill anchored her 2025 campaign on electricity prices, and for good reason — in the past four years, electricity prices in the state have gone up 48%, according to Heatmap and MIT’s Electricity Price Hub, while average bills have risen from $83 per month to $130. On her first day in office, Sherrill issued two executive orders acting on that promise, directing the state to make funds available to freeze rates and declaring a state of emergency to ease the way to building more generation.
Included in that first order was a review of utility business models to be carried out by state regulators. What that review will entail is now coming into focus.
On Wednesday, the New Jersey Board of Public Utilities issued a statement announcing that it will look specifically at “whether New Jersey’s century-old utility business model — one that rewards electric distribution companies (EDCs) for capital spending even when cheaper alternatives exist — should be replaced with a framework tied to performance, affordability, and long-term cost stability.” In case anyone was still uncertain as to what the outcome of said study might be, the board added that it is “expected to drive the most significant restructuring of utility regulation in New Jersey in decades.”
The current system, the board’s president Christine Guhl-Savoy said at a hearing Thursday, “creates a structural incentive to favor capital-intensive solutions, even when lower-cost, non-wires, or demand-side alternatives may be available.”
This structure, she said, could help explain why “over the past decade, electric delivery charges in New Jersey have risen steadily.” Within the service territory of PSEG, one of the four major New Jersey utilities, distribution charges alone have risen from $19.24 per month in January 2020 (as far back as the Heatmap-MIT data goes) to $21.84 as of April, while transmission charges have risen from around $20 to just over $29 per month. Many critics of the utility business model point to high levels of local grid spending on distribution as a way that utilities pad their earnings with returns harvested from ratepayers.
In the system regulators explored at the hearing, new projects would get a more skeptical look, and utilities’ payouts would be determined in part by whether they hit pre-defined service goals. NJBPU executive director Bob Brabston also indicated that the review process would take a close look at utilities’ regulated returns on equity — echoing his neighbor across the Delaware River, Pennsylvania Governor Josh Shapiro, who wrote in a letter to his state’s utilities earlier this week that these returns must be “transparent” and “justifiable,” and no longer be based on “educated guesses.”
“We want to make sure that the actual cost of equity and the returns on equity are close,” Brabston said Thursday. “We don’t want there to be a significant gap between the cost of equity that you all experience and the returns that the agency awards.”
Meanwhile, in Valley Forge, Pennsylvania, the framework within which New Jersey’s utilities exist is coming in for its own examination.
PJM Interconnection — the nation’s largest electricity market, which covers not just Pennsylvania and New Jersey but also part or all of 11 other states — released an almost 70-page paper Wednesday, in which the organization’s president David Mills wrote that “the current situation is not tenable.”
PJM has been the poster child for a host of issues plaguing the electricity markets across the country, including fast-rising prices, a failure to quickly bring on new generation, and an inability to assure the market’s preferred level of reserve reliability. This set of challenges, Mills said in the paper’s introduction, “reflects something more fundamental than a design that needs recalibration.” Instead, PJM must consider “whether the foundational assumptions of the market remain valid – and if not, what a valid set of assumptions would require.”
The problem with the electricity market, he argued, can be solved by more markets. Right now, when prices shoot up, governments intervene with price caps, suppressing the price signal needed to bring on the new generation that would eventually bring prices down.
To replace that system, the paper proposes three possible models. The first, which it calls “Stabilized Markets,” would allow capacity to be procured for several years at a time outside of the current auction system, so that utilities could make sure their basic needs were covered before going into the annual auctions. This would provide long-term security for new investment.
The second path would be a more fundamental reform. This “Differential Reliability” approach would do away with the “shared reliability compact,” under which all loads must be served by the system at all times. Instead, PJM would “develop the operational and commercial framework to explicitly differentiate reliability,” incentivizing approaches like bring-your-own-generation or curtailing power for new large sources of demand.
The third path is an “Energy Market Transition,” which might also be called the “Texas option.” Following this path, the capacity market would shrink as a portion of revenues earned by generators, and more revenue would come from real-time or near-real-time electricity sales.
While this path isn’t “full Texas” (ERCOT doesn’t have a capacity market at all), it would mean allowing for higher real-time energy prices, a.k.a. “scarcity pricing,” which is arguably the defining feature of the ERCOT system (though even that was scaled back when prices got too high).
“The choices embedded in these paths involve genuine trade-offs, and those trade-offs affect different stakeholders uniquely,” the paper says. If PJM has learned anything in the past few years, it’s that it doesn’t get to make decisions on its own. Those stakeholders will get their say, one way or another.
Big fundraises for Nyobolt and Skeleton Technologies, plus more of the week’s biggest money moves.
Following a quiet week for new deals, the industry is back at it with a bunch of capital flowing into some of the industry’s most active areas. My colleague Alexander C. Kaufman already told you about one of the more buzzworthy announcements from data center-land in Wednesday’s AM newsletter: Wave energy startup Panthalassa raised $140 million in a round led by Peter Thiel to “perform AI inference computing at sea” using nodes powered by the ocean’s waves.
This week also saw fresh funding for more conventional data center infrastructure, as Nyobolt and Skeleton Technologies both announced later-stage rounds for data center backup power solutions. Meanwhile, it turns out Redwood Materials is not the only company bringing in significant capital for second-life EV battery systems — Moment Energy just raised $40 million to pursue a similar approach. Elsewhere, investors backed an effort to rebuild domestic magnesium production, and, in a glimmer of hope for a sector on the outs, gave a boost to green cement startup Terra CO2.
Cambridge-based startup Nyobolt has become the latest battery company to reach a $1 billion valuation, with its expansion into the data center market helping fuel excitement around its tech. Spun out of University of Cambridge research in 2019, the company develops ultra-fast-charging batteries based on a modified lithium-ion chemistry. Its core innovation is an anode made from niobium tungsten oxide, which Nyobolt says enables its batteries to charge to 80% in less than five minutes, with a cycle life that’s 10 times longer than conventional lithium-ion, all without the risk of fire.
The company has now raised a $60 million Series C, following what it describes as a period of “rapid commercial momentum,” with revenue increasing five-fold year-over-year as customers in the robotics and data center industries piled in. Symbotic, an autonomous robotics company and existing customer, led the latest round. While Symbotic previously relied on supercapacitors to power its robots, Nyobolt says its batteries provide six times more energy capacity in a lighter package, allowing Symbotic’s warehouse robots to work for retailers like Walgreens, Target, and Kroger around the clock.
Now the startup is targeting data center customers too, positioning its tech as a fast-acting fix for the sudden power surges common to large-scale artificial intelligence workloads, as well as a temporary backup power solution for outages. While it has no confirmed domestic data center customers to date, it does have a nonbinding agreement with the Indian state of Rajasthan to deploy over 100 megawatts of off-grid AI data center and power management infrastructure, part of a broader push to expand its presence across the country.
Notably, the press release made no mention of plans to sell its tech to electric vehicle automakers, though this appears to have been a central focus previously. As recently as last summer, executive vice president Ramesh Narasimhan told the BBC that he hoped Nyobolt’s batteries would “transform the experience of owning an EV.” But while its tech does enable extremely fast charging, its underlying chemistry is not optimized for long-range driving. A sports car built to test the company’s batteries had a range of just 155 miles. So like many of its climate tech peers, the company appears to be betting that data centers now represent a more reliable opportunity.
This week brought additional news from another European player aiming to smooth out data center power surges. Estonia-based supercapacitor startup Skeleton Technologies raised $39 million in what it describes as the first close of a pre-IPO funding round, with a U.S. listing planned for next year. Its core tech is built around a “curved graphene” structure, which the company likens to a crumpled sheet of paper with a high surface area. The graphene’s many exposed surfaces and edges allow it to hold more electric charge, which Skeleton says delivers a 72% improvement in energy density.
Like Nyobolt, Skeleton says its tech offers faster response times and longer cycle life. But supercapacitors are a fundamentally different technology than Nyobolt’s modified lithium-ion solution. Though they offer near-instantaneous response times, they store very little energy — just enough to smooth out microsecond power spikes in GPU workloads. Nyobolt’s batteries, by contrast, aim not only to smooth out data center power spikes, but also to deliver about 90 seconds of backup power in the case of an outage, before a generator or other backup source kicks in.
Skeleton is already mass-producing supercapacitors in Germany and delivering to unnamed “major U.S. hyperscalers for AI infrastructure.” It’s also making moves to expand its U.S. footprint ahead of its pending IPO, opening an engineering facility in Houston and aiming to begin domestic manufacturing of AI data center solutions in the first half of this year.
Last year brought a wave of new climate tech coalitions, with one of the most ambitious efforts known as the All Aboard Coalition. This group of venture firms is targeting the investment gap known as the missing middle, which falls between early-stage venture rounds and infrastructure funding. The model is relatively mechanical: When three or more member firms participate in a later-stage round for a company, the coalition automatically coinvests out of its own fund, matching the members’ combined contribution.
The group made its first investment in January, supporting the AI-powered geothermal exploration and development company Zanskar’s Series C round. This week, it announced its second: a $22 million commitment to low-carbon cement startup Terra CO2, bringing the company’s Series B total to $147 million. Cement production accounts for roughly 8% of global emissions, a figure Terra aims to shrink by making so-called “supplementary cementitious materials” — which can partially displace traditional cement in concrete mixes — from abundant silicate rocks. By grinding and thermally processing these rocks into a glassy powder, Terra’s product mimics the properties of conventional cement. The company says it can replace up to 50% of the cement in typical concrete mixes, lowering associated emissions by as much as 70%.
The new funding will help Terra build its first commercial-scale plant in Texas, exactly the type of first-of-a-kind project that the coalition was designed to support. But the scale of this challenge remains clear. As noted in ImpactAlpha’s coverage, the coalition has raised just $100 million toward its goal of a $300 million fund — already a relatively modest goal considering the capital intensity of novel infrastructure projects. Bloomberg previously reported that the group aimed to raise the full amount by the end of October 2025, raising questions about the willingness of LPs to bet on projects at this crucial but capital-intensive juncture.
When I think about repurposing used electric vehicle batteries for stationary storage, I think of battery recycling giant Redwood Materials, which raised a $425 million Series E in January after moving aggressively into this promising market. But while Redwood’s well-established recycling business certainly provides it with the largest pipeline of used batteries, it’s far from the only company pursuing this business model. A smaller player with a largely similar approach underscored that this week, when it announced a $40 million Series B to scale its gigafactory in Texas and expand its facilities in British Columbia.
That’s Moment Energy, which focuses on using second-life EV batteries to power commercial and industrial sites such as data centers, hospitals, and factories. Like Redwood, it relies on proprietary software to aggregate battery packs with myriad chemistries and design specs into coordinated grid-scale systems. What the company sees as its critical differentiator, however, is its safety standards. Moment has achieved UL certification, a key safety benchmark that it says others in the industry have yet to meet.
In a shot at its competitors, the company described itself in a press release as the “only provider proven capable of deploying second-life battery storage systems in the built environment without special dispensations or regulatory loopholes.” While Moment never names names, Redwood’s first commercial-scale system sits on its own private land in an open-air setting, where certification is arguably unnecessary. “What most other second life [battery] companies are now trying to say is, let’s just lobby to make second life UL certification easier, because it is impossible to get UL certification, as it stands,” the company’s CEO, Edward Chiang, told TechCrunch. “But at Moment, we say that’s not true. We got it.”
As I wrote last September, it’s a good time to be a critical minerals startup, because as you may have heard, “critical minerals are the new oil.” These materials sit at the center of modern energy infrastructure — batteries, magnets, photovoltaic cells, and electrical wiring, to name just a few uses — plus their supply is concentrated in geopolitically tense regions and subject to extreme price volatility. It also certainly doesn’t hurt that the Trump administration loves them and wants to mine and refine way more of them in the U.S.
The latest beneficiary of this enthusiasm is Magrathea, which this week raised a $24 million Series A to build what it says will be the only new magnesium smelter in the U.S., in Arkansas. The company has now raised over $100 million in total, including a $28 million grant from the Department of Defense. Its approach relies on an electrolysis-based process that’s able to extract pure magnesium from seawater and brines, which it positions as a cleaner, cheaper alternative to the high-heat, emission-intensive method that China uses to produce most of the world’s magnesium today.
The U.S. military has taken note of this potential new domestic supply. Magrathea’s 2022 seed round coincided with Russia’s invasion of Ukraine, as the military looked to scale domestic defense tech supply chains. Magnesium alloys are often used to help reduce weight in EV components, a benefit equally applicable to military helicopters, drones, and next-generation fighter jets. So while these defense applications represent something of a pivot from the startup’s initial focus, a greener fighter jet is still better than a dirty fighter jet.