“I am increasingly becoming irrelevant in the public conversation,” says Kate Marvel, a climate scientist who until recently worked at NASA’s Goddard Institute for Space Studies. “And I love it.”
For years, such an exalted state was denied to Marvel. Every week, it seemed, someone — a high-profile politician, maybe, or a CEO — would say something idiotic about climate science. Journalists would dutifully call her to get a rebuttal: Yes, climate change is real, she would say, yes, we’re really certain. The media would print the story. Rinse, repeat.
A few years ago, she told a panel, half as a joke, that her highest professional ambition was not fame or a Nobel Prize but total irrelevance — a moment when climate scientists would no longer have anything useful to tell the public.
That 2020 dream is now her 2023 reality. “It’s incredible,” she told me last week. “Science is no longer even a dominant part of the climate story anymore, and I think that’s great. I think that represents just shattering progress.”
We were talking about a question, a private heresy, that I’ve been musing on for some time. Because it’s not just the scientists who have faded into the background — over the past few years, the role of climate science itself has shifted. Gradually, then suddenly, a field once defined by urgent questions and dire warnings has become practical and specialized. So for the past few weeks, I’ve been asking researchers my big question: Have we reached the end of climate science?
“Science is never done,” Michael Oppenheimer, a professor of geosciences and international affairs at Princeton, told me. “There’s always things that we thought we knew that we didn’t.”
“Your title is provocative, but not without basis,” Katharine Hayhoe, a climate scientist at Texas Tech University and one of the lead authors of the National Climate Assessment, said.
Not necessarily no, then. My question, I always clarified, had a few layers.
Since it first took shape, climate science has sought to answer a handful of big questions: Why does Earth’s temperature change so much across millennia? What role do specific gases play in regulating that temperature? If we keep burning fossil fuels, how bad could it be — and how hot could it get?
The field has now answered those questions about as fully as is useful. What’s more, scientists have advocated for, and won widespread acceptance of, the idea that inevitably follows from those answers: that humanity must decarbonize its economy as fast as it reasonably can. Climate science, in other words, didn’t just end. It reached its end — its ultimate state, its Really Big Important Point.
In the past few years, the world has begun to accept that Really Big Important Point. Since 2020, the world’s three largest climate polluters — China, the United States, and the European Union — have adopted more aggressive climate policies. Last year, the global clean-energy market cracked $1 trillion in annual investment for the first time; one of every seven new cars sold worldwide is now an electric vehicle. In other words, serious decarbonization — the end of climate science — has begun.
At the same time, climate science has resolved some of its niggling mysteries. When I became a climate reporter in 2015, questions still lingered about just how bad climate change would be. Researchers struggled to understand how clouds or melting permafrost fed back into the climate system; in 2016, a major paper argued that some Antarctic glaciers could collapse by the end of the century, leading to hyper-accelerated sea-level rise within my lifetime.
Today, not all of those questions have been completely put aside. But scientists now have a better grasp of how clouds work, and some of the most catastrophic Antarctic scenarios have been pushed into the next century. In 2020, researchers even made progress on one of the oldest mysteries in climate science — a variable called “climate sensitivity” — for the first time in 41 years.
Does the field have any mysteries left? “I wouldn’t go quite so far as angels dancing on the head of a pin” to describe them, Hayhoe told me. “But in order to act, we already know what we need.”
“I think at the macro level, what we discover [next] is not necessarily going to change policymakers’ decisions, but you could argue that’s been true since the late 90s,” Zeke Hausfather, a climate scientist at Berkeley Earth, agreed.
“Physics didn’t end when we figured out how to do engineering, and now they are both incredibly important,” Marvel said.
Yet across the discipline, you can see researchers switching their focus from learning to building — from physics, as it were, to engineering. Marvel herself left NASA last year to join Project Drawdown, a nonprofit that focuses on emissions reduction. Hausfather now works at Frontier, a tech-industry consortium that studies carbon-removal technology. Even Hayhoe — who trained as a climate scientist — joined a political-science department a decade ago. “I concluded that the biggest barriers to action were not more science,” she said this week.
To fully understand whether climate science has ended, it might help to go back to the very beginning of the field.
By the late 19th century, scientists knew that Earth was incredibly ancient. They also knew that over long enough timescales, the weather in one place changed dramatically. (Even the ancient Greeks and Chinese had noticed misplaced seashores or fossilized bamboo and figured out what they meant.) But only slowly did questions from chemistry, physics, and meteorology congeal into a new field of study.
The first climate scientist, we now know, was Eunice Newton Foote, an amateur inventor and feminist. In 1856, she observed that glass jars filled with carbon dioxide or water vapor trapped more of the sun’s heat than a jar containing dry air. “An atmosphere of that gas,” she wrote of CO₂, “would give to our earth a high temperature.”
But due to her gender and nationality, her work was lost. So the field began instead with the contributions of two Europeans: John Tyndall, an Irish physicist who in 1859 first identified which gases cause the greenhouse effect; and Svante Arrhenius, a Swedish chemist who in 1896 first described Earth’s climate sensitivity, perhaps the discipline’s most important number.
Arrhenius asked: If the amount of CO₂ in the atmosphere were to double, how much would the planet warm? Somewhere from five to six degrees Celsius, he concluded. Although he knew that humanity’s coal consumption was causing carbon pollution, his calculation was a purely academic exercise: We would not double atmospheric CO₂ for another 3,000 years.
In fact, it might take only two centuries. Atmospheric carbon-dioxide levels are now 50 percent higher than they were when the Industrial Revolution began — we are halfway to doubling.
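To see what “halfway” means here, take the usual round figures of roughly 280 parts per million of CO₂ before industrialization and about 420 parts per million today; neither number appears in this story, so treat this as a back-of-the-envelope illustration rather than a precise accounting:

$$\frac{420 - 280}{560 - 280} \;=\; \frac{140}{280} \;=\; \frac{1}{2}$$

And because a greenhouse gas’s heat-trapping effect grows roughly with the logarithm of its concentration, a 50 percent increase is already worth slightly more than half of a doubling’s warming effect (ln 1.5 / ln 2 ≈ 0.58), a standard result, though not one the article spells out.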
Not until after World War II did climate science become an urgent field, as nuclear war, the space race, and the birth of environmentalism forced scientists to think about the whole Earth system for the first time — and computers made such a daring thing possible. In the late 1950s and 1960s, the physicists Syukuro Manabe and Richard Wetherald produced the first computer models of the atmosphere, confirming that climate sensitivity was real. (In 2021, Manabe shared the Nobel Prize in Physics for that work.) Half a hemisphere away, the oceanographer Charles Keeling used data collected from Hawaii’s Mauna Loa Observatory to show that fossil-fuel use was rapidly increasing the atmosphere’s carbon concentration.
Suddenly, the greenhouse effect — and climate sensitivity — were no longer theoretical. “If the human race survives into the 21st century,” Keeling warned, “the people living then … may also face the threat of climatic change brought about by an uncontrolled increase in atmospheric CO₂ from fossil fuels.”
Faced with a near-term threat, climate science took shape. An ever-growing group of scientists sketched what human-caused climate change might mean for droughts, storms, floods, glaciers, and sea levels. Even oil companies opened climate-research divisions — although they would later hide this fact and fund efforts to discredit the science. In 1979, the MIT meteorologist Jule Charney led a national report concluding that global warming was essentially inevitable. He also estimated climate sensitivity at 1.5 to 4.5 degrees Celsius, a range that would stand for the next four decades.
“In one sense, we’ve already known enough for over 50 years to do what we have to do,” Hayhoe, the Texas Tech professor, told me. “Some parts of climate science have been simply crossing the T’s and dotting the I’s since then.”
Crossing the T’s and dotting the I’s — such an idea would have made sense to the historian Thomas Kuhn. In his book The Structure of Scientific Revolutions, he argued that science doesn’t progress in a dependable and linear way, but through spasmodic “paradigm shifts,” when a new theory supplants an older one and casts everything that scientists once knew into doubt. These revolutions are followed by happy doldrums that he called “normal science,” when researchers work to fit their observations of the world into the moment’s dominant paradigm.
By 1988, climate science had advanced to the degree that James Hansen, the head of NASA’s Goddard Institute, could confidently warn the Senate that global warming had begun. A few months later, the United Nations convened the first Intergovernmental Panel on Climate Change, an expert body of scientists asked to report on current scientific consensus.
Yet core scientific questions remained. In the 1990s, the federal scientist Ben Santer and his colleagues provided the first evidence of climate change’s “fingerprint” in the atmosphere — key observations that showed the lower atmosphere was warming in such a way as to implicate carbon dioxide.
By this point, any major scientific questions about climate change were effectively resolved. Paul N. Edwards, a Stanford historian and IPCC author, remembers musing in the early 2000s about whether the IPCC’s physical-science team should pack it up: They had done the job and shown that climate change was real.
Yet climate science had not yet won politically. Santer was harassed over his research; fossil-fuel companies continued to seed lies and doubt about the science for years. Across the West, only some politicians acted as if climate change was real; even the new U.S. president, Barack Obama, could not get a climate law through a liberal Congress in 2010.
It took one final slog for climate science to win. Through the 2010s, scientists ironed out remaining questions about clouds, glaciers, and other potentially runaway feedbacks. “It’s become harder in the last decade to make a publicly skeptical case against mainstream climate science,” Hausfather said. “Part of that is climate science advancing one funeral at a time. But it’s also become so clear and self-evident — and so much of the scientific community supports it — that it’s harder to argue against with any credibility.”
Three years ago, a team of more than two dozen researchers — including Hausfather and Marvel — finally made progress on solving climate science’s biggest outstanding mystery, cutting our uncertainty around climate sensitivity in half. Since 1979, Charney’s estimate had remained essentially unchanged; it was quoted nearly verbatim in the 2013 IPCC report. Now, scientists know that if atmospheric CO₂ were to double, Earth’s temperature would rise 2.6 to 3.9 degrees Celsius.
That’s about as much specificity as we’ll ever need, Hayhoe told me. Now, “we know that climate sensitivity is either bad, really bad, or catastrophic.”
So isn’t climate science over, then? It’s resolved the big uncertainties; it’s even cleared up climate sensitivity. Not quite, Marvel said. She and other researchers described a few areas where science is still vital.
The first — and perhaps most important — is the object that covers two-thirds of Earth’s surface area: the ocean, Edwards told me. Since the 1990s, it has absorbed more than 90% of the excess heat caused by greenhouse gases, but we still don’t understand how it formed, much less how it will change over the next century.
Researchers also know some theories need to be revisited. “Antarctica is melting way faster than in the models,” Marvel said, which could change the climate much more quickly than previously imagined. And though the runaway collapse of Antarctica now seems less likely, we could be wrong, Oppenheimer reminded me. “The money that we put into understanding Antarctica is a pittance compared to what you would need to truly understand such a big object,” he said.
And these, mind you, are the known unknowns. There’s still the chance that we discover some huge new climatic process out there — at the bottom of the Mariana Trench, perhaps, or at the base of an Antarctic glacier — that has so far eluded us.
Yet in the wildfires of the old climate science, a new field is being born. The scientists I spoke with see three big projects.
First, in the past decade, researchers have gotten much better at attributing individual weather events to climate change. They now know that the Lower 48 states are three times more likely to see a warm February than they would be without human-caused climate change, for instance, or that Oregon and Washington’s record-breaking 2021 heat wave was “virtually impossible” without warming. This work will keep improving, Marvel said, and it will help us understand where climate models fail to predict the actual experience of climate change.
Second, scientists want to make the tools of climate science more useful to people at the scales where they live, work, and play. “We just don’t yet have the ability to understand in a detailed way and at a small-enough scale” what climate impacts will look like, Oppenheimer told me. Cities should be able to predict how drought or sea-level rise will affect their bridges or infrastructure. Members of Congress should know what a once-in-a-decade heat wave will look like in their district five, 10, or 20 years hence.
“It’s not so much that we don’t need science anymore; it’s that we need science focused on the questions that are going to save lives,” Oppenheimer said. The task before climate science is to steward humanity through the “treacherous next decades where we are likely to warm through the danger zone of 1.5 degrees.”
That brings us to the third project: Climatologists must create a “smoother interface between physical science and social science,” he said. The Yale economist William Nordhaus recently won a Nobel Prize for linking climate science with economics, “but other aspects of the human system are still totally undone.” Edwards wanted to get beyond economics altogether: “We need an anthropology and sociology of climate adaptation,” he said. Marvel, meanwhile, wanted to zoom the lens out beyond just people. “We don’t really understand ... what the hell plants do,” she told me. Plants and plankton have absorbed half of all carbon pollution, but it’s unclear whether they’ll keep doing so, or how all that extra carbon will change their response to warming.
Economics, sociology, botany, politics — you can begin to see a new field taking shape here, a kind of climate post-science. Rooted in climatology’s theories and ideas, it stretches to embrace the breadth of the Earth system. The climate is everything, after all, and in order to survive an era when human desire has altered the planet’s geology, this new field of study must encompass humanity itself — and all the rest of the Earthly mess.
Nearly a century ago, the philosopher Alexandre Kojève concluded, first, that it was possible for political philosophy to attain absolute knowledge about the world and, second, that it had done so. In the wake of the French Revolution, some fusion of socialism and capitalism would win the day, he concluded, meaning that much of the remaining “work to do” in society lay not in large-scale philosophizing about human nature but in essentially bureaucratic questions of economic and social governance. So he became a technocrat and helped design the market entity that later became the European Union.
Is this climate science’s Kojève era? It just may be — but it won’t last forever, Oppenheimer reminded me.
“Generations in the future will still be dealing with this problem,” he said. “Even if we get off fossil fuels, some future idiot genius will invent some other climate-altering substance. We can never put climate aside — it’s part of the responsibility we inherited when we started being clever enough to invent problems like this in the future.”
Bloom Energy is riding the data center wave to new heights.
Fuel cells are back — or at least one company’s are.
Bloom Energy, the longtime standard-bearer of the fuel cell industry, has seen its share of ups and downs before. Following its 2018 IPO, its stock price shot up to over $34 before falling to under $3 a share in October 2019, then soared to over $42 in the COVID-era market euphoria before falling again to under $10 in 2024. Its market capitalization has bounced around just as much: It hit an all-time low of less than $1 billion in 2019, slid further in early 2020 after the already unprofitable company was forced to restate years of earnings because of an accounting error, then climbed to more than $7 billion in 2021 amid a surge of interest in backup power.
The stock began soaring (again) in the middle of last year as anything and everything plausibly connected to artificial intelligence was going vertical. Today, Bloom Energy is trading at more than $111 a share, with a market cap north of $26 billion — and that’s after a dramatic fall from its all-time high price of over $135 per share, reached in November. By contrast, Southwest Airlines is worth around $22 billion; Edison International, the parent company of Southern California Edison, is worth about $22.5 billion.
This is all despite Bloom recording regular losses according to generally accepted accounting principles, although its quarterly revenue has risen by over 50%, and its reported non-GAAP and adjusted margins and profits have grown considerably. The company has signed deals or deployed its fuel cells with Oracle, the utility AEP, Amazon Web Services, gas providers, the network infrastructure company Equinix, the real estate developer Brookfield, and the artificial intelligence infrastructure company CoreWeave, Bloom’s chief executive and founder, KR Sridhar, said in its October earnings call.
While fuel cells have been pitched for decades as a way to safely use hydrogen for energy, they can also run on natural gas or biogas, a flexibility the company has seized on as a way to ride the data center boom. Bloom leadership has said that the company will double its manufacturing capacity by the end of this year, which it says will “support” a projected four-fold increase in annual revenue. “The AI build-outs and their power demands are making on-site power generated by natural gas a necessity,” Sridhar said during the earnings call.
To get a sense of how euphoric the perception of Bloom Energy has become, consider Morgan Stanley: It bumped its price target from $44 a share to $85 on September 16, then just over a month later bumped it again to $155, calling the company “one of our favorite ‘time to power’ stocks given its available capacity and near-term expansion plans.”
Bloom has also won plaudits from semiconductor and data center industry analysts. The research firm SemiAnalysis described Bloom’s fuel cells as “a fairly niche solution [that] is now taking an increasingly large share of the pie.”
It’s been a long journey from green tech darling to AI infrastructure for Bloom Energy — and for fuel cells as a technology.
Bloom was founded in 2001, originally as Ion America, and quickly attracted high-profile Silicon Valley investors. By 2010, fuel cells (and Bloom) were still being pitched as the generation source of the future, with The New York Times reporting that year that Bloom had “spent nearly a decade developing a new variety of solid oxide fuel cell, considered the most efficient but most technologically challenging fuel-cell technology.” That product launch followed some $400 million in funding, and Bloom would hit an almost $3 billion valuation in 2011.
By 2016, however, when the company first filed with the Securities and Exchange Commission to sell shares to the public, it was being described by the Wall Street Journal as “a once-ballyhooed alternative energy startup,” in an article that said the fuel cell industry had been an “elusive target for decades, with a succession of companies unable to realize its business potential.” The company finally went public in 2018 at a valuation of $1.6 billion.
Then came the AI boom.
Fuel cells don’t use combustion to generate power; instead, they combine oxygen ions with hydrogen derived from natural gas, which still produces carbon dioxide and water but without the particulate pollution of other forms of fossil-fuel-based electricity generation. This makes the process of getting permits from the Environmental Protection Agency “significantly smoother and easier than that of combustion generators,” SemiAnalysis wrote in a report.
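For solid oxide cells like Bloom’s, the net chemistry when the fuel is natural gas (mostly methane) works out, as a textbook simplification rather than a description of Bloom’s proprietary design, to the same overall reaction as burning the gas, only carried out electrochemically across a ceramic electrolyte:

$$\mathrm{CH_4} + 2\,\mathrm{O_2} \;\longrightarrow\; \mathrm{CO_2} + 2\,\mathrm{H_2O}$$

That is why carbon dioxide still leaves the exhaust even though nothing is burned.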
In today’s context, Bloom’s fuel cells are yet another on-site, behind-the-meter natural gas power solution for data centers. “The rapid expansion of AI data centers in the U.S. is colliding with grid bottlenecks, driving operators to adopt BTM generation for speed-to-power and resilience,” Jefferies analyst Dushyant Ailani wrote in a note to clients. “Natural gas reciprocating engines, batteries, and Bloom fuel cells are emerging as a preferred solution due to their modularity, fast deployment, and ability to handle volatile AI workloads.”
SemiAnalysis estimates that capital expenditures for Bloom fuel cells are substantially higher than those for gas turbines on a per-kilowatt basis — $3,000 to $4,000 per kilowatt for fuel cells, compared with $1,500 to $2,500 for turbines. But where the company excels is in speed. “The big turbines are sold out for four or five years,” Maheep Mandloi, an analyst at Mizuho Securities, told me. “The smaller ones for behind the meter for one to two years. These guys can deliver, if needed, within 90 days.”
Like other data center-related companies, Bloom has faced some local opposition, though not a debilitating amount. In Hilliard, Ohio, the state siting board overrode concerns about the deployment of more than 200 fuel cells at an AWS facility.
Bloom is also far from the only company that has realigned itself to ride the AI wave. Caterpillar, which makes simple turbine systems largely for the oil and gas industry, has become a data center darling, while the major turbine manufacturers Mitsubishi, Siemens Energy, and GE Vernova have all seen dramatic increases in their stock price in the last year. Korean industrial conglomerate Doosan is now developing a new large-scale turbine. Even the supersonic jet startup Boom is developing a gas turbine for data centers.
While artificial intelligence — or at least artificial intelligence companies — promises unforeseen technological and scientific advancements, so far it’s being powered by the technological and scientific advancements of the past.
On AI forecasts, California bills, and Trump’s fusion push
Current conditions: The intense rain pummeling Southern California since the start of the new year has subsided, but not before boosting Los Angeles’ total rainfall for the wet season that started in October a whopping 343% above the historical average • The polar vortex freezing the Great Lakes and Northeast is moving northward, allowing temperatures in Chicago to rise nearly 20 degrees Fahrenheit • The heat wave in southern Australia is set to send temperatures soaring above 113 degrees.

It’s not the kind of thing anyone a decade ago would have imagined: a communiqué signed by most of Western Europe’s preeminent powers condemning Washington’s efforts to seize territory from a fellow NATO ally. But in the days since the United States launched a surprise raid on Venezuela and arrested its longtime leader Nicolás Maduro, President Donald Trump has stepped up his public lobbying of Denmark to cede sovereignty over Greenland to the U.S. Senator Thom Tillis, the North Carolina Republican, and Senator Jeanne Shaheen, the Democrat from New Hampshire, put out a rare bipartisan statement criticizing the White House’s pressure campaign on Denmark, “one of our oldest and most reliable allies.” While Stephen Miller, Trump’s hard-line deputy chief of staff, declined to rule out an invasion of Greenland during a TV appearance this week, The Wall Street Journal reported Tuesday that Secretary of State Marco Rubio told lawmakers that the goal of the administration’s recent threats against the autonomously governed Arctic island was to press Denmark into a sale.
The U.S. unsuccessfully tried acquiring Greenland multiple times during the 20th century, and occupied the island during World War II to prevent the Nazis from gaining a North American foothold after Denmark fell in the blitzkrieg. Indeed, Washington purchased the U.S. Virgin Islands, its second-largest Caribbean territory, from Denmark in 1917, two decades after the Spanish-American War brought Puerto Rico under American control. But the national-security logic of taking Greenland now, when the U.S. already maintains a military base there, is difficult to parse. “Greenland already is in the U.S. sphere of influence,” Columbia University political scientist Elizabeth N. Saunders wrote in a post on Bluesky. “It’s far cheaper for the U.S., in material, security, and reputational terms, to have Denmark continue administering Greenland and work within NATO on security.” One potential reason Trump might want the territory, as Heatmap’s Jael Holzman wrote last fall, is to access Greenland’s mineral wealth. But the logistics of getting rare earths out of both the ground and the Arctic to refineries in the U.S. are challenging. Meanwhile, in other imperialistic activities, Trump said Tuesday evening in a post on Truth Social that Venezuela would cede between 30 million and 50 million barrels of oil to the U.S., though the legal mechanism for such a transfer remains murky, according to The New York Times.
I told you last month about the in-house market monitor at the PJM Interconnection, the country’s largest power grid, urging federal regulators to prevent more data centers from coming online within its territory until it can sort out how to reliably supply them with electricity. As Heatmap’s Matthew Zeitlin wrote days later, “everyone wants to know PJM’s data center plan.” On Tuesday, E&E News reported that PJM is expected to ratchet down its forecasts for how much power demand artificial intelligence will add on the East Coast. When the grid operator’s latest analysis of future needs comes out later this month, the projections for mid-2027 will be “appreciably lower” than the current forecast, PJM Chief Operating Officer Stu Bresler said during a call last month.
The merger of the parent company of Trump’s Truth Social website and the nuclear fusion developer TAE Technologies, as I reported in this newsletter last month, is “flabbergasting” to analysts. And yet the pair’s partnership is advancing. On Tuesday, the companies announced that site selection was underway for a pilot-scale power plant set to begin construction later this year. The first facility would generate just 50 megawatts of electricity. But the companies said future plants are expected to pump out as much as 500 megawatts of power.
Meanwhile, the rival startup widely seen as the frontrunner to build America’s first fusion plant unveiled new deals of its own. Over at the CES 2026 electronics show in Las Vegas on Tuesday, Commonwealth Fusion Systems — which analysts say is taking a simpler and more straightforward pathway to commercializing fusion power than TAE — touted a new deal with microchip giant Nvidia and told the crowd at the conference that it had installed the first magnet at its pilot reactor, TechCrunch reported.
Scott Wiener, the California state senator making a bid for Representative Nancy Pelosi’s long-held House seat, introduced two new bills he said were designed to ease rising energy costs. The first bill is meant to “get rid of a bunch of that red tape” that makes installing a heat pump expensive and challenging in the state, the Democrat explained in a video posted on Bluesky. The second piece of legislation would clear the way for renters to install small, plug-in solar panels on apartment balconies. “Right now, in California, it is way, way, way too hard, if not impossible, to install these kinds of units,” Wiener said. “We have to make energy more affordable for people.”
Sunrun is forming a new joint venture with the green infrastructure investor HASI to finance deployment of at least 300 megawatts of solar across what the companies billed as “more than 40,000 home power plants across the country.” As part of the deal, which closed last month, HASI will invest $500 million over an 18-month period into the new company, allowing the nation’s largest solar installer to “retain a significant long-term ownership position” in the projects. As I reported exclusively for Heatmap in October, a recent analysis by the nonprofit Permit Power, which advocates for easing red tape on rooftop solar, found that the cost of solar panels in the U.S. was far higher than in Australia or Germany due to bureaucratic rules. The HASI investment will help bring down costs for Sunrun directly as it installs more panels.
Total U.S. utility-scale solar installations for 2025 were on track last month to beat the previous year, as I reported in this newsletter. But the phaseout of federal tax credits next year is set to dim the industry somewhat as projects race to start construction before the expiration date.
In another session at CES 2026, the electric transportation company Donut Labs claimed it’s made an affordable, energy-dense solid-state battery that’s powering a new motorcycle and charges in just five minutes. The startup hasn’t yet produced any independent verification of those promises. But the company is known for what InsideEVs called its “sci-fi in-wheel electric motor” for its bikes.
Deep Fission says that building small reactors underground is both safer and cheaper. Others have their doubts.
In 1981, two years after the accident at Three Mile Island sent fears over the potential risks of atomic energy skyrocketing, Westinghouse looked into what it would take to build a reactor 2,100 feet underground, insulating its radioactive material in an envelope of dirt. The United States’ leading reactor developer wasn’t responsible for the plant that partially melted down in Pennsylvania, but the company was grappling with new regulations that came as a result of the incident. The concept went nowhere.
More than a decade later, the esteemed nuclear physicist Edward Teller resurfaced the idea in a 1995 paper that once again attracted little actual interest from the industry — that is, until 2006, when Lowell Wood, a physicist at the Lawrence Livermore National Laboratory, proposed building an underground reactor to Bill Gates, who considered but ultimately abandoned the design at his nuclear startup, TerraPower.
Now, at last, one company is working to make buried reactors a reality.
Deep Fission proposes digging boreholes 30 inches in diameter and about a mile deep to house each of its 15-megawatt reactors. And it’s making progress. In August, the Department of Energy selected Deep Fission as one of the 10 companies enrolled in the agency’s new reactor pilot program, meant to help next-generation startups split their first atoms by July. In September, the company announced a $30 million reverse merger deal with a blank check firm to make its stock market debut on the lesser-known exchange OTCQB. Last month, Deep Fission chose an industrial park in a rural stretch of southeastern Kansas as the site of its first power plant.
Based in Berkeley, California, the one-time hub of the West Coast’s fading anti-nuclear movement, the company says its design is meant to save money on above-ground infrastructure by letting geology do the work of adding “layers of natural containment” to “enhance safety.” By eliminating much of the expensive concrete and steel dome that encases a conventional reactor on the surface, the startup estimates “that our approach removes up to 80% of the construction cost, one of the biggest barriers for nuclear, and enables operation within six months of breaking ground.”
“The primary benefit of placing a reactor a mile deep is cost and speed,” Chloe Frader, Deep Fission’s vice president of strategic affairs, told me. “By using the natural pressure and containment of the Earth, we eliminate the need for the massive, above-ground structures that make traditional nuclear expensive and slow to build.”
“Nuclear power is already the safest energy source in the world. Period,” she said. “Our underground design doesn’t exist because nuclear is unsafe, it exists because we can make something that is already extremely safe even safer, simpler, and more affordable.”
But gaining government recognition, going public, and picking a location for a first power plant may prove the easy part. Convincing others in the industry that its concept is a radical plan to cut construction costs, rather than a way to allay the public’s often-outsize fear of a meltdown, has turned out to be difficult, to say nothing of what actually building its reactors will entail.
Despite the company’s recent progress, I struggled to find anyone who didn’t have a financial stake in Deep Fission willing to make the case for its buried reactors.
Deep Fission is “solving a problem that doesn't actually exist,” Seth Grae, the chief executive of the nuclear fuel company Lightbridge, told me. In the nearly seven decades since fission started producing commercial electrons on the U.S. grid, no confirmed death has ever come from radiation at a nuclear power station.
“You’re trying to solve a political problem that has literally never hurt anyone in the entire history of our country since this industry started,” he said. “You’re also making your reactors more expensive. In nuclear, as in a lot of other projects, when you build tall or dig deep or lift big and heavy, those steps make the projects much more expensive.”
Frader told me that subterranean rock structures would serve “as natural containment, which also enhances safety.” That’s true to some extent. Making use of existing formations “could simplify surface infrastructure and streamline construction,” Leslie Dewan, a nuclear engineer who previously led a next-generation small modular reactor startup, told IEEE Spectrum.
If everything pans out, that could justify Deep Fission’s estimate that its levelized cost of electricity — not the most dependable metric, but one frequently used by solar and wind advocates — would be between $50 and $70 per megawatt-hour, lower than other SMR developers’ projections. But that’s only if a lot of things go right.
“A design that relies on the surrounding geology for safety and containment needs to demonstrate a deep understanding of subsurface behavior, including the stability of the rock formations, groundwater movement, heat transfer, and long-term site stability,” Dewan said. “There are also operational considerations around monitoring, access, and decommissioning. But none of these are necessarily showstoppers: They’re all areas that can be addressed through rigorous engineering and thoughtful planning.”
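For readers unfamiliar with the metric, here is a minimal sketch of how a levelized-cost figure like the $50 to $70 per megawatt-hour estimate above is typically assembled. Every input below is a hypothetical placeholder chosen only to show the arithmetic; none of these numbers comes from Deep Fission or the experts quoted here.

```python
def lcoe_usd_per_mwh(capex_per_kw, fixed_om_per_kw_yr, fuel_per_mwh,
                     discount_rate, lifetime_yr, capacity_factor):
    """Textbook levelized cost of electricity: annualized capital plus
    operating costs, divided by the energy generated each year."""
    # Capital recovery factor spreads the upfront cost over the plant's life.
    crf = (discount_rate * (1 + discount_rate) ** lifetime_yr) / \
          ((1 + discount_rate) ** lifetime_yr - 1)
    annual_cost_per_kw = capex_per_kw * crf + fixed_om_per_kw_yr  # $ per kW-year
    annual_mwh_per_kw = 8760 * capacity_factor / 1000             # MWh per kW-year
    return annual_cost_per_kw / annual_mwh_per_kw + fuel_per_mwh

# Illustrative assumptions only: $3,500/kW built cost, $100/kW-yr fixed O&M,
# $10/MWh fuel, a 7% discount rate, a 30-year life, and a 90% capacity factor.
print(round(lcoe_usd_per_mwh(3500, 100, 10, 0.07, 30, 0.9)))  # ~58 $/MWh
```

The sketch mostly shows how sensitive the answer is to the built cost per kilowatt: double that input to the $7,000-per-kilowatt range and the result climbs well past the top of the company’s claimed band.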
As anyone in the geothermal industry can tell you, digging a borehole costs a lot of money. Drilling equipment comes at a high price. Underground geology complicates the job of boring a straight route one mile down. And not every hole that’s started ends up panning out, meaning the process must be repeated over and over again.
For Deep Fission, drilling lots of holes is part of the process. Given the size of its reactor, to reach a gigawatt — the output of one of Westinghouse’s flagship AP1000s, the only new type of commercial reactor successfully built from scratch in the U.S. this century — Deep Fission would need to build 67 of its own microreactors. That’s a lot of digging, considering that the company’s boreholes are on average nearly three times as wide as those drilled for natural gas or geothermal wells.
The company isn’t just distinguished by its unique approach. Deep Fission has a sister company, Deep Isolation, that proposes burying spent nuclear fuel in boreholes. In April, the two startups officially partnered in a deal that “enables Deep Fission to offer an end-to-end solution that includes both energy generation and long-term waste management.”
In theory, that combination could offer the company a greater social license among environmental skeptics who take issue with the waste generated from a nuclear plant.
In 1982, Congress passed a landmark law making the federal government responsible for the disposal of all spent fuel and high-level radioactive waste in the country. The plan centered on building a giant repository to permanently entomb the material where it could remain undisturbed for thousands of years. The law designated Yucca Mountain, a rural site in southwestern Nevada near the California border, as the exclusive location for the debut repository.
Construction took years to start. After initial work got underway during the Bush administration, Obama took office and promptly slashed all funding for the effort, which was opposed by then-Senate Majority Leader Harry Reid of Nevada; the nonpartisan Government Accountability Office clocked the move as a purely political decision. Regardless of the motivation, the cancellation threw the U.S. waste disposal strategy into limbo, because the law requires the federal government to complete Yucca Mountain before moving on to other potential storage sites. Until that law changes, the country has no path to a permanent solution for its nuclear waste, and virtually all the spent fuel accumulated over the years remains in interim storage on site at power plants.
Finland finished work on the world’s first such repository in 2024. Sweden and Canada are considering similar facilities. But in the U.S., the industry is moving beyond seeing its spent fuel as waste, as more companies look to start up a recycling industry akin to those in Russia, Japan, and France to reprocess old uranium into new pellets for new reactors. President Donald Trump has backed the effort. The energy still stored in nuclear waste just in this country is sufficient to power the U.S. for more than a century.
Even if Americans want an answer to the nuclear waste problem, there isn’t much evidence to suggest they want to see the material stored near their homes. New Mexico, for example, passed a law barring construction of an intermediate storage site in 2023. Texas attempted to do the same, but the Supreme Court found the state’s legislation to be in violation of the federal jurisdiction over waste.
Deep Fission’s reactors would be “so far removed from the biosphere” that the company seems to think the NRC will just “hand out licenses and the public won’t worry,” said Nick Touran, a veteran engineer whose consultancy, What Is Nuclear, catalogs reactor designs and documents from the industry’s history. But “the assumption that it’ll be easy and cheap to site and license this kind of facility is going to be found to be mistaken,” he told me.
The problem with nuclear power isn’t the technology, Brett Rampal, a nuclear expert at the consultancy Veriten, told me. “Nuclear has not been suffering from a technological issue. The technology works great. People do amazing things with it, from curing cancer to all kinds of almost magical energy production,” he told me. “What we need is business models and deployment models.”
Digging a 30-inch borehole a mile deep would be expensive enough, but Rampal also pointed out that lining those shafts with nuclear-grade steel and equipping them with cables would likely pencil out to a higher price than building an AP1000 — for roughly one one-hundredth of the power output.
Deep Fission insists that isn’t the case, and that the natural geology “removes the need for complex, costly pressure vessels and large engineered structures” on the surface.
“We still use steel and engineered components where necessary, but the total material requirements are a fraction of those used in a traditional large-scale plant,” Frader said.
Ultimately, burying reactors is about quieting concerns that should be debunked head on, Emmet Penney, a historian of the industry and a senior fellow at the Foundation for American Innovation, a right-leaning think tank that advocates building more reactors in the U.S., told me.
“Investors need to wake up and realize that nuclear is one of the safest power sources on the planet,” Penney said. “Otherwise, goofy companies will continue to snow them with slick slide decks about solving non-issues.”