They can be an effective wildfire prevention tool — but not always.

Once the fires stop burning in Los Angeles and the city picks itself up from the rubble, the chorus of voices asking how such a disaster could have been prevented will rise. In California, the answer to that desperate query is so often “better forestry management practices,” and in particular “more controlled burns.” But that’s not always the full story, and in the case of the historically destructive L.A. fires, many experts doubt that prescribed burns and better vegetation management would have mattered much at all.
Controlled burns are intentionally set and supervised by land managers to clear out excess fuels such as shrubs, trees, and logs to reduce wildfire risk. Many habitats also require fire to thrive, and so ensuring they burn in a controlled manner is a win-win for natural ecosystems and the man-made environment. But controlled burns also pose a series of challenges. For one, complex permitting processes and restrictions around when and where burns are allowed can deter agencies from attempting them. Community backlash is also an issue, as residents are often concerned about air quality as well as the possibility of the prescribed fires spiraling out of control. Land management agencies also worry about the liability risks of a controlled burn getting out of hand.
Many of the state’s largest and most destructive fires — including the Camp Fire in 2018, the lightning complex fires in 2020, and the Dixie Fire in 2021 — started in forests, and might well have been less severe had the state done more controlled burns. According to ProPublica, anywhere between 4.4 million and 11.8 million acres used to burn annually in prehistoric California. By 2017, overzealous fire suppression driven by regulatory barriers and short-term risk aversion had caused that number to drop to 13,000 acres. While the state has increased the amount of prescribed fire in recent years, the backlog of fuel is enormous.
But the L.A. fires didn’t start or spread in a forest. The largest blaze, in the Pacific Palisades neighborhood, ignited in a chaparral environment full of shrubs that have been growing for about 50 years. Jon Keeley, a research scientist with the U.S. Geological Survey and an adjunct professor at the University of California, Los Angeles, said that’s not enough time for this particular environment to build up an “unnatural accumulation of fuels.”
“That’s well within the historical fire frequency for that landscape,” Keeley told my colleague, Emily Pontecorvo, for her reporting on what started the fires. Generally, he said, these chaparral environments should burn every 30 to 130 years, with coastal areas like Pacific Palisades falling on the longer end of that spectrum. “Fuels are not really the issue in these big fires — it’s the extreme winds. You can do prescription burning in chaparral and have essentially no impact on Santa Ana wind-driven fires.”
We still don’t know what ignited the L.A. fires, and thus whether a human, a utility, or some other source is to blame. But the factors that led to the blazes — wet periods that allowed for abundant vegetation growth, followed by drought and intensely powerful winds — are simply a perilously bad combination. Firebreaks, strips of land where vegetation is reduced or removed, can often prove helpful, and they do exist in the L.A. hillsides. But as Matthew Hurteau, a professor at the University of New Mexico and director of the Center for Fire Resilient Ecosystems and Society, told me bluntly, “When you have 100-mile-an-hour winds pushing fire, there’s not a hell of a lot that’s going to stop it.”
Hurteau told me that he thinks of the primary drivers of destructive fires as a triangle, with fuels, climate, and the built environment representing the three points. “We’re definitely on the built environment, climate side of that triangle for these particular fires around Los Angeles,” Hurteau explained, meaning that the wildland-urban interface combined with drought and winds are the primary culprits. But in more heavily forested, mountainous areas of Northern California, “you get the climate and fuels side of the triangle,” Hurteau said.
Embers can travel impressive distances in the wind, as evidenced by footage of past fires jumping expansive freeways in Southern California. So, as Hurteau put it, “short of mowing whole hillsides down to nothing and keeping them that way,” there’s little vegetation management work to be done at the wildland-urban interface, where houses bump up against undeveloped lands.
Not everyone agrees, though. When I spoke to Susan Prichard, a fire ecologist and research scientist at the University of Washington School of Environmental and Forest Sciences, she told me that while prescribed burns close to suburban areas can be contentious and challenging, citizens can do a lot on their own to manage fuel risk. “Neighborhoods can come together and do the appropriate fuel reduction in and around their homes, and that makes a huge difference in wildfires,” she told me. “Landscaping in and around homes matters, even if you have 100-mile-an-hour winds with a lot of embers.”
Prichard recommends residents work with their neighbors to remove burnable vegetation and organic waste, and to get rid of so-called “ember traps” such as double fencing that can route fires straight to homes. Prichard pointed to research by Crystal Kolden, a “pyrogeographer” and associate professor at the University of California Merced, whose work focuses on understanding wildfire intersections with the human environment. Kolden has argued that proper vegetation management could have greatly lessened the impact of the L.A. fires. As she recently wrote on Bluesky, “These places will see fire again. I have no doubt. But I also know that you can rebuild and manage the land so that next time the houses won’t burn down. I’ve seen it work.”
Keeley pointed to the 2017 Thomas Fire in Ventura and Santa Barbara Counties, however, as an example of the futility of firebreaks and prescribed burns in extreme situations. That fire also ignited outside of what’s normally considered fire season, in December. “There were thousands of acres that had been prescribed burned near the eastern edge of that fire perimeter in the decade prior to ignition,” Keeley explained to Emily. “Once that fire was ignited, the winds were so powerful it just blew the embers right across the prescribed burn area and resulted in one of the largest wildfires that we’ve had in Southern California.”
Kolden, however, reads the Thomas Fire as a more optimistic story. As she wrote in a case report on the fire published in 2019, “Despite the extreme wind conditions and interviewee estimates of potentially hundreds of homes being consumed, only seven primary residences were destroyed by the Thomas Fire, and firefighters indicated that pre-fire mitigation activities played a clear, central role in the outcomes observed.” While the paper didn’t focus on controlled burns, the mitigation activities discussed included reducing vegetation around homes and roads, as well as common-sense actions such as increasing community planning and preparedness, public education around fire safety, and, arguably most important, adopting and enforcing fire-resistant building codes.
So while blaming decades of forestry mismanagement for major fires is frequently accurate, in Southern California the villains in this narrative can be trickier to pin down. Is it the fault of the winds? The droughts? The humans who want to live in beautiful but acutely fire-prone areas? The planning agencies that allow people to fulfill those risky dreams?
Prichard still maintains that counties and the state government can be doing a whole lot more to encourage fuel reduction. “That might not be prescribed burning, that might actually be ongoing mastication of some of the really big chaparral, so that it’s not possible for really tall, developed, even senescent vegetation — meaning having a lot of dead material in it — to burn that big right next to homes.”
From Hurteau’s perspective, though, far and away the most effective solution would be simply building structures to be much more fire-resilient than they are today. “Society has chosen to build into a very flammable environment,” as Hurteau put it. California’s population has increased over 160% since the 1950s, far outpacing the country overall and pushing development further and further out into areas that border forests, chaparral, and grasslands. “As people rebuild after what’s going to be great tragedy, how do you re-envision the built environment so that this becomes less likely to occur in the future?”
On China’s H2 breakthrough, vehicle-to-grid charging, and USA Rare Earth’s move into Brazil
Current conditions: In the Atlantic, Tropical Storm Fernand is heading northward toward Bermuda • In the Pacific, Tropical Storm Juliette is active about 520 miles southwest of Baja California, with winds of up to 65 miles per hour • Temperatures are surging past 100 degrees Fahrenheit in South Korea.
Nearly two weeks ago, Vineyard Wind sued one of its suppliers, GE Vernova, to keep the industrial giant from exiting the offshore wind project off the coast of Nantucket in Massachusetts. Now a U.S. court has ordered GE Vernova to finish the job, saying it would be “fanciful” to imagine a new contractor could complete the installation. GE Vernova had argued that Vineyard Wind — a 50/50 joint venture between the European power giant Avangrid and Copenhagen Infrastructure Partners — owed it $300 million for work already performed. But Vineyard Wind countered that the manufacturer remains on the hook for about $545 million to make up for a catastrophic turbine blade collapse in 2024, according to WBUR. “The project is at a critical phase and the loss of [Vineyard Wind]’s principal contractor would set the project back immeasurably,” the Suffolk County Superior Court Judge Peter Krupp wrote in his decision, repeatedly using the name of GE Vernova’s renewables subsidiary. “To pretend that [Vineyard Wind] could go out and hire one or more contractors to finish the installation and troubleshoot and modify [GE Renewables’] proprietary design without [GE Renewables’] specialized knowledge is fanciful.”
Charlotte DeWald fears the world is sleepwalking into tipping points beyond which the Earth’s natural carbon cycles will render climate change uncontrollable. If, for example, the Arctic is left with no sea ice sometime in the 2030s, by the time we realize what that means for global weather and agricultural systems, it may be too late to try anything drastic to buy us more time. Much of the discourse around what to do concerns a specific kind of geoengineering called stratospheric aerosol injection — essentially, spraying reflective particles into the sky to block some of the sun’s heat before it can be trapped by the increasingly thick layer of greenhouse gases that prevents that energy from naturally radiating back into space. That’s something DeWald, a climate scientist by training and former Pacific Northwest National Laboratory researcher who specialized in modeling aerosol-cloud interactions, knows all about. But her approach is different, using a technology known as mixed-phase cloud thinning, a process similar to cloud seeding. “The idea is that you could dissipate clouds over the Arctic to release heat from the surface to, for example, increase sea ice extent or thickness or integrity,” she told me. “There’s some early modeling that suggests that it could yield significant cooling over the Arctic Ocean.”
With all that context, you can now appreciate the exclusive bit of news I have for you this morning: DeWald is launching a new nonprofit called the Arctic Stabilization Initiative to “evaluate whether targeted interventions can slow dangerous” warming near the Earth’s northern pole. So far, ASI has raised $6.5 million in philanthropic funding toward a five-year budget goal of $55 million to study whether MCT, as mixed-phase cloud thinning is known, could help save the Arctic. The nonprofit has an advisory board stacked with veteran Arctic scientists and has put together a “stage-gated” research plan with offramps in case early modeling suggests MCT won’t work or could cause undue environmental damage. The project also has an eye toward engaging with Indigenous peoples and “will ground all future work in respect for Indigenous sovereignty, before any field-based research activity is pursued.” The statement recalls Harvard University’s SCoPEx trial, a would-be outdoor experiment in spraying reflective aerosols into the atmosphere over Sweden that ran aground after researchers initially failed to consult local stakeholders, and a body representing the Indigenous Saami people in the northern reaches of the Nordic nations came out against the testing. (By repeatedly invoking ASI’s nonprofit status, DeWald also seemed to draw a contrast with the for-profit stratospheric aerosol injection startup Stardust Solutions, which Heatmap’s Robinson Meyer reported last year had raised $60 million.) “We are continuing to move toward critical planetary thresholds without a viable plan for things like tipping points,” DeWald said. “That was the inflection point for me.”

China just took yet another step closer to energy independence, despite its relatively tiny domestic reserves of oil and gas, by kicking off the world’s largest project to blend hydrogen into the natural gas system. As part of the experiment, roughly 100,000 households in the center of Weifang, a prefecture-level city in eastern Shandong province between Beijing and Shanghai, will receive a blend of up to 10% hydrogen through existing gas pipes. The pilot’s size alone “smashes” the world record, according to Hydrogen Insight. Whether that’s meaningful from a climate perspective depends on how you look at things. Only a fraction of 1% of China’s hydrogen comes from electrolyzer plants powered by clean renewable or nuclear electricity. But the People’s Republic still produces more green hydrogen than any other nation. Last year, the central government made cleaning up heavy industry with green hydrogen a higher priority — a goal that’s been supercharged by the war in Iran. Therein lies the biggest motivator now: While China relies on imports for natural gas, swapping out more of that fuel for domestically generated hydrogen allows Beijing to claim the moral high ground on emissions and air pollution — all while becoming more energy independent.
Meanwhile, China’s container ships are the latest sector to experiment with going electric and forgoing the need for costly, dirty bunker fuel. A 10,000-ton fully electric cargo vessel capable of carrying 742 shipping containers just started up operations in China this week, according to a video posted on X by China’s Xinhua News service.
The ability of electric vehicles to serve as distributed energy resources, charging in times of low demand and discharging back onto the grid when demand peaks, has long been a dream of EV enthusiasts and DER advocates alike. California’s PG&E utility launched a small bi-directional charging program in 2023, allowing owners of Ford F-150 Lightnings to use their trucks as home backup power, and eventually feed energy back onto the grid. The utility added a host of General Motors EVs to the program back in 2025. On Monday, it announced its latest vehicle participant: Tesla’s Cybertruck. The Tesla vehicle will be the first in the program to run on alternating current, which simplifies the equipment necessary and lowers costs for consumers, according to PG&E’s announcement.
In January, I told you about the then-latest company to benefit from President Donald Trump’s dabbling in what you might call state capitalism with American characteristics: USA Rare Earth. The vertically integrated company, which aims to mine rare earths in Texas, took big leaps forward in the past year toward building factories to turn those metals into the magnets needed for modern technologies. For now, however, the company needs ore. On Monday, USA Rare Earth announced plans to buy the Brazilian rare earth miner Serra Verde in a deal valued at $2.8 billion in cash and shares. The transaction is expected to be complete by the end of the third quarter of this year. The company pitched the move as a direct challenge to China, which dominates the processing of rare earths mined both at home and abroad. “The world has become too dependent on a single source and it’s high time to break that dependency,” USA Rare Earth CEO Barbara Humpton told CNBC’s “Squawk Box” on Monday.
As if we needed more evidence that the data center backlash is “swallowing American politics,” here’s Heatmap’s Jael Holzman with yet another data point: According to tracking from the Heatmap Pro database, fights against data centers now outnumber fights against wind farms in the U.S. That includes both onshore and offshore wind developments. “Taken together,” Jael wrote, “these numbers describe the tremendous power involved in the data center wars.”
More than 270 data centers have faced opposition across the country, compared to 258 onshore and offshore wind projects, according to a review of data collected by Heatmap Pro. Data center battles only recently overtook wind turbines, driven by the sudden spike in backlash to data center development over the past year. It’s a sign that anger over big tech infrastructure is surging past current and historic opposition to wind.
Battles over solar projects have still occurred far more often than fights over data centers — nearly twice as many times, per the data. But in terms of megawatts, the sheer amount of data center demand that has been opposed nearly equals that of solar: more than 51 gigawatts.
All told, the power at stake in the data center wars is now comparable to the entire national fight over renewable energy. One side of the brawl is demand, the other supply. If this trend continues at this pace, it’s possible the scale of tension over data centers could one day usurp what we’ve been tracking for both solar and wind combined.
The enhanced geothermal darling is spending big on capex, but its shares will be structured more like a software company’s.
Fervo, the enhanced geothermal company that drills thousands of feet into the Earth and uses hydraulic fracturing techniques to find pockets of heat to tap for geothermal power, is going public.
The Houston-based company was founded in 2017 and has been a longtime favorite of investors, government officials, and the media (not to mention Heatmap’s hand-selected group of climate tech insiders) for its promise of producing 24/7 clean power using tools, techniques, and personnel borrowed from the oil and gas industry.
After much speculation as to when it would go public, Fervo filed the registration document for its initial public offering on Friday evening. Here’s what we were able to glean about the company, its business, and the geothermal industry from the filing.
The main theme of the document, known as an S-1, is the immense potential enhanced geothermal — and, thus, Fervo — has.
The company says that its Cape Station site in Utah, where it’s currently developing its flagship power plants, has “4.3 gigawatts of capacity potential” alone. That’s more than the 3.8 gigawatts of conventional geothermal capacity currently on the grid. Enhanced geothermal technology, otherwise known as EGS, “has the potential to make geothermal generation as ubiquitous as solar generation is in the U.S. today,” the company projects. (There are about 280 gigawatts of installed solar capacity currently in the U.S., according to the Solar Energy Industries Association.) “A broader subset of our reviewed leases represents over 40 gigawatts” of capacity, the document goes on.
Like all investor pitches, the S-1 features some eye-popping “total addressable market” figures. Citing analysis by the consulting firm Rystad, the document says that if there’s a sufficient shortfall in capacity due to retiring power plants (98 gigawatts by 2035), the annual market for enhanced geothermal would be approximately $70 billion by 2035, and that this would represent some $2.1 trillion in revenue potential over 30 years.
The company is already producing 3 megawatts for the Nevada grid at its Project Red site as part of a deal with Google. It also expects to begin generating power from the Cape Station site “by late 2026,” according to the filing, and to get up to 100 megawatts “by early 2027.” In total, Fervo has “658 megawatts of binding power purchase agreements,” which it says represents “approximately $7.2 billion in potential revenue backlog.”
Beyond that, Fervo says it has 2.6 gigawatts “in advanced development,” and “over 38 gigawatts” in “early-stage development,” where it’s still doing feasibility studies to “validate and confirm the path toward commercial development.”
Fervo says that capacity at its Cape Station facility will come in at around $7,000 per kilowatt. That’s already cheaper than “traditional and small modular nuclear power,” which the Department of Energy has estimated costs $6,000 to $10,000 per kilowatt, the filing says. Fervo is aiming to get total project costs down to $3,000 per kilowatt, at which point it says it would outcompete natural gas without any of the price volatility that comes from fuel costs going up and down.
But Fervo’s upfront spending is still immense. Fervo says that it expects some $1.2 billion in capital expenditure this year, of which only $125 million is going toward the first phase of its Cape Station project, which it has said would deliver 100 megawatts of power. (Meanwhile, the $940 million it expects to spend on the second phase, which is due to be 400 megawatts, is mostly unfunded.) The company says the public offering will fund “project-level capital expenditures,” as well as land holdings and general corporate expenditures.
Google comes up some 36 times in the document, mostly in reference to the “Geothermal Framework Agreement” Fervo signed with the hyperscaler this past March. The S-1 describes the deal as a “3-gigawatt framework agreement … to advance and structure potential power offtake opportunities for current and planned data centers in both grid-connected and alternative energy solutions.” This deal, the company says, “establishes a structured process for the development of geothermal projects across specified regions of the United States,” and could involve the offtake by Google of up to 3 gigawatts of Fervo-generated electricity by the end of 2033.
What the framework is not is a power purchase agreement. One of the risk factors Fervo lists in the IPO document says, “The GFA is a non-binding agreement, and does not obligate Google to purchase power from us.” Instead, it is “a binding framework under which we may propose geothermal development projects to Google, but it does not obligate Google to accept any project, execute any power purchase agreement or provide us with any project financing.”
The agreement also places limits on Fervo, including from whom it can accept investment or financing. (The deal outlines a “broad category of entities defined as competitors,” which are all no-nos.) Overall, the company says, the arrangement gives Google “significant priority over our near-term development pipeline and may limit our flexibility to pursue alternative commercial, strategic, or financing arrangements that would otherwise be available to us.”
Upon going public, the company will have two classes of stock: Class A shares available to the public, and Class B shares owned by its founders, Chief Executive Officer Tim Latimer and Chief Technology Officer Jack Norbeck. These Class B shares will have 40 times the voting rights of the Class A shares and will allow Latimer and Norbeck to “collectively continue to control a significant percentage of the combined voting power of our common stock and therefore are able to control all matters submitted to our stockholders for approval.”
These arrangements are familiar from venture-backed, founder-led software companies. Alphabet and Meta are the most prominent examples of large, publicly traded companies that remain under the effective control of their founders thanks to dual-class share structures. Tesla, rather famously, does not have a dual-class share structure, which is why CEO Elon Musk convinced his board to award him more shares so that he would maintain a high degree of influence over the company.
While other technology companies such as Stripe pile up billions in revenue without any near-term prospects of going public, Fervo largely has spending to report on its income statement.
In 2025, the company reported just $138,000 in revenues with a $58 million net loss; that’s compared to a $41 million net loss in 2024. The revenues were “ancillary fees associated with rights to geothermal production at Project Red,” the company said. “This type of revenue is not expected to be significant to our long-term revenue generation, as we have not yet commenced large-scale commercial operations.”
And there’s more spending to come.
Fervo expects that the second phase of its Cape Station project will “require approximately $2.2 billion in capital expenditures through 2028,” which it hopes to pay for with project-level financing.
Fervo said it is “continuing to evaluate the effect of the OBBB” — that is, the One Big Beautiful Bill Act, which slashed or curtailed tax credits for clean energy companies — and that it wasn’t able to “reasonably” estimate the effect on its financial statements by the end of last year. The company does say, however, that it “may benefit from ITCs and PTCs (including the energy community and domestic content bonuses available under the ITC and PTC, in certain circumstances) with respect to qualifying renewable energy projects,” referring to the investment and production tax credits, which acquired a strict set of eligibility rules under OBBBA. It cautioned that the current guidance regarding tax credit eligibility is “subject to a number of uncertainties,” and that “there can be no assurance that the IRS will agree with our approach to determining eligibility for ITCs and PTCs in the event of an audit.”
The company also disclosed that earlier this month, it reached a deal with Liberty Mutual, the insurance company, “to sell and transfer tax credits generated at Cape Station Phase I,” taking advantage of a provision of the law that allows credits to be sold to other entities with tax liability rather than just harvested by investors in the project.