
Adaptation

This New Wildfire Risk Model Has No Secrets

CarbonPlan has a new tool to measure climate risk that comes with full transparency.


On a warming planet, knowing whether the home you’re about to invest your life savings in is at risk of being wiped out by a wildfire or drowned in a flood becomes paramount. And yet public data is almost nonexistent. While private companies offer property-level climate risk assessments — usually for a fee — it’s hard to know which to trust or how they should be used. Companies feed different datasets into their models and make different assumptions, and often don’t share all the details. The models have been shown to predict disparate outcomes for the same locations.

For a measure of the gap between where climate risk models are and where consumers want them to be, look no further than Zillow. The real estate website added a “climate risk” section to its property listings in 2024 in response to customer demand, only to axe the feature a year later at the behest of an industry group that questioned the accuracy of its risk ratings.

Now, however, a new tool that assesses wildfire risk for every building in the United States aims to advance the field through total transparency. The nonprofit research group CarbonPlan launched the free, user-friendly app called Open Climate Risk on Tuesday. It allows anyone to enter an address and view a wildfire risk score, on a scale of zero to 10, along with an explanation of how it was calculated. The underlying methodology, data, and code are all public. It’s the first fully open platform of its kind, according to CarbonPlan.

“Right now, the way science works in the climate risk space is that every model is independently developed at different companies, and we essentially have no idea what’s happening in them. We have no idea if they’re any good,” Oriana Chegwidden, a research scientist at CarbonPlan who led the creation of the tool, told me. “Our hope is that by opening this up, people will be able to start contributing, to help us learn how we can do it better.” That might mean critiquing CarbonPlan’s methods or code, for example, or re-running the model with additional data.

The score itself doesn’t tell you much other than the relative risk between one building and another. But the platform also breaks out the two inputs behind it: burn probability, or the likelihood a building will catch fire in a given year, and “conditional risk,” an estimate of how much of the building’s value would be lost if it does burn, based on projected fire intensity.
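One natural way to combine those two inputs is as an expected annual loss (burn probability times the fraction of value lost if a fire occurs), then rank buildings against one another to get a relative 0-to-10 score. The sketch below is a hypothetical illustration of that logic, not CarbonPlan's actual scoring formula, which readers can inspect in the project's open-source code:

```python
# Hypothetical sketch of turning the two published inputs into a relative
# 0-10 score. CarbonPlan's real formula lives in its open-source code and
# may differ from this illustration.

def expected_annual_loss(burn_probability: float, conditional_risk: float) -> float:
    """Expected fraction of building value lost per year.

    burn_probability: chance the building catches fire in a given year (0-1).
    conditional_risk: fraction of value lost if it does burn (0-1).
    """
    return burn_probability * conditional_risk

def relative_score(loss: float, all_losses: list[float]) -> int:
    """Map one building's expected loss to a 0-10 score by its rank
    among all buildings -- a purely relative measure, as the article notes."""
    rank = sum(1 for other in all_losses if other <= loss) / len(all_losses)
    return round(rank * 10)

# Four illustrative buildings: (burn probability, conditional risk)
buildings = [(0.001, 0.2), (0.01, 0.5), (0.03, 0.9), (0.0005, 0.1)]
losses = [expected_annual_loss(p, c) for p, c in buildings]
riskiest = expected_annual_loss(0.03, 0.9)
print(relative_score(riskiest, losses))  # the highest-risk building scores 10
```

Because the score is a rank, it only says how a building compares to others, which is why the platform also exposes the raw burn probability and conditional risk underneath it.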

The projections are largely based on a U.S. Forest Service dataset that models fire frequency on wildlands throughout the country. CarbonPlan uses additional data on wind speed and direction to predict how a given fire might spread into an urban area.

Users can toggle between risk under the “current” climate and a “future” climate, which jumps about 20 years out. They can also see the distribution of buildings across the spectrum of risk scores at various geographic scales — by state, county, census tract, or census block.

One of CarbonPlan’s hopes is to help people become more informed consumers of climate risk data by helping them understand how it’s put together and what questions they might want to ask. While its model is cruder than others on the market, the tool is explicit about the factors that are not accounted for in the results. The loss estimates are based on a generic building, for example, and do not recognize specific traits, like fire-resistant construction materials or landscaping, that could reduce a home’s vulnerability. They also don’t consider building-to-building spread. The underlying U.S. Forest Service data is also limited in that it maps vegetation across the country as it existed at the end of 2020 — any changes since then that could have reduced fire-igniting fuels, such as prescribed burns, are not incorporated.

Right now, there’s no industry standard for calculating or communicating climate risk. The Global Association of Risk Professionals recently asked 13 climate risk companies for data on floods, tropical storms, wildfires, and heat at 100 addresses to compare the outputs. The authors found “significant disparities” between estimates of vulnerability and damages at the same locations. When it came to wildfires specifically, they were unable to even compare the data, because the companies all conveyed the risk using different benchmarks.

The implications of having so many diverging methods and results extend beyond individual homebuying decisions. Insurance companies use climate risk data to set rates; publicly traded companies use it to make disclosures to investors; policymakers use it to guide community planning and investments in adaptation. Some products might be better suited to one task than another.

Katherine Mach, an environmental science and policy professor at the University of Miami, told me the next step for the field is to have more systematic reporting requirements that help people understand how accurate the data are and what types of decisions they can be used for.

“It’s almost like we need the equivalent of industry standards,” she said. “You’re going to release a climate product? Here’s what you need to clearly communicate.”

CarbonPlan collected feedback from various likely users of the tool throughout the development process, including municipal planners, climate scientists, and consumer advocates. The group also hopes to foster an “iterative cycle of community-driven model development,” spurring other researchers to inspect the data, critique it, add to it, and spin out new versions. This is common practice in other areas of climate science, like Earth system modeling and economic modeling, and has been instrumental in advancing those fields. “There’s nothing like that for climate risk right now,” Chegwidden said.

The first step will be raising more money to support further work, but the goal is to partner with outside researchers on comparative analyses and case studies. Tracy Aquino Anderson, CarbonPlan’s interim executive director, told me they have already heard from one researcher who has a fire risk dataset that could be added to the platform. The group has also been invited to present the platform to two academic climate research groups later this spring.

The problem of black box models exists not just because the field is full of private companies that don’t want to share their code. A study published earlier this month found that only 4% of the most-cited peer-reviewed climate risk studies have made their data and code public, despite journal standards that require transparency.

“When you’re working with climate data, you’re dealing with all of these uncertainties,” Adam Pollack, an assistant professor at the University of Iowa who researches flood risk and the lead author of the paper, told me. “Researchers don’t always understand all of the assumptions that are implicit in choices that they make. That’s fine — we have methods for dealing with that. We do model intercomparisons, we do these synthesis studies as a field. The foundation of that is openness and reusability.”

Though he was not involved in the CarbonPlan project, he said it was exactly what his paper was calling for. For example, CarbonPlan’s “future” calculations are based on an extreme warming scenario that has become controversial among climate scientists. CarbonPlan didn’t choose this scenario — it’s what the Forest Service’s dataset used, and that was the only off-the-shelf data available for the entire United States. But because the underlying code is open-source, critics are free to swap it out for other data they may have access to.

“That’s what’s so great about this,” Pollack said. “People who have different values, assumptions, and expertise, can get new estimates and build a shared understanding.”
