Five years ago, the world met the Model Y. Tesla officially unveiled its smaller crossover in March 2019 and, the next year, began to sell the car in staggering numbers. The Model Y helped Tesla tighten its grip on the electric vehicle market. By 2023 it had displaced the Toyota Corolla as the world’s best-selling car of any kind.
It’s not easy to follow up a massive success. What’s worse is having no plan at all — or being chronically unable to stick to one. That’s where Tesla seems to be amid yet another shakeup at the company.
This week, Tesla announced it would lay off 10% of its worldwide staff, while some influential leaders are leaving of their own accord. The news comes as Tesla has just announced a sales dip and prognosticators wonder whether we’re entering an “EV winter” where more buyers choose hybrids instead. Now, this is neither the first time Tesla has run into difficulty nor the first time the EV maker has commenced mass layoffs to cut costs. Somehow, though, this time feels different.
During the Model Y’s ascendance over the past half-decade, Tesla’s path to the next thing has turned into a mess of distractions and left turns. Musk became obsessed with and then purchased Twitter, a boondoggle of a deal that clearly distracted him from his car company. The oft-touted Roadster supercar has yet to materialize.
More importantly, the long-promised $25,000 car appears to be dead (or at least tabled indefinitely). Musk had initially indicated the affordable Tesla would finally arrive next year, leaving the company to endure a single gap year without a new vehicle. But Reuters reported that Tesla has killed the idea in part because of competition overseas from ultra-cheap Chinese EVs, and while Musk responded to the report by saying Reuters was “lying,” he’s done nothing to indicate the “Model 2” is anything but dead.
Meanwhile, the only new-ish vehicle in the Tesla lineup, the Cybertruck, is stuck. Tesla stopped deliveries of the steel beast for an unknown issue, rumored to be related to sticky accelerator problems, and shortened production shifts at the factory. And while it’s possible to squint and see a case for the Cybertruck, I’ve written here numerous times that Tesla’s big mistake wasn’t putting that eyesore on the road. Instead, it was focusing the company’s attention on something so adolescent and absurd at a moment when it could have tightened its grip on the EV market, and given American EV drivers some interesting new options, by rolling out new cars that look more like something the average American would want to buy.
So what is Tesla up to? In a follow-up tweet after he attacked Reuters, Musk suddenly announced that he would reveal the company’s “robotaxi” on August 8. This would be Tesla’s completely self-driven vehicle. Musk’s favorite claim about the car is that it would earn its owners passive income by driving itself around, picking up and dropping off passengers as a kind of dystopian Uber.
The dream certainly fits in with Musk’s oeuvre. The CEO clearly still sees Tesla as a lean startup that moves fast and breaks things, not an established car company that would do something so expected and bland as building a perfectly acceptable three-row family crossover to compete with the Kia EV9. Compare that to the way other companies born of Silicon Valley began to act once they got big. Apple may have engaged in a long, fruitless dalliance with the self-driving car, but ultimately, it knows its bread is buttered by iterating on everything in the iPhone ecosystem. Facebook may have changed its name to Meta to highlight its mission to create the metaverse, but it still leaned into the revenue engines it built or acquired, like Instagram or WhatsApp.
It’s fine to tell yourself a story about who you want to be. And to give Tesla the benefit of the doubt for a moment: sure, maybe it will be the one to crack full autonomous driving. But in practical terms, that tech is not close to reality, and Tesla’s version of it has encountered its fair share of bugs and been sued over crashes.
(In the spirit of “robotaxi,” the company just offered a month-long free trial of Full Self-Driving to Tesla drivers. I will certainly not be using it with a young child in the car. The brand has also mandated that potential drivers be given a demo of FSD during test-drives, seemingly to hammer home the idea that Tesla is just a few steps away from having the car drive you home while you take a nap.)
In the meantime, you have to wonder just what Tesla is going to sell to humans who want a plain old electric car. It recently completed a refresh of the Model 3 — the new one looks a little like next year’s iPhone, the same product with a facelift and a couple of new features — and you’d expect a similar update to come to the Model Y.
It’s important to remember: Despite the ill wishes from his online haters, Musk isn’t exactly dead in the water. Tesla sold 220,000 Model 3s in America last year and nearly 400,000 Model Ys, a huge lead over competing EVs from legacy car brands. Those companies are hitting the same EV headwinds as Tesla this year, while golden child Rivian is still at least a couple of years away from selling its exciting smaller SUVs. Tesla is the established giant in electric cars, even as it looks in the mirror and sees an upstart.
Yet with the Cybertruck landing with a thud, and no obvious follow-up in the works, Tesla is in danger of blowing that huge lead. It may want to transform into a software company, and to earn back some of Musk’s Iron Man sci-fi cred by realizing the self-driving car. But at this moment, it feels a little like an electric car company that forgot it makes cars.
A counter-proposal for the country’s energy future.
American electricity consumption is growing for the first time in generations. And though low-carbon technologies such as solar and wind have scaled impressively over the past decade, many observers are concerned that all this new demand will provide “a lifeline for more fossil fuel production,” as Senator Martin Heinrich put it.
In response, a few policy entrepreneurs have proposed novel regulations known as “additionality” requirements to handle new sources of electric load. First suggested for electrolytic hydrogen, additionality standards would require that subsidized hydrogen producers source their electricity directly from newly built low-carbon power plants; in a Heatmap piece from September, Brian Deese and Lisa Hansmann proposed similar requirements for new artificial intelligence. And while AI data centers were their focus, the two argued that additionality “is a model that can be extended to address other sectors facing growing energy demand.”
There is some merit to additionality standards, particularly for commercial customers seeking to reduce their emissions profile. But we should be skeptical of writing these requirements into policy. Strict federal additionality regulations will dampen investment in new industries and electrification, reduce the efficiency of the electrical grid through the balkanization of supply and demand, and could become weapons as rotating government officials impose their views on which sources of demand or supply are eligible for the standards. The grid and the nation need a regulatory framework for energy abundance, not burdensome additionality rules.
After decades of end-use efficiency improvements, offshoring of manufacturing, and shifts toward less material-intensive economies, a confluence of emerging factors is pushing electricity demand back up again. For one, the nation is electrifying personal vehicles and home heating, and may do the same for industrial processes like steel production in the not-too-distant future, sparked by a combination of policy and commercial investment. Hydrogen, which has long been a marginal fuel, is attracting substantial interest. And technological innovation is leading to whole new sources of electric load — compute-hungry artificial intelligence being the most immediate example, but also large-scale critical minerals refining, indoor agriculture like alternative protein cultivation and aquaculture, and so on.
In recent years, clean energy has seemed to be on an unstoppable path toward dominating the power sector. Coal-fired generation has been in terminal decline in the United States as natural gas power plants and solar and wind farms have become more competitive. Flexible gas generation, likewise, is increasingly crowded out by renewables when the wind is blowing and the sun shining. These trends persisted in the context of stable electricity load. But even as deployment accelerates, low-carbon electricity supply may not be able to keep up with the surprisingly robust growth in demand. The most obvious — though not the exclusive — way for utilities and large corporates to meet that demand is often with new or existing natural gas capacity. Even a few coal plants have delayed retirement, reportedly in response to rising demand and reliability concerns.
Given the durable competitiveness of coal and especially natural gas, some form of additionality requirement might make sense for hydrogen production in particular, since hydrogen is not just a nascent form of electric load but a novel fuel in its own right. Simply installing an electrolyzer at an existing coal or natural gas plant could produce hydrogen that, from a lifecycle perspective, would result in higher carbon emissions, even if it displaces fossil fuels like gas or oil in final consumption. Even so, many experts caution that overly strict additionality standards for hydrogen at this stage are overkill, and may smother the industry in its crib.
Likewise, large corporate entities and electricity customers adopting additionality requirements for their own operations can bolster investment in so-called “clean firm” generation like nuclear, geothermal, and fossil fuels with carbon capture. In just the past month, Google announced plans to back the construction of new small nuclear reactors, and Microsoft announced plans to purchase electricity for new data centers from the shuttered Three Mile Island power plant, the plant made famous by the 1979 meltdown but which only closed down in 2019. Three Mile Island’s $100-per-megawatt-hour price tag would have been unthinkable just a few years ago but is newly attractive.
Notice the problem Microsoft is trying to solve here: a lack of abundant, reliable electricity generation. Outdated technology licensing, onerous environmental permitting processes, and other regulatory barriers are obstructing the deployment of renewables, advanced nuclear energy, new enhanced geothermal technologies, and other low-carbon sources. Additionality fixes none of these issues. Of course, Deese and Hansmann propose “a dedicated fast-track approval process” for verifiably additional low-carbon generation supplying new sources of AI load. Yet this should be the central effort, not an after-the-fact add-on. The back and forth over additionality rules for the clean hydrogen tax credit is a case in point. The rules for the tax credit will (likely) be finalized by January, but lawsuits already loom over them. Expanding this already contentious requirement to broader use cases will only invite more conflict, without solving the actual shortage that data center companies care about. Conversations about additionality are a distraction and misplace the energies of policymakers and staff.
Substituting one regulatory thicket for another is a recipe for stasis. Instead of adding more red tape, we should be working to cut through it, fast-tracking the energy transition and fostering abundance.
With such broad requirements, what’s to stop future administrations from expanding them to cover electric vehicle charging, electric arc furnace steelmaking, alternative protein production, or any politically disfavored source of new demand? Could a second Trump Administration use additionality to punish political enemies in the tech industry? Could a Harris Administration do the same? What if a future administration maintained additionality standards for new sources of load, but required that the electricity come from fossil fuels instead of low-carbon sources?
Zero-sum regulatory contracts between sources of electricity supply and demand are not simply at risk of becoming a tool for handing out favors on a partisan basis — they already are one. Two pieces of model legislation proposed at the July meeting of the American Legislative Exchange Council, an organization of conservative state legislators that collaborate to write off-the-shelf legislative measures, would require public utility commissions to prioritize dispatchable generation and formally discourage intermittent renewable sources like solar and wind. One of the proposals suggests leaning on state attorneys general to extend the lifespans of coal plants threatened with retirement.
These proposals did not move forward this year, but it is unlikely that the motivating force behind them is exhausted. And whatever one thinks of the relative merits of intermittent versus firm generation, ALEC’s proposals demonstrate just how easily gamed regulations like additionality could be and the risks of relying on administrative discretion instead of universal, pragmatic rules.
This is not how the electric grid is supposed to work. The grid is, if not an according-to-Hoyle public good, a shared public resource, providing essential services to customers large and small. Homeowners don’t have to sign additionality contracts with suppliers when they buy an electric car or replace their gas furnace with an electric heat pump. Everyone understands that such requirements would slow the pace of electrification and investment in new industries. The same holds for corporate customers and novel sources of load.
The real problem facing the AI, hydrogen, nuclear, geothermal, and renewables industries is an inability to build. There are more than enough clean generators queueing to enter the system — 2.6 terawatts at last count, according to the Lawrence Berkeley National Laboratory. The unfortunate reality, however, is that just one in five of these projects will make it through — and those represent just 14% of the capacity waiting to connect. Still, this totals about 360 gigawatts of new energy generation over the next few years, much more than the predicted demand from AI data centers. Obstacles to technology licensing, permitting, interconnection, and transmission are the key bottlenecks here.
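To put those queue figures together, here is a back-of-the-envelope check using only the numbers quoted above (not Lawrence Berkeley National Laboratory’s own methodology):

```python
# Back-of-the-envelope check on the interconnection-queue figures cited above.
# Inputs are the quoted numbers: a 2.6 TW queue, of which projects representing
# roughly 14% of capacity are expected to actually get built.
queue_tw = 2.6            # total capacity waiting to connect, in terawatts
completion_share = 0.14   # share of queued capacity expected to reach completion

expected_gw = queue_tw * 1000 * completion_share
print(f"Expected new generation: ~{expected_gw:.0f} GW")  # ~364 GW, i.e. "about 360 gigawatts"
```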
Would forgoing additionality requirements and loosening regulatory strictures on technology licensing and permitting increase the commercial viability of new or existing fossil fuel capacity, as Deese and Hansmann warn? Perhaps, on some margin. But for the foreseeable future, the energy projects and infrastructure most burdened by regulatory requirements will be low-carbon ones. Battery, solar, and wind projects made up more than 80% of the capacity added to interconnection queues in 2023. Meanwhile, oil and gas benefit from categorical exclusions under the National Environmental Policy Act, while low-carbon technologies are subject to stricter standards (although three permitting bills recently passed the House, including one that waives these requirements for new geothermal projects).
Consider that 40% of projects supported by the Inflation Reduction Act are caught up in delays. That is $84 billion of economic activity just waiting for the paperwork to be figured out, according to the Financial Times. Additionality requirements are additional boxes to check that almost necessarily imply additional delays. Permitting reform would render them unnecessary for a cleaner future.
This underscores perhaps the most essential conflict between strict additionality requirements and clean energy abundance. Ensuring that every new policy and every new source of demand allows for absolutely zero additional fossil fuel consumption or emissions will prove counterproductive to global decarbonization in the long run. Natural gas is still reducing emissions on the margin in the United States. Over the past decade, in years with higher natural gas prices, coal generation has ticked up, indicating that the so-called “natural gas bridge” has not yet reached its terminus. Even aggressive decarbonization scenarios now expect a substantial role for natural gas over the coming decades. And in the long term, natural gas plants may prove wholly compatible with abundant, low-carbon electricity systems if next-generation carbon capture technologies prove scalable.
The United States is the world’s energy technology R&D and demonstration laboratory. If policies to prune marginal fossil fuel consumption here stall domestic investment and scaling of low-carbon technologies — as current permitting regulations already do, and proposed additionality requirements would do — then we will not only slow U.S. decarbonization, but also inhibit our ability to export affordable and scalable low-carbon technologies abroad.
The surest path to environmental progress is to speed up. For that to happen, we need processes that allow for the rapid deployment of clean energy solutions. Expediting technology licensing, fast-tracking federal infrastructure permitting, and finding opportunities for quicker and more rational interconnection should be first and foremost.
The real solution lies in building a regulatory environment where energy abundance can flourish. By clearing the path for clean energy development, we can achieve a future where energy is affordable, reliable, and abundant — a future where the United States leads in both decarbonization and economic growth. It’s time to stop adding barriers and start speeding up progress.
Daron Acemoglu and William Nordhaus have some disagreements.
This year’s Economics Nobel is not a climate prize — that happened in 2018, when Yale economist William Nordhaus won the prize for his work on modeling the effects of climate change and economic growth together, providing the intellectual basis for carbon taxation and more generally for regulating greenhouse gas emissions because of the “social cost” they impose on everyone.
Instead, this year’s prize, awarded to MIT’s Daron Acemoglu and Simon Johnson and the University of Chicago’s James Robinson, is for their work demonstrating “the importance of societal institutions for a country’s prosperity,” i.e., why some countries are rich and others are poor. To do so, the trio looked at the history of those countries’ institutions — laws, modes of government, the relationship between the state and individuals — and drew out which are conducive to wealth and which lead to poverty.
Long story short, “extractive” institutions set up to reward a narrow elite tend to hurt economic development over time, as in much of Africa, which was colonized by Europeans who didn’t actually live there. “Inclusive” institutions, by contrast, arose in the United States and Canada, where there was significantly more European migration, thus incentivizing the ruling elite to set up institutions that benefitted a broader range of (again, European) residents.
While this research rests heavily on the climate (Europeans avoided settling in African colonies because of the high rate of disease in tropical climates), it does not touch on climate change specifically. But Acemoglu especially is an incredibly wide-ranging scholar and has devoted some time to the specific questions of climate change — and in so doing has been a direct critic of Nordhaus, Stockholm’s preferred climate economist.
“Existing approaches in economics still do not provide the right framework for managing the problems that will confront us over the next several decades,” Acemoglu wrote in a 2021 essay titled “What Climate Change Requires of Economics,” referring directly to Nordhaus’s Nobel-winning work. “Although the economics discipline has evolved over time to acknowledge environmental risks and costs, it has yet to rise to the challenge of climate change. A problem as massive as this one will require a fundamental reconsideration of some of the field's most deeply held assumptions.”
His criticisms included that Nordhaus’s more gradualist approach — the latest version of his model deems a 1.5 degree Celsius warming target “infeasible” and puts the “cost vs. benefit optimal” amount of warming at 2.6 degrees Celsius above pre-industrial levels, with a carbon price that rises to $115 per ton by 2050 — ignores both the best way to reduce emissions and the risk of not doing so fast enough.
Acemoglu is far more optimistic about how policy can direct technological development and less sanguine about additional warming over and above the Paris Agreement limits. He argues that the possibility of theoretical “tipping points,” where exceeding certain climate thresholds by even a small amount may cause dramatic damages, makes the risk of such overshoot far too great.
He also took issue with the discount rate applied to spending later vs. spending now in Nordhaus’s models. The basic idea is that a dollar spent today to mitigate the effects of climate change is more valuable than one spent in 2050. But the rates Nordhaus uses — which he derives from real-world investment returns — imply that in order for spending now to be worth it later, the benefits in 2050 or 2100 must be very, very large.
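To see why the choice of discount rate does so much work here, consider a stylized present-value calculation; the rates and dollar amounts below are illustrative assumptions, not parameters from Nordhaus’s model:

```python
# Stylized present-value comparison: how much a climate benefit realized in 2100
# is worth today under different discount rates. The rates and dollar figure are
# illustrative, not taken from Nordhaus's DICE model.
def present_value(future_benefit: float, rate: float, years: int) -> float:
    """Discount a future benefit back to today at a constant annual rate."""
    return future_benefit / (1 + rate) ** years

benefit_in_2100 = 1_000_000_000   # $1 billion of avoided damages in 2100
years_ahead = 75                  # roughly 2025 to 2100

for rate in (0.015, 0.03, 0.05):  # a low "social" rate vs. market-like rates
    pv = present_value(benefit_in_2100, rate, years_ahead)
    print(f"rate {rate:.1%}: worth ~${pv:,.0f} today")
# At 1.5% the future billion is worth roughly $330 million today; at 5%,
# only about $26 million -- which is why higher, market-derived rates demand
# enormous future benefits before present-day spending looks "worth it."
```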
“There is a plausible economic (and philosophical) case to be made for why future essential public goods should be valued differently than private goods or other types of public consumption,” Acemoglu wrote in 2021, arguing that discount rates derived from investment returns, like the ones Nordhaus uses, might not be the best guide to public policy.
So what does the latest Nobel laureate want instead? Well, something like what the United States has been doing the past few years.
Accounting for the economic benefits of domestic or “endogenous” technological development, Acemoglu’s research finds that “the transition to cleaner energy is much more important than simply reducing energy consumption, and that technological interventions need to be redirected far more aggressively than they have been.” He explored how this process could work in papers he wrote over more than a decade, developing a model for this kind of directed technological change and applying it to the United States, starting as far back as 2012.
Across all his work on climate change, Acemoglu argues that a focus on pricing the “externalities” of carbon emissions — the harm emissions impose on everyone that isn’t reflected in the prices of fossil fuels — is myopic. Instead, the challenge is both restricting emissions and fostering clean technologies that can take the place of dirty ones, which have had a remarkable head start in investment.
In “The Environment and Directed Technical Change,” published in 2012 and co-written with Philippe Aghion, Leonardo Bursztyn, and David Hemous, Acemoglu argues that a mixture of carbon taxes and research subsidies could “redirect technical change and avoid an environmental disaster” by imposing a cost on dirty technology and boosting clean technology.
Such an approach would probably rest heavily on positive subsidies and encouraging clean technology and less on a carbon tax, the four write (although a carbon tax would still help to “discourage research” into polluting technologies). It would also need to happen soon.
“Directed technical change also calls for immediate and decisive action in contrast to the implications of several exogenous technology models used in previous economic analyses.”
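To make that logic concrete, here is a toy simulation in the spirit of directed technical change; it is a stylized sketch with made-up numbers, not the authors’ actual model. Profit-seeking researchers work on whichever technology pays better, so a clean-research subsidy that lasts long enough for clean technology to overtake dirty technology redirects innovation permanently, while one that ends too soon lets the economy snap back to dirty innovation.

```python
# Toy illustration of "directed technical change": researchers work on whichever
# technology (clean or dirty) currently has the higher payoff, so a temporary
# subsidy to clean research can permanently redirect innovation once clean
# productivity overtakes dirty. All numbers are illustrative assumptions; this
# is not the Acemoglu-Aghion-Bursztyn-Hemous model itself.
def simulate(subsidy_years: int, subsidy: float = 4.0,
             years: int = 60, growth: float = 0.06) -> float:
    a_clean, a_dirty = 0.3, 1.0            # dirty technology starts with a big head start
    for year in range(years):
        boost = subsidy if year < subsidy_years else 1.0
        if boost * a_clean > a_dirty:      # research flows to the more profitable sector,
            a_clean *= 1 + growth          # which then grows faster (building on itself)
        else:
            a_dirty *= 1 + growth
    return a_clean / a_dirty               # ratio > 1 means innovation has gone clean

print("no subsidy:      clean/dirty =", round(simulate(0), 2))   # stays locked into dirty
print("10-year subsidy: clean/dirty =", round(simulate(10), 2))  # too short, reverts to dirty
print("25-year subsidy: clean/dirty =", round(simulate(25), 2))  # clean overtakes and stays ahead
```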
This framework does not precisely match United States policy — we have no carbon tax — but it does somewhat approximate it. The Biden administration’s approach to climate policy centers on large-scale investments in clean technologies, whether they’re tax credits for non-carbon-emitting electricity production or financing for clean energy projects from the Loan Programs Office, combined with a suite of Environmental Protection Agency rules that are intended to reduce pollution from fossil fuel power plants (along with an actual direct fee on methane emissions).
This approach is embedded within an overall industrial policy that’s supposed to make the economy more productive — a counter-argument to the idea that climate spending is an economic drag that trades off with environmental harms in the future. Acemoglu, too, questions the idea that there’s a tradeoff between economic growth and spending to combat climate change. Not only could renewables be cheaper than fossil fuels, but “an energy transition can improve productive capacity and thus lead to an expansion of output, because transition to cleaner technologies can boost investment and the rate of technological progress,” he and his co-authors write.
Acemoglu has also weighed in on one of the more controversial questions in climate policy and economics: the shale gas boom. In a 2023 paper written with Aghion and Hemous, this time joined by Lint Barrage, he weighed the effects of the dramatic increase in domestically extracted natural gas, focusing on the importance of technological development. The Environmental Protection Agency attributes the decline in U.S. greenhouse gas emissions since 2010 in part to “the growing use of natural gas and renewables to generate electricity in place of more carbon-intensive fuels,” due to natural gas replacing coal electricity generation. While this logic has come under fire from some activists and researchers who say the government’s models underestimate methane leakage from natural gas operations, Acemoglu took a different tack.
Yes, natural gas substituting for coal reduces short-run emissions, he and his co-authors concluded, but also, “the natural gas boom discourages innovation directed at clean energy, which delays and can even permanently prevent the energy transition to zero carbon.” They backed up this assertion by pointing to a decline in the total share of patents awarded to renewable energy innovation between 2009 and 2016.
The way out is that same mix of carbon prices and technology subsidies Acemoglu has been recommending in some form since Kelly Clarkson was last on top of the charts, which “enables emission reductions in the short run, while optimal policy would ensure that the long-run green transition is not disrupted.”
If the Biden Administration’s climate policy works out, it will look something like that, and the prize will be far greater than anything given out in Stockholm.
It’s flawed, but not worthless. Here’s how you should think about it.
Starting this month, the tens of millions of Americans who browse the real-estate listings website Zillow will encounter a new type of information.
In addition to disclosing a home’s square footage, school district, and walkability score, Zillow will begin to tell users about its climate risk — the chance that a major weather or climate event will strike in the next 30 years. It will focus on the risk from five types of dangers: floods, wildfires, high winds, heat, and air quality.
The data has the potential to transform how Americans think about buying a home, especially because climate change will likely worsen many of those dangers. About 70% of Americans look at Zillow at some point during the process of buying a home, according to the company.
“Climate risks are now a critical factor in home-buying decisions,” Skylar Olsen, Zillow’s chief economist, said in a statement. “Healthy markets are ones where buyers and sellers have access to all relevant data for their decisions.”
That’s true — if the information is accurate. But can homebuyers actually trust Zillow’s climate risk data? When climate experts have looked closely at the underlying data Zillow uses to assess climate risk, they have walked away unconvinced.
Zillow’s climate risk data comes from First Street Technology, a New York-based company that uses computer models to estimate the risk that weather and climate change pose to homes and buildings. It is far and away the most prominent company focused on modeling the physical risks of climate change. (Although it was initially established as a nonprofit foundation, First Street reorganized as a for-profit company and accepted $46 million in investment earlier this year.)
But few experts believe that tools like First Street’s are capable of actually modeling the dangers of climate change at a property-by-property level. A report from a team of White House scientific advisors concluded last year that these models are of “questionable quality,” and a Bloomberg investigation found that different climate risk models could return wildly different catastrophe estimates for the same property.
Not all of First Street’s data is seen as equally suspect. Its estimates of heat and air pollution risk have generally attracted less criticism from experts. But its estimates of flooding and wildfire risk — which are the most catastrophic events for homeowners — are generally thought to be inadequate at best.
So while Zillow will soon tell you with seeming precision that a certain home has a 1.1% chance of facing a wildfire in the next 30 years, potential homebuyers should take that kind of estimate with “a lot of grains of salt,” Michael Wara, a senior research scholar at the Stanford Woods Institute for the Environment, told me.
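For a sense of what a 30-year figure like that implies year to year, here is a rough conversion between cumulative and annual probability; it assumes each year’s risk is independent and identical, a simplification that real wildfire risk does not obey:

```python
# Convert a 30-year cumulative risk estimate (like the 1.1% figure above) into an
# implied average annual probability, assuming each year's risk is independent
# and identical -- a simplification that real wildfire risk violates.
cumulative_30yr = 0.011   # a 1.1% chance over 30 years

annual = 1 - (1 - cumulative_30yr) ** (1 / 30)
print(f"Implied annual probability: {annual:.4%}")        # roughly 0.037% per year

# The reverse direction shows how quickly small annual risks compound:
annual_risk = 0.01
cumulative = 1 - (1 - annual_risk) ** 30
print(f"1% per year over 30 years: {cumulative:.1%}")     # about 26%
```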
Here’s a short guide for how to think through Zillow’s estimates of climate risk.
Neither First Street nor Zillow immediately responded to requests for comment.
Zillow has said that, when the data is available, it will tell users whether a given home has flooded or burned in a wildfire recently. (It will also say whether a home is near a source of air pollution.)
Homebuyers should take that information seriously, Madison Condon, a Boston University School of Law professor who studies climate change and financial markets, told me.
“If the house flooded in the recent past, then that should be a major red flag to you,” she said. Houses that have flooded recently are very likely to flood again, she said. Only 10 states require a home seller to disclose a flood to a potential buyer.
First Street claims that its physics-based models can identify the risk that any individual property will flood. But the ability to determine whether a given house will flood depends on having an intricate knowledge of local infrastructure, including stormwater drains and what exists on other properties, and that data does not seem to exist in anyone’s model at the moment, Condon said.
When Bloomberg compared the output of three different flooding models, including First Street’s, they agreed on results for only 5% of properties.
If you’re worried about a home’s flood risk, then contact the local government and see if you can look at a flood map or even talk to a flood manager, Condon said. Many towns and cities keep flood maps in their records or on their website that are more granular than what First Street is capable of, she said.
“The local flood manager who has walked the property will almost always have a better grasp of flood risk than the big, top-down national model,” she said.
In some cases, Zillow will recommend that a home buyer purchase federal flood insurance. That’s generally not a bad idea, Condon said, even if Zillow reaches that conclusion using national model data that has errors or mistakes.
“It simply is true that way more people should be buying flood insurance than generally think they should,” she said. “So a general overcorrection on that would be good.”
If you’re looking at buying a home in a wildfire-prone area, especially in the American West, then you should generally assume that Zillow is underestimating its wildfire risk, Wara, the Stanford researcher, told me.
That’s because computer models that estimate wildfire risk are in a fairly early stage of development and improving rapidly. Even the best academic simulations lack the kind of granular, structure-level data that would allow them to predict a property’s forward-looking wildfire risk.
That is actually a bigger problem for homebuyers than for insurance companies, he said. A home insurance company gets to decide whether to insure a property every year. If it looks at new science and concludes that a given town or structure is too risky, then it can raise its premiums or even simply decline to cover a property at all. (State Farm stopped selling home insurance policies in California last year, partly because of wildfire risk.)
But when homeowners buy a house, their lives and their wealth get locked into that property for 30 years. “Maybe your kids are going to the school district,” he said. It’s much harder to sell a home when you can’t get it covered. “You have an illiquid asset, and it’s a lot harder to move.”
That means First Street’s wildfire risk data should be taken as an “absolute minimum estimate,” Wara said. In a wildfire-prone area, “the real risk is most likely much higher” than its models say.
Over the past several years, runaway wildland fires have killed dozens of people and destroyed tens of thousands of homes in Lahaina, Hawaii; Paradise, California; and Marshall, Colorado.
But in those cases, once the fire began incinerating homes, it ceased to be a wildland fire and became a structure-to-structure fire. The fire began to leap from house to house like a book of matches, condemning entire neighborhoods to burn within minutes.
Modern computer models do an especially poor job of simulating that transition — the moment when a wildland fire becomes an urban conflagration, Wara said. Although it only happens in perhaps 0.5% of the most intense fires, those fires are responsible for destroying the most homes.
But “how that happens and how to prevent that is not well understood yet,” he said. “And if they’re not well understood yet from a scientific perspective, that means it’s not in the [First Street] model.”
Nor do the best university wildfire models have good data on every individual property’s structural-level details — such as what material its walls or roof are made of — that would make it susceptible to fire.
When assessing whether your home faces wildfire risk, its structure is very important. But “you have to know what your neighbor’s houses look like, too, within about a 250-yard radius. So that’s your whole neighborhood,” Wara said. “I don’t think anyone has that data.”
A similar principle goes for thinking about flood risk, Condon said. Your home might not flood, she said, but it also matters whether the roads to your house are still driveable or whether the power lines fail. “It’s not particularly useful to have a flood-resilient home if your whole neighborhood gets washed out,” she said.
Experts agree that the most important interventions to discourage wildfire — or, for that matter, floods — have to happen at the community level. Although few communities are doing prescribed burns or fuel reduction programs right now, some are, Wara said.
But because nobody is collecting data about those programs, national risk models like First Street’s would not factor those programs into an area’s wildfire risk, he said. (In the rare case that a government is clearing fuel or doing a prescribed burn around a town, wildfire risk there might actually be lower than Zillow says, Wara added.)
Going forward, figuring out a property’s climate risk — much like pushing for community-level resilience investment — shouldn’t be left up to individuals, Condon said.
The state of California is investing in a public wildfire catastrophe model so that it can figure out which homes and towns face the highest risk. She said that Fannie Mae and Freddie Mac, the federal entities that buy home mortgages, could invest in their own internal climate-risk assessments to build the public’s capacity to understand climate risk.
“I would advocate for this not to be an every-man-for-himself, every-consumer-has-to-make-a-decision situation,” Condon said.