AI has already changed weather forecasting forever.
It’s been a wild few years in the typically tedious world of weather predictions. For decades, forecasts have been improving at a slow and steady pace — the standard metric is that every decade of development leads to a one-day improvement in lead time. So today, our four-day forecasts are about as accurate as a one-day forecast was 30 years ago. Whoop-de-do.
Now thanks to advances in (you guessed it) artificial intelligence, things are moving much more rapidly. AI-based weather models from tech giants such as Google DeepMind, Huawei, and Nvidia are now consistently beating the standard physics-based models for the first time. And it’s not just the big names getting into the game — earlier this year, the 27-person team at Palo Alto-based startup Windborne one-upped DeepMind to become the world’s most accurate weather forecaster.
“What we’ve seen for some metrics is just the deployment of an AI-based emulator can gain us a day in lead time relative to traditional models,” Daryl Kleist, who works on weather model development at the National Oceanic and Atmospheric Administration, told me. That is, today’s two-day forecast could be as accurate as last year’s one-day forecast.
All weather models start by taking in data about current weather conditions. But from there, how they make predictions varies wildly. Traditional weather models like the ones NOAA and the European Centre for Medium-Range Weather Forecasts use rely on complex atmospheric equations based on the laws of physics to predict future weather patterns. AI models, on the other hand, are trained on decades of prior weather data, using the past to predict what will come next.
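To make that distinction concrete, here is a minimal, purely illustrative Python sketch of the data-driven approach, assuming a toy historical archive and a single linear step in place of a deep network: it learns tomorrow's gridded state from today's without solving any physics equations. (Real systems such as DeepMind's GraphCast or Windborne's WeatherMesh are deep neural networks trained on decades of reanalysis data; nothing below reflects their actual code.)

```python
# Illustrative sketch only -- not any vendor's actual model.
# Learn a mapping from today's gridded atmospheric state to tomorrow's
# using nothing but historical pairs; no physics equations are solved.
import numpy as np

rng = np.random.default_rng(0)

n_days, n_gridpoints = 5000, 64            # toy "historical archive" (synthetic)
archive = rng.normal(size=(n_days, n_gridpoints))

X = archive[:-1]                           # state at day t
Y = archive[1:]                            # state at day t+1

# Fit one linear step X @ W ~= Y by least squares (a stand-in for training).
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def emulator_forecast(state, steps=2):
    """Roll the learned one-day step forward autoregressively."""
    for _ in range(steps):
        state = state @ W
    return state

two_day = emulator_forecast(archive[-1], steps=2)
print(two_day.shape)                       # (64,) -- a forecast grid, no PDEs solved
```

Once the mapping is learned, producing a forecast is just a handful of matrix multiplications, which is also why, as discussed later, trained AI models are far cheaper to run than traditional physics-based ones.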
Kleist told me he certainly saw AI-based weather forecasting coming, but the speed at which it’s arriving and the degree to which these models are improving have been head-spinning. “There's papers coming out in preprints almost on a bi-weekly basis. And the amount of skill they've been able to gain by fine tuning these things and taking it a step further has been shocking, frankly,” he told me.
So what changed? As the world has seen with the advent of large language models like ChatGPT, AI architecture has gotten much more powerful, period. The weather models themselves are also in a cycle of continuous improvement — as more open source weather data becomes available, models can be retrained. Plus, the cost of computing power has come way down, making it possible for a small company like Windborne to train its industry-leading model.
Founded by a team of Stanford students and graduates in 2019, Windborne used off-the-shelf Nvidia gaming GPUs to train its AI model, called WeatherMesh — something the company’s CEO and co-founder, John Dean, told me wouldn’t have been possible five years ago. The company also operates its own fleet of advanced weather balloons, which gather data from traditionally difficult-to-access areas.
Standard weather balloons without onboard navigation typically ascend too high, overinflate, and pop within a matter of hours (thus becoming environmental waste, sad!). Since it’s expensive to do launches at sea or in areas without much infrastructure, there are vast expanses of the globe where balloons aren’t gathering any data at all.
Satellites can help, of course. But because they’re so far away, they can’t provide the same degree of fidelity. With modern electronics, though, Windborne found it could create a balloon that autonomously changes altitude and navigates to its intended target by venting gas to descend and dropping ballast to ascend.
“We basically took a lot of the innovations that lead to smartphones, global satellite communications, all of the last 20 years of progress in consumer electronics and other things and applied that to balloons,” Dean told me. In the past, the electronics needed to control Windborne’s system would have been too heavy — the balloon wouldn’t have gotten off the ground. But with today’s tiny tech, they can stay aloft for up to 40 days. Eventually, the company aims to recover and reuse at least 80% of its balloons.
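That altitude control is simple in principle. Below is a hypothetical Python sketch of the vent-or-drop-ballast logic described above; the class, field names, thresholds, and control rule are all invented for illustration, since Windborne’s actual flight software is not public.

```python
# Hypothetical sketch of the altitude-targeting idea described above:
# vent lift gas to descend, drop ballast to ascend. All names and numbers
# here are invented; Windborne's real flight software is not public.
from dataclasses import dataclass

@dataclass
class Balloon:
    altitude_m: float
    ballast_kg: float
    gas_fraction: float  # remaining lift gas, 0..1

def adjust_altitude(b: Balloon, target_m: float, tolerance_m: float = 200.0) -> str:
    """Choose one control action per cycle to steer toward the target altitude."""
    error = b.altitude_m - target_m
    if error > tolerance_m and b.gas_fraction > 0.05:
        b.gas_fraction -= 0.01           # vent a little gas -> less lift -> descend
        return "vent"
    if error < -tolerance_m and b.ballast_kg > 0.1:
        b.ballast_kg -= 0.05             # drop a little ballast -> more lift -> ascend
        return "drop_ballast"
    return "hold"                        # within tolerance: ride the wind at this level

b = Balloon(altitude_m=18_000, ballast_kg=2.0, gas_fraction=0.9)
print(adjust_altitude(b, target_m=15_000))   # "vent" -> begin descending
```

The point of changing altitude is to catch wind layers blowing in different directions, which is how a balloon with no propulsion can steer itself toward a target region.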
The longer airtime allows Windborne to do more with less. While globally there are more than 1,000 conventional weather balloons launched every day, Dean told me, “We collect roughly on the order of 10% or 20% of the data that NOAA collects every day with only 100 launches per month.” In fact, NOAA is a customer of the startup — Windborne already makes millions in revenue selling its weather balloon data to various government agencies.
Now, with a potentially historic hurricane season ramping up, Windborne has the potential to provide the most accurate data on when and where a storm will make landfall.
Earlier this year, the company used WeatherMesh to run a case study on Hurricane Ian, the Category 5 storm that hit Florida in September 2022, leading to over 150 fatalities and $112 billion in damages. Using only weather data that was publicly available at the time, the company looked at how accurately its model (had it existed back then) would have tracked the hurricane.
Very accurately, it turns out. Windborne’s predictions aligned neatly with the storm’s actual path, while the National Weather Service’s model was off by hundreds of kilometers. That impressed Khosla Ventures, which led the company’s $15 million Series A funding round earlier this month. “We haven’t seen meaningful innovation in weather since The Weather Channel in the 90s. Yet it’s a $100 billion market that touches essentially every industry,” Sven Strohband, a partner and managing director at Khosla Ventures, told me via email.
With this new funding, Windborne is scaling up its fleet of balloons as it prepares to commercialize. The money will also help Windborne advance its forecasting model, though Dean told me robust data collection is ultimately what will set the company apart. “In any kind of AI industry, whoever has the top benchmark at any given time, it’s going to fluctuate,” Dean said. “What matters is the model plus the unique datasets.”
Unlike Windborne, the tech giants with AI-based weather models — including, most recently, Microsoft — aren’t gathering their own data, instead drawing solely on publicly accessible information from legacy weather agencies.
But these agencies are starting to get into the game, too. The European Centre for Medium-Range Weather Forecasts has already created its own AI-based model, the Artificial Intelligence/Integrated Forecasting System, which it runs in parallel to its traditional model. NOAA, while a bit behind, is also looking to follow suit.
“In the end, we know we can't rely on these big tech companies to just keep developing stuff in good faith to give to us for free,” Kleist told me. Right now, many of the top AI-based weather models are open source. But who knows if that will last? “It's our mission to save lives and property. And we have to figure out how to do some of this development and operationalize it from our side, ourselves,” Kleist said, explaining that NOAA is currently prototyping some of its own AI-based models.
All of these agencies are in the early stages of AI modeling, which is why you likely haven’t noticed weather predictions making a pronounced leap in accuracy as of late. It’s all still considered quite experimental. “Physical models, the pro is we know the underlying assumptions we make. We understand them. We have decades of history of developing them and using them in operational settings,” Kleist told me. AI-based models are much more of a black box, and there are questions about how well they will perform when it comes to predicting rare weather events, for which there might be little to no historical data for the model to reference.
That hesitation might not last long, though. “To me it’s fairly obvious that most of the forecasts that would actually be used by users in the future will come from machine learning models,” Peter Dueben, head of Earth systems modeling at the European Centre for Medium-Range Weather Forecasts, told me. “If you just want to get the weather forecast for the temperature in California tomorrow, then the machine learning model is typically the better choice,” he added.
That increased accuracy is going to matter a lot, not just for the average weather watcher, but also for specific industries and interest groups for whom precise predictions are paramount. “We can tailor the actual models to particular sectors, whether it's agriculture, energy, transportation,” Kleist told me, “and come up with information that's going to be at a very granular, specific level to a particular interest.” Think grid operators or renewable power generators who need to forecast demand or farmers trying to figure out the best time to irrigate their fields or harvest crops.
A major (and perhaps surprising) reason this type of customization is so easy is that once AI-based weather models are trained, they’re actually orders of magnitude cheaper and less computationally intensive to run than traditional models. All of this means, Kleist told me, that AI-based weather models are “going to be fundamentally foundational for what we do in the future, and will open up avenues to things we couldn't have imagined using our current physical-based modeling.”
Recovering from the Los Angeles wildfires will be expensive. Really expensive. Insurance analysts and banks have already produced a wide range of estimates of both what insurance companies will pay out and overall economic loss. AccuWeather has put out an eye-catching preliminary figure of $52 billion to $57 billion for economic losses, with the service’s chief meteorologist saying that the fires have the potential to “become the worst wildfire in modern California history based on the number of structures burned and economic loss.” On Thursday, J.P. Morgan doubled its previous estimate for insured losses to $20 billion, with an economic loss figure of $50 billion — about the gross domestic product of the country of Jordan.
The startlingly high loss figures from a fire that has only lasted a few days and is (relatively) limited in scope show just how distinctly devastating an urban fire can be. Enormous wildfires that cover millions of acres, like the 2023 Canadian wildfires, can spew ash and particulate matter all over the globe and burn for months, darkening skies and clogging airways in other countries. And smaller, though far deadlier, fires than those still do not exact the same financial toll.
It’s in coastal Southern California where you find large population centers in areas known to be at extreme risk of fire. And so a fire there can destroy a whole neighborhood in a few hours and put the state’s insurance system into jeopardy.
One reason why the projected economic impacts of the fires are so high is that the structures that have burned and the land those structures sit on are very valuable. Pacific Palisades, Malibu, and Santa Monica contain some of the most sought-after real estate on planet earth, with typical home prices over $2 million. Pacific Palisades itself has median home values of around $3 million, according to JPMorgan Chase.
The AccuWeather estimates put the economic damage for the Los Angeles fires at several times that of previous large, urban fires — the Maui wildfire in 2023 was estimated to cause around $14 billion of economic loss, for example — while the figure would be about a quarter to a third of the cost of a large hurricane, which tends to strike areas with millions of people across several states.
“The fires have not been contained thus far and continue to spread, implying that estimates of potential economic and insured losses are likely to increase,” the JPMorgan analysts wrote Thursday.
That level of losses would make the fires costlier in economic terms than the 2018 Butte County Camp Fire, whose insured losses of $10 billion made it California’s costliest at the time. That fire was far larger than the Los Angeles fires, spreading over 150,000 acres compared to just over 17,000 acres for the Palisades Fire and over 10,000 acres for the Eaton Fire. It also led to more than 80 deaths in the town of Paradise.
So far, around 2,000 homes have been destroyed, according to the Los Angeles Times, a fraction of the more than 19,000 structures affected by the Camp Fire. The difference in estimated losses comes from the fact that home values in Pacific Palisades are more than six times those in rural Butte County, according to JPMorgan.
While insured losses get the lion’s share of attention when it comes to the cost impacts of a natural disaster, the potential damages go far beyond the balance sheet of insurers.
For one, it’s likely that many affected homeowners did not even carry insurance, either because their insurers failed to renew their existing policies or the homeowners simply chose to go without due to the high cost of what insurance they could find. “A larger than usual portion of the losses caused by the wildfires will be uninsured,” according to Morningstar DBRS, which estimated total insured losses at more than $8 billion. Many homeowners carry insurance from California’s backup FAIR Plan, which may itself come under financial pressure, potentially leading to assessments from the state’s policyholders to bolster its ability to pay claims.
AccuWeather arrived at its economic impact figure by looking not just at losses from property damage but also wages that go unearned due to economic activity slowing down or halting in affected areas, infrastructure that needs to be repaired, supply chain issues, and transportation snarls. Even when homes and businesses aren’t destroyed, people may be unable to work due to evacuations; businesses may close due to the dispersal of their customers or inability of their suppliers to make deliveries. Smoke inhalation can lead to short-, medium-, and long-term health impacts that put a dent in overall economic activity.
The high level of insured losses, meanwhile, could mean that insurers will see less surplus and could have to pay more for reinsurance, Nancy Watkins, an actuary and wildfire expert at Milliman, told me in an email. This may mean that they would have to shed yet more policies “in order to avoid deterioration in their financial strength ratings,” just as California has been trying to lure insurers back with reforms to its dysfunctional insurance market.
The economic costs of the fire will likely be felt for years if not decades. While it would take an act of God far stronger than a fire to keep people from building homes on the slopes of the Santa Monica Mountains or off the Pacific Coast, the city that rebuilds may be smaller, more heavily fortified, and more expensive than the one that existed at the end of last year. And that’s just before the next big fire.
Suburban streets, exploding pipes, and those Santa Ana winds, for starters.
A fire needs three things to burn: heat, fuel, and oxygen. The first is important: At some point this week, for a reason we have yet to discover and may never know, a piece of flammable material in Los Angeles County got hot enough to ignite. The last is essential: The resulting fires, which have now burned nearly 29,000 acres, are fanned by exceptionally powerful and dry Santa Ana winds.
But in the critical days ahead, it is that central ingredient that will preoccupy fire managers, emergency responders, and the public, who are watching their homes — wood-framed containers full of memories, primary documents, material wealth, sentimental heirlooms — transformed into raw fuel. “Grass is one fuel model; timber is another fuel model; brushes are another — there are dozens of fuel models,” Bobbie Scopa, a veteran firefighter and author of the memoir Both Sides of the Fire Line, told me. “But when a fire goes from the wildland into the urban interface, you’re now burning houses.”
This jump from chaparral shrubland into neighborhoods has frustrated firefighters’ efforts to gain an upper hand over the L.A. County fires. In the remote wilderness, firefighters can cut fire lines with axes, pulaskis, and shovels to contain the blaze. (A fire’s “containment” describes how much of its perimeter firefighters have encircled; 25% containment means a quarter of the fire perimeter is prevented from moving forward by manmade or natural fire breaks.)
Once a fire moves into an urban community and starts spreading house to house, however, as has already happened in Santa Monica, Pasadena, and other suburbs of Los Angeles, those strategies go out the window. A fire break starves a fire by introducing a gap in its fuel; it can be a cleared strip of vegetation, a river, or even a freeway. But you can’t just hack a fire break through a neighborhood. “Now you’re having to use big fire engines and spray lots of water,” Scopa said, compared to the wildlands where “we do a lot of firefighting without water.”
Water has already proven to be a significant issue in Los Angeles, where many hydrants near Palisades, the biggest of the five fires, had already gone dry by 3:00 a.m. Wednesday. “We’re fighting a wildfire with urban water systems, and that is really challenging,” Los Angeles Department of Water and Power CEO Janisse Quiñones explained in a news conference later that same day.
LADWP said it had filled its 114 water storage tanks before the fires started, but the city’s water supply was never intended to stop a 17,000-acre fire. The hydrants are “meant to put out a two-house fire, a one-house fire, or something like that,” Faith Kearns, a water and wildfire researcher at Arizona State University, told me. Additionally, homeowners sometimes leave their sprinklers on in the hopes that it will help protect their house, or try to fight fires with their own hoses. At a certain point, the system — just like the city personnel — becomes overwhelmed by the sheer magnitude of the unfolding disaster.
Making matters worse is the wind, which restricted some of the aerial support firefighters typically employ. As gusts slowed on Thursday, retardant and water drops were able to resume, helping firefighters in their efforts. (The Eaton Fire, while still technically 0% contained because there are no established fire lines, has “significantly stopped” growing, The New York Times reports.) Still, firefighters don’t typically “paint” neighborhoods; the drops, which don’t put out fires entirely so much as suppress them enough that firefighters can fight them at close range, are a liability. Kearns, however, told me that “the winds were so high, they weren’t able to do the water drops that they normally do and that are an enormous part of all fire operations,” and that “certainly compounded the problems of the fire hydrants running dry.”
Firefighters’ priority isn’t saving structures, though. “Firefighters save lives first before they have to deal with fire,” Alexander Maranghides, a fire protection engineer at the National Institute of Standards and Technology and the author of an ongoing case study of the 2018 Camp Fire in Paradise, California, told me. That can be an enormous and time-consuming task in a dense area like suburban Los Angeles, and counterintuitively lead to more areas burning down. Speaking specifically from his conclusions about the Camp Fire, which was similarly a wildland-urban interface, or WUI, fire, Maranghides added, “It is very, very challenging because as things deteriorate — you’re talking about downed power lines, smoke obstructing visibility, and you end up with burn-overs,” when a fire moves so quickly that it overtakes people or fire crews. “And now you have to go and rescue those civilians who are caught in those burn-overs.” Sometimes, that requires firefighters to do triage — and let blocks burn to save lives.
Perhaps most ominously, the problems don’t end once the fire is out. When a house burns down, it is often the case that its water pipes burst. (This also adds to the water shortage woes during the event.) But when firefighters are simultaneously pumping water out of other parts of the system, air can be sucked down into those open water pipes. And not just any air. “We’re not talking about forest smoke, which is bad; we’re talking about WUI smoke, which is bad plus,” Maranghides said, again referring to his research in Paradise. “It’s not just wood burning; it’s wood, plastics, heavy metals, computers, cars, batteries, everything. You don’t want to be breathing it, and you don’t want it going into your water system.”
Water infrastructure can be damaged in other ways, as well. Because fires are burning “so much hotter now,” Kearns told me, contamination can occur due to melting PVC piping, which releases benzene, a carcinogen. Watersheds and reservoirs are also in danger of extended contamination, particularly once rains finally do come and wash soot, silt, debris, and potentially toxic flame retardant into nearby streams.
But that’s a problem for the future. In the meantime, Los Angeles — and lots of it — continues to burn.
“I don’t care how many resources you have; when the fires are burning like they do when we have Santa Anas, there’s so little you can do,” Scopa said. “All you can do is try to protect the people and get the people out, and try to keep your firefighters safe.”
Plus 3 more outstanding questions about this ongoing emergency.
As Los Angeles continued to battle multiple big blazes ripping through some of the most beloved (and expensive) areas of the city on Thursday, a question lingered in the background: What caused the fires in the first place?
Though fires are less common in California during this time of the year, they aren’t unheard of. In early December 2017, power lines sparked the Thomas Fire near Ventura, California, which burned through to mid-January. At the time it was the largest fire in the state since at least the 1930s. Now it’s the ninth-largest. Although that fire was in a more rural area, it ignited for some of the same reasons we’re seeing fires this week.
Read on for everything we know so far about how the fires started.
Five major fires started during the Santa Ana wind event this week; the largest and most destructive are the Palisades and Eaton fires.
Officials have not made any statements about the cause of any of the fires yet.
On Thursday morning, Edward Nordskog, a retired fire investigator from the Los Angeles Sheriff’s Department, told me it was unlikely they had even begun looking into the root of the biggest and most destructive of the fires in the Pacific Palisades. “They don't start an investigation until it's safe to go into the area where the fire started, and it just hasn't been safe until probably today,” he said.
It can take years to determine the cause of a fire. Investigators did not pinpoint the cause of the Thomas Fire until March 2019, more than a year after it started.
But Nordskog doesn’t think it will take very long this time. It’s easier to narrow down the possibilities for an urban fire because there are typically both witnesses and surveillance footage, he told me. He said the most common causes of wildfires in Los Angeles are power lines and fires started by unhoused people. They can also be caused by sparks from vehicles or equipment.
At about 27,000 acres burned, these fires are unlikely to make the charts for the largest in California history. But because they are burning in urban, densely populated, and expensive areas, they could be some of the most devastating. With an estimated 2,000 structures damaged so far, the Eaton and Palisades fires are likely to make the list for most destructive wildfire events in the state.
And they will certainly be at the top for costliest. The Palisades Fire has already been declared a likely contender for the most expensive wildfire in U.S. history. It has destroyed more than 1,000 structures in some of the most expensive zip codes in the country. Between that and the Eaton Fire, AccuWeather estimates the damages could reach $57 billion.
While we don’t know the root causes of the ignitions, several factors came together to create perfect fire conditions in Southern California this week.
First, there’s the Santa Ana winds, an annual phenomenon in Southern California, when very dry, high-pressure air gets trapped in the Great Basin and begins escaping westward through mountain passes to lower-pressure areas along the coast. Most of the time, the wind in Los Angeles blows eastward from the ocean, but during a Santa Ana event, it changes direction, picking up speed as it rushes toward the sea.
Jon Keeley, a research scientist with the US Geological Survey and an adjunct professor at the University of California, Los Angeles, told me that Santa Ana winds typically blow at maybe 30 to 40 miles per hour, while the winds this week hit upwards of 60 to 70 miles per hour. “More severe than is normal, but not unique,” he said. “We had similar severe winds in 2017 with the Thomas Fire.”
Second, Southern California is currently in the midst of extreme drought. Winter is typically a rainier season, but Los Angeles has seen less than half an inch of rain since July. That means that all the shrubland vegetation in the area is bone-dry. Again, Keeley said, this was unusual, but not unique. Some years are drier than others.
These fires were also not a question of fuel management, Keeley told me. “The fuels are not really the issue in these big fires. It's the extreme winds,” he said. “You can do prescription burning in chaparral and have essentially no impact on Santa Ana wind-driven fires.” As far as he can tell, based on information from CalFire, the Eaton Fire started on an urban street.
While it’s likely that climate change played a role in amplifying the drought, it’s hard to say how big a factor it was. Patrick Brown, a climate scientist at the Breakthrough Institute and adjunct professor at Johns Hopkins University, published a long post on X outlining the factors contributing to the fires, including a chart of historic rainfall during the winter in Los Angeles that shows oscillations between very wet and very dry years over the past eight decades. But climate change is expected to make dry years drier in Los Angeles. “The LA area is about 3°C warmer than it would be in preindustrial conditions, which (all else being equal) works to dry fuels and makes fires more intense,” Brown wrote.