A conversation with the most interesting man on the Federal Energy Regulatory Commission.
It’s not every day that a top regulator calls into question the last few decades of policy in the area they help oversee. But that’s exactly what Mark Christie, a commissioner on the Federal Energy Regulatory Commission, the interstate power regulator, did earlier this year.
In a paper enticingly titled “It’s Time To Reconsider Single-Clearing Price Mechanisms in U.S. Energy Markets,” Christie gave a history of deregulation in the electricity markets and suggested it may have been a mistake.
While criticisms of deregulation are by no means new, that they were coming from a FERC commissioner, and a Republican no less, was noteworthy. While there is not yet a full-scale effort to reverse electricity-market deregulation, which has been underway since the 1990s, there is a rising tide of skepticism about how electricity markets do — and don’t — reward reliability, let alone about the effect they have on consumer prices.
Christie’s criticisms have a conservative bent, as you’d expect from someone who was nominated by former President Donald Trump to the bipartisan commission. He is very concerned about existing generation going offline and has called activist drives against natural gas pipelines and other infrastructure for transporting fossil fuels a “national campaign of legal warfare…[that] has prevented the construction of vitally needed natural gas transportation infrastructure.”
Since renewables have become, at times, among the world’s cheapest sources of energy and thus quite competitive with fossil fuels in deregulated markets (especially when subsidized), this kind of skepticism is a growing issue in the Republican Party, which has deep ties to oil and gas companies. The Texas state legislature, for instance, responded to Winter Storm Uri, which almost destroyed Texas’ electricity grid in 2021, with its own version of central planning: billions in low-cost loans for the construction of new gas-fired power plants. Former Texas Governor Rick Perry, as secretary of energy in the Trump administration, even proposed to FERC a plan to explicitly subsidize coal and nuclear plants, citing reliability concerns. (FERC rejected it.) Some regions that didn’t embrace deregulation, like the Southeast and Southwest, also have some of the most carbon-intensive grids.
But Christie is not so much a critic of renewable resources like wind and solar per se as he is focused on the benefits to the grid of ample “dispatchable” resources, i.e., power sources that can ramp up and down on demand.
This doesn’t have to mean uncritical acceptance of existing fossil fuel infrastructure. The idea that markets don’t reward reliability enough can help explain the poor winterization for fossil fuel generation that was so disastrous during Winter Storm Uri. And in California, the recognition that renewables alone can’t power the grid 24 hours a day has led to a massive investment in energy storage, which can help approximate the on-demand nature of natural gas or coal without the carbon pollution.
But Christie is primarily interested in the question of just how the planning is done for a system that links together electric generation and consumers. He criticized the deregulated system in much of the country where power is generated by companies separate from the utilities that ultimately sell and distribute that power to customers and where states have less of a role in overall planning, despite ultimately approving electricity rates.
Instead, these markets for power are mediated through a system in which utilities pay independent generators a single price for their power at a given time, arrived at through bidding, often in the context of sprawling multi-state regional transmission organizations like PJM Interconnection, which covers a large swath of the Midwest and Mid-Atlantic, or ISO New England. He says this setup doesn’t do enough to incentivize dispatchable power, much of which only comes online when demand spikes, making the system less reliable overall while showing little evidence that costs have gone down for consumers.
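To make the single-clearing-price mechanism concrete, here is a minimal sketch of a stylized uniform-price auction of the kind Christie critiques below. The generator names, capacities, offer prices, and demand level are all hypothetical, chosen only to illustrate how a subsidized resource that offers in at zero still gets paid the marginal unit’s price.

```python
# Minimal sketch of a single-clearing-price (uniform price) auction.
# All generators, capacities, and prices here are hypothetical.

def clear_market(bids, demand_mw):
    """Dispatch the cheapest offers first until demand is met; every
    dispatched generator is paid the offer price of the last (marginal)
    unit needed to clear."""
    dispatched, supplied = [], 0.0
    for name, capacity_mw, offer in sorted(bids, key=lambda b: b[2]):
        if supplied >= demand_mw:
            break
        take = min(capacity_mw, demand_mw - supplied)
        dispatched.append((name, take, offer))
        supplied += take
    clearing_price = dispatched[-1][2]  # set by the marginal unit
    return clearing_price, dispatched

bids = [
    ("wind", 300, 0.0),                # subsidized, so it can offer at zero
    ("nuclear", 400, 25.0),
    ("gas_combined_cycle", 500, 40.0),
    ("gas_peaker", 200, 90.0),
]
price, dispatch = clear_market(bids, demand_mw=1300)
print(price)  # 90.0 -- the gas peaker sets the price for everyone,
              # so wind offers at $0/MWh but is paid $90/MWh
```

The zero bid guarantees the subsidized unit clears, but its revenue is set by the most expensive unit needed to meet demand, which is the dynamic Christie objects to.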
Every year, grid operators and their regulators — including Christie — warn of reliability issues. What Christie argues is that these reliability issues may be endemic to the deregulated system.
Here is where there could be common ground between advocates for an energy transition and conservative deregulation skeptics like Christie. While the combination of deregulation and subsidies has been great for getting solar and wind from zero to around 13 percent of the nation’s utility-scale electricity generation, any truly decarbonized grid will likely require intensive government supervision and planning. Ultimately, political authorities who are guiding the grid to be less carbon-intensive will be responsible for keeping the lights on no matter how cold, warm, sunny, or windy it happens to be. And that may not be something today’s electricity “markets” are up for.
I spoke with Christie in late June about how FERC gave us the electricity market we have today, why states might be better managers than markets, and what he’s worried about this summer. Our conversation has been edited for length and clarity.
What happened to our energy markets in the 1990s and 2000s where you think things started to go wrong?
In the late ‘90s, we had this big push called deregulation. And as I pointed out in the article, it really wasn’t “deregulation” in the sense that in the ‘70s, you know, the trucking and airlines and railroads were deregulated where you remove government price regulation and you let the market set the prices. That’s not what happened. It really was just a change of the price-setting construct and the regulatory construct.
It took what had been the most common form of regulation of utilities, where utilities are considered to be natural monopolies, and said we’re going to restructure these utilities and we’re going to let the generation part compete in these regional markets.
And, you know, from an economic standpoint, okay, so far so good. But there’s been a lot of questioning as to whether there’s really true competition. Many parts of the country also just didn’t do it.
I think there’s a serious question whether that’s benefiting consumers more than the cost of service model where state regulators set the prices.
So if I’m an electricity consumer in one of the markets that’s more or less deregulated, how might reliability become an issue in my own home?
First of all, when you’re in one of these areas that are deregulated, essentially you’re paying the gas price. If it goes up, that’s what you’re going to pay. If it goes down, it looks really good.
But from the reliability standpoint, the question is whether these markets are procuring enough resources to make sure you have the power to keep your lights on 24/7. That is the big question to a consumer in a so-called deregulated state: Are these markets, which are now the main vehicle for buying generation resources, are they getting enough generation resources to make sure that your lights stay on, your heat stays on, and your air conditioning stays on?
Do you think there’s evidence that these deregulated markets are doing a worse job at that kind of procurement?
Well, let’s take, for example, PJM, which came out with an announcement in February saying they were going to lose over 40 gigawatts in the next five years. A gig is 1,000 megawatts, so that’s a lot of power, a lot of generating resources. And the independent market monitor has actually told me it is closer to 50 gigawatts. So all these units are going to retire, and they’re going to retire largely for economic reasons. They’re not getting sufficient compensation to stay open.
The essence of restructuring was that generating units are going to have to make their money in the market. They’re not going to get funding through what's called the “rate base,” which is the regulated, traditional cost-of-service model. They have to get it in the markets and theoretically, that sounds good.
But in reality, if they can’t get enough money to pay their costs, they’re going to retire and then you don’t have those resources. Particularly in the RTOs [regional transmission organizations, i.e. the multi-state electricity markets], you’re seeing these markets result in premature retirements of generating resources. And so, now, why is that? It’s more of a problem in the RTOs than non-RTOs because in the non-RTOs, they procure resources under the supervision of a state regulator through what’s called an integrated resource plan, or IRP.
The reason I think the advantage in reliability is with the non-RTOs is that those utilities have to prove to a state regulator that their resource plan makes sense, that they’re planning to buy generating resources. Whether they’re buying wind or solar or gas, whatever, they have to go to a state regulator and say, “Here’s our plan,” and then seek approval from that regulator. And if they’re shutting down units, the state regulator can say, “Wait a minute, you’re shutting down units that a few years ago you told us were needed for reliability, and now you’re telling us you want to shut them down.” So the state regulator can actually say, “No, you’re not going to shut that unit down. You’re going to keep running it.”
That’s why I think you have more accountability in the non-RTOs, because the state regulators can tell the utility, “You need more resources, go build it or buy it,” or “You already have resources, you’re not going to shut them down, we’re not going to let you.”
You don’t have that in an RTO. In an RTO, it’s all done through the market. The market decides, to the extent it has a mind. You know, it’s all the result of market operations. It’s not anybody saying whether it’s a good idea or not for a certain unit to shut down.
I find it interesting that a lot of the criticism of the deregulated system — and a lot of places that are not deregulated — come from more conservative states that would generally not think of themselves as having this kind of strong state role in economic policy. What’s different about electricity? Why do you think the politics of this line up differently than it would on other issues?
I don’t know. That’s an interesting question. I haven’t even thought about it in those terms.
I think it goes back to when deregulation took place in the mid-to-late ‘90s. Other than Texas, which went all the way, the states that probably went farthest on it were in the Northeast. Part of the reason why is because they already had very high consumer prices. I think deregulation was definitely sold as a way to reduce prices to consumers. It hasn’t worked out that way.
Whereas you look at the Southeast, which never went in for deregulation. The Southeastern states, which are still non-RTO states, had relatively low rates, so they didn’t see a problem to be fixed.
The other big trend since the 1990s and 2000s is the explosive growth of renewables, especially wind and solar. Is there something about deregulated electricity markets, the RTO system, that makes those types of resources economically more favorable than they would be under a different system?
Well, if you’re getting a very high subsidy, like wind and solar are getting, it means you can bid into the energy markets effectively at zero. So if you can offer in at zero, you’re virtually guaranteed to be a winner. In a non-RTO state, a state that’s doing it through an integrated resource plan, the state regulator reviews the plan. That’s why I think an IRP approach is actually better for implementing wind and solar, because you can implement and deploy wind and solar as part of an integrated plan that includes enough balancing resources to make sure you keep the lights on.
To me an Integrated Resource Plan is a holistic process, where you can look at all the resources at your disposal: wind, solar, gas, as well as the demand side. And you can balance them all in a way that you think, “Okay, this balance is appropriate for us for the next three years, or four years, or five years.” Because you’re typically doing an IRP every three to five years anyway. And so I think it’s a good way to make sure you balance these resources.
In a market there’s no balancing. In a market it’s just winners and losers. And so wind and solar are almost always going to win because they have such massive subsidies that they’re going to get to offer in at a bid price of zero. The problem with that is they’re not going to get paid zero. They’re going to get paid the highest price [that all electricity suppliers get]. So they offer in at zero, but they get paid the highest price, which is going to be a gas price. It’s probably going to be the last gas unit to clear; that’s usually the highest-priced unit. And yet because of the single-clearing-price mechanism, everybody gets that price. So you can offer in at zero to guarantee you clear, but then you’re going to get the highest price, usually that of a gas combustion turbine peaker.
Do you think we would see as much wind and solar on the grid if it weren’t for the fact that a lot of the resources are benefiting from the pricing mechanism you describe?
I don’t think you can draw that conclusion, because there are non-RTO states that have what’s called a mandatory RPS, a renewable portfolio standard. And so you can get there through a mandatory RPS and a cost-of-service model just as you can in a market. And actually, again, I think you can get there in a more balanced way, making sure that reliability is not being threatened in the meantime.
To get back to what we were talking about at the beginning, my understanding is that FERC, where you are now, played a large role in encouraging deregulation and the formation of RTOs. Is this something that your staff or other commissioners disagree with you about? How do you see the role you’re playing, where you’re doing public advocacy and reshaping this conversation around deregulation?
First of all, we always have to give the standard disclaimer: You never talk about a pending case. But FERC was really the driving force behind a lot of this deregulation. So obviously, they decided that that’s what they wanted to push, and they did. And so I think it’s appropriate as a FERC regulator to raise questions. I think raising questions about the status quo is an important thing that we do and should do. Ultimately, you advocate for what you think it ought to be, and the votes may come eventually. It might take several years, but it’s important.
One of the things I try to do is put the consumer at the center of everything I do. It is absolutely my priority. And I think that it should be every regulator’s priority, particularly in the electric area, because most consumers in America — in fact, almost all consumers in America — are captive customers. By captive, I mean they don’t get to choose their electric supplier.
Like, where do you live, Matthew?
I live in New York City.
You don’t get to choose, right? You’re getting electricity from ConEd. And you don’t have any choice. So you’re a captive customer. And most consumers in America are captive customers. We tried this retail choice in a few states, and it didn’t work. You know, they’re still doing it. I’m not going to say whether it’s working or not, but I know we tried it in Virginia, and it didn’t work at all for a lot of reasons.
I always put customers first and say, “Look, these customers are captive. We have to protect them. We have to protect the captive customers by making sure they’re not getting overcharged.” So that’s why I care about these issues. And that’s why I wrote this article. I think that in a lot of ways, customers in America are not getting treated fairly. They’re getting overcharged, and I think they’re not getting what they should be getting. And so I think a big part of it is some of this stuff that FERC’s been pushing for the last 25 years.
Our time is running out. So I will leave with a question that is topical: It’s already been quite hot in Texas, but outside of Texas and in FERC-land, where are you concerned about reliability issues this summer?
Well, I’m concerned about everywhere. It’s not a flippant remark. I read very closely the reliability reports that we get from NERC and we have reliability challenges in many, many places. It’s not just in the RTOs. I think we have reliability challenges in the South. Fortunately, the West this year, which has been a problem the last couple of years, is actually looking pretty good because all the rain last winter — even flooding — really was great for hydropower.
I’m from California, and I think it’s the first time in my adult life that I remember stories about dams being 100 percent, if not more than 100 percent, full.
The rains and snowfall were so needed. It’s filled up reservoirs that have been really dry for years. And from an electrical standpoint, it’s been really good for hydro. So they’re looking at really good hydro availability this summer in ways they haven’t been for the last several years. So the West, because of all the rain and the greater availability of hydro, I think is in fairly good shape.
There’s a problem in California with the duck curve, and the problem is still there. If you have such a high solar content, when the sun goes down, obviously the solar stops generating, and so what do you do, you know, for the next four to five hours? Because the air conditioners are still running, it’s still hot, but that solar production has just dropped off the table. So they’ve been patching with some battery storage and some gas backup.
But I’m worried about everywhere. I watch very closely the reports that come out of the RTOs and you can’t be shutting down dispatchable resources at the rate we’re doing when you’re not replacing them one to one with wind or solar. The arithmetic doesn’t work and it’s going to catch up to us at some point.
Recovering from the Los Angeles wildfires will be expensive. Really expensive. Insurance analysts and banks have already produced a wide range of estimates of both what insurance companies will pay out and overall economic loss. AccuWeather has put out an eye-catching preliminary figure of $52 billion to $57 billion for economic losses, with the service’s chief meteorologist saying that the fires have the potential to “become the worst wildfire in modern California history based on the number of structures burned and economic loss.” On Thursday, J.P. Morgan doubled its previous estimate for insured losses to $20 billion, with an economic loss figure of $50 billion — about the gross domestic product of the country of Jordan.
The startlingly high loss figures from a fire that has lasted only a few days and is (relatively) limited in scope show just how distinctly devastating an urban fire can be. Enormous wildfires that cover millions of acres, like the 2023 Canadian wildfires, can spew ash and particulate matter all over the globe and burn for months, darkening skies and clogging airways in other countries. And smaller — and far deadlier — fires than those still do not produce the same financial toll.
It’s in coastal Southern California where you find large population centers in areas known by all to be at extreme risk of fire. And so a fire there can destroy a whole neighborhood in a few hours and put the state’s insurance system into jeopardy.
One reason why the projected economic impacts of the fires are so high is that the structures that have burned and the land those structures sit on are very valuable. Pacific Palisades, Malibu, and Santa Monica contain some of the most sought-after real estate on planet earth, with typical home prices over $2 million. Pacific Palisades itself has median home values of around $3 million, according to JPMorgan Chase.
The AccuWeather estimates put the economic damage from the Los Angeles fires at several times that of previous large urban fires — the Maui wildfire in 2023 was estimated to have caused around $14 billion in economic loss, for example — while the figure would be about a quarter to a third of that of a large hurricane, which tends to strike areas with millions of people in them across several states.
“The fires have not been contained thus far and continue to spread, implying that estimates of potential economic and insured losses are likely to increase,” the JPMorgan analysts wrote Thursday.
That level of losses would make the fires costlier in economic terms than the 2018 Butte County Camp Fire, whose insured losses of $10 billion made it California’s costliest at the time. That fire was far larger than the Los Angeles fires, spreading over 150,000 acres compared to just over 17,000 acres for the Palisades Fire and over 10,000 acres for the Eaton Fire. It also led to more than 80 deaths in the town of Paradise.
So far, around 2,000 homes have been destroyed, according to the Los Angeles Times, a fraction of the more than 19,000 structures affected by the Camp Fire. The difference in estimated losses comes from the fact that home values in Pacific Palisades weigh in at more than six times those in rural Butte County, according to JPMorgan.
While insured losses get the lion’s share of attention when it comes to the cost impacts of a natural disaster, the potential damages go far beyond the balance sheet of insurers.
For one, it’s likely that many affected homeowners did not even carry insurance, either because their insurers failed to renew their existing policies or because the homeowners simply chose to go without due to the high cost of what insurance they could find. “A larger than usual portion of the losses caused by the wildfires will be uninsured,” according to Morningstar DBRS, which estimated total insured losses at more than $8 billion. Many homeowners carry insurance from California’s backup FAIR Plan, which may itself come under financial pressure, potentially leading to assessments on the state’s policyholders to bolster its ability to pay claims.
AccuWeather arrived at its economic impact figure by looking not just at losses from property damage but also wages that go unearned due to economic activity slowing down or halting in affected areas, infrastructure that needs to be repaired, supply chain issues, and transportation snarls. Even when homes and businesses aren’t destroyed, people may be unable to work due to evacuations; businesses may close due to the dispersal of their customers or the inability of their suppliers to make deliveries. Smoke inhalation can lead to short-, medium-, and long-term health impacts that put a dent in overall economic activity.
The high level of insured losses, meanwhile, could mean that insurers will see less surplus and could have to pay more for reinsurance, Nancy Watkins, an actuary and wildfire expert at Milliman, told me in an email. This may mean that they would have to shed yet more policies “in order to avoid deterioration in their financial strength ratings,” just as California has been trying to lure insurers back with reforms to its dysfunctional insurance market.
The economic costs of the fire will likely be felt for years if not decades. While it would take an act of God far stronger than a fire to keep people from building homes on the slopes of the Santa Monica Mountains or off the Pacific Coast, the city that rebuilds may be smaller, more heavily fortified, and more expensive than the one that existed at the end of last year. And that’s just before the next big fire.
Suburban streets, exploding pipes, and those Santa Ana winds, for starters.
A fire needs three things to burn: heat, fuel, and oxygen. The first is important: At some point this week, for reasons we have yet to discover and may never know, a piece of flammable material in Los Angeles County got hot enough to ignite. The last is essential: The resulting fires, which have now burned nearly 29,000 acres, are fanned by exceptionally powerful and dry Santa Ana winds.
But in the critical days ahead, it is that central ingredient that will preoccupy fire managers, emergency responders, and the public, who are watching their homes — wood-framed containers full of memories, primary documents, material wealth, sentimental heirlooms — transformed into raw fuel. “Grass is one fuel model; timber is another fuel model; brushes are another — there are dozens of fuel models,” Bobbie Scopa, a veteran firefighter and author of the memoir Both Sides of the Fire Line, told me. “But when a fire goes from the wildland into the urban interface, you’re now burning houses.”
This jump from chaparral shrubland into neighborhoods has frustrated firefighters’ efforts to gain an upper hand over the L.A. County fires. In the remote wilderness, firefighters can cut fire lines with axes, pulaskis, and shovels to contain the blaze. (A fire’s “containment” describes how much firefighters have encircled; 25% containment means a quarter of the fire perimeter is prevented from moving forward by manmade or natural fire breaks.)
Once a fire moves into an urban community and starts spreading house to house, however, as has already happened in Santa Monica, Pasadena, and other suburbs of Los Angeles, those strategies go out the window. A fire break starves a fire by introducing a gap in its fuel; it can be a cleared strip of vegetation, a river, or even a freeway. But you can’t just hack a fire break through a neighborhood. “Now you’re having to use big fire engines and spray lots of water,” Scopa said, compared to the wildlands where “we do a lot of firefighting without water.”
Water has already proven to be a significant issue in Los Angeles, where many hydrants near Palisades, the biggest of the five fires, had already gone dry by 3:00 a.m. Wednesday. “We’re fighting a wildfire with urban water systems, and that is really challenging,” Los Angeles Department of Water and Power CEO Janisse Quiñones explained in a news conference later that same day.
LADWP said it had filled its 114 water storage tanks before the fires started, but the city’s water supply was never intended to stop a 17,000-acre fire. The hydrants are “meant to put out a two-house fire, a one-house fire, or something like that,” Faith Kearns, a water and wildfire researcher at Arizona State University, told me. Additionally, homeowners sometimes leave their sprinklers on in the hopes that it will help protect their house, or try to fight fires with their own hoses. At a certain point, the system — just like the city personnel — becomes overwhelmed by the sheer magnitude of the unfolding disaster.
Making matters worse is the wind, which restricted some of the aerial support firefighters typically employ. As gusts slowed on Thursday, retardant and water drops were able to resume, helping firefighters in their efforts. (The Eaton Fire, while still technically 0% contained because there are no established fire lines, has “significantly stopped” growing, The New York Times reports). Still, firefighters don’t typically “paint” neighborhoods; the drops, which don’t put out fires entirely so much as suppress them enough that firefighters can fight them at close range, are a liability. Kearns, however, told me that “the winds were so high, they weren’t able to do the water drops that they normally do and that are an enormous part of all fire operations,” and that “certainly compounded the problems of the fire hydrants running dry.”
Firefighters’ priority isn’t saving structures, though. “Firefighters save lives first before they have to deal with fire,” Alexander Maranghides, a fire protection engineer at the National Institute of Standards and Technology and the author of an ongoing case study of the 2018 Camp Fire in Paradise, California, told me. That can be an enormous and time-consuming task in a dense area like suburban Los Angeles, and counterintuitively lead to more areas burning down. Speaking specifically from his conclusions about the Camp Fire, which was similarly a wildland-urban interface, or WUI, fire, Maranghides added, “It is very, very challenging because as things deteriorate — you’re talking about downed power lines, smoke obstructing visibility, and you end up with burn-overs,” when a fire moves so quickly that it overtakes people or fire crews. “And now you have to go and rescue those civilians who are caught in those burn-overs.” Sometimes, that requires firefighters to do triage — and let blocks burn to save lives.
Perhaps most ominously, the problems don’t end once the fire is out. When a house burns down, it is often the case that its water pipes burst. (This also adds to the water shortage woes during the event.) But when firefighters are simultaneously pumping water out of other parts of the system, air can be sucked down into those open water pipes. And not just any air. “We’re not talking about forest smoke, which is bad; we’re talking about WUI smoke, which is bad plus,” Maranghides said, again referring to his research in Paradise. “It’s not just wood burning; it’s wood, plastics, heavy metals, computers, cars, batteries, everything. You don’t want to be breathing it, and you don’t want it going into your water system.”
Water infrastructure can be damaged in other ways, as well. Because fires are burning “so much hotter now,” Kearns told me, contamination can occur due to melting PVC piping, which releases benzene, a carcinogen. Watersheds and reservoirs are also in danger of extended contamination, particularly once rains finally do come and wash soot, silt, debris, and potentially toxic flame retardant into nearby streams.
But that’s a problem for the future. In the meantime, Los Angeles — and lots of it — continues to burn.
“I don’t care how many resources you have; when the fires are burning like they do when we have Santa Anas, there’s so little you can do,” Scopa said. “All you can do is try to protect the people and get the people out, and try to keep your firefighters safe.”
Plus 3 more outstanding questions about this ongoing emergency.
As Los Angeles continued to battle multiple big blazes ripping through some of the most beloved (and expensive) areas of the city on Thursday, a question lingered in the background: What caused the fires in the first place?
Though fires are less common in California during this time of the year, they aren’t unheard of. In early December 2017, power lines sparked the Thomas Fire near Ventura, California, which burned through to mid-January. At the time it was the largest fire in the state since at least the 1930s. Now it’s the ninth-largest. Although that fire was in a more rural area, it ignited for some of the same reasons we’re seeing fires this week.
Read on for everything we know so far about how the fires started.
Five major fires started during the Santa Ana wind event this week.
Officials have not made any statements about the cause of any of the fires yet.
On Thursday morning, Edward Nordskog, a retired fire investigator from the Los Angeles Sheriff’s Department, told me it was unlikely they had even begun looking into the root of the biggest and most destructive of the fires in the Pacific Palisades. “They don't start an investigation until it's safe to go into the area where the fire started, and it just hasn't been safe until probably today,” he said.
It can take years to determine the cause of a fire. Investigators did not pinpoint the cause of the Thomas Fire until March 2019, more than two years after it started.
But Nordskog doesn’t think it will take very long this time. It’s easier to narrow down the possibilities for an urban fire because there are typically both witnesses and surveillance footage, he told me. He said the most common causes of wildfires in Los Angeles are power lines and fires started by unhoused people. Fires can also be caused by sparks from vehicles or equipment.
At about 27,000 acres burned, these fires are unlikely to make the charts for the largest in California history. But because they are burning in urban, densely populated, and expensive areas, they could be some of the most devastating. With an estimated 2,000 structures damaged so far, the Eaton and Palisades fires are likely to make the list for most destructive wildfire events in the state.
And they will certainly be at the top of the list for costliest. The Palisades Fire has already been declared a likely contender for the most expensive wildfire in U.S. history. It has destroyed more than 1,000 structures in some of the most expensive zip codes in the country. Between that and the Eaton Fire, AccuWeather estimates the damages could reach $57 billion.
While we don’t know the root causes of the ignitions, several factors came together to create perfect fire conditions in Southern California this week.
First, there are the Santa Ana winds, an annual phenomenon in Southern California in which very dry, high-pressure air gets trapped in the Great Basin and begins escaping westward through mountain passes to lower-pressure areas along the coast. Most of the time, the wind in Los Angeles blows eastward from the ocean, but during a Santa Ana event, it changes direction, picking up speed as it rushes toward the sea.
Jon Keeley, a research scientist with the U.S. Geological Survey and an adjunct professor at the University of California, Los Angeles, told me that Santa Ana winds typically blow at maybe 30 to 40 miles per hour, while the winds this week hit upwards of 60 to 70 miles per hour. “More severe than is normal, but not unique,” he said. “We had similar severe winds in 2017 with the Thomas Fire.”
Second, Southern California is currently in the midst of extreme drought. Winter is typically a rainier season, but Los Angeles has seen less than half an inch of rain since July. That means all the shrubland vegetation in the area is bone-dry. Again, Keeley said, this was unusual, but not unique. Some years are drier than others.
These fires were also not a question of fuel management, Keeley told me. “The fuels are not really the issue in these big fires. It's the extreme winds,” he said. “You can do prescription burning in chaparral and have essentially no impact on Santa Ana wind-driven fires.” As far as he can tell, based on information from CalFire, the Eaton Fire started on an urban street.
While it’s likely that climate change played a role in amplifying the drought, it’s hard to say how big a factor it was. Patrick Brown, a climate scientist at the Breakthrough Institute and adjunct professor at Johns Hopkins University, published a long post on X outlining the factors contributing to the fires, including a chart of historic rainfall during the winter in Los Angeles that shows oscillations between very wet and very dry years over the past eight decades. But climate change is expected to make dry years drier in Los Angeles. “The LA area is about 3°C warmer than it would be in preindustrial conditions, which (all else being equal) works to dry fuels and makes fires more intense,” Brown wrote.