Plus 3 more outstanding questions about this ongoing emergency.

As Los Angeles continued to battle multiple big blazes ripping through some of the most beloved (and expensive) areas of the city on Friday, a question lingered in the background: What caused the fires in the first place?
Though fires are less common in California during this time of the year, they aren’t unheard of. In early December 2017, power lines sparked the Thomas Fire near Ventura, California, which burned through to mid-January. At the time it was the largest fire in the state since at least the 1930s. Now it’s the ninth-largest. Although that fire was in a more rural area, it ignited for some of the same reasons we’re seeing fires this week.
Read on for everything we know so far about how the fires started.
Six major fires started during the Santa Ana wind event last week.
Officials are investigating the cause of the fires and have not made any public statements yet. Early eyewitness accounts suggest that the Eaton Fire may have started at the base of a transmission tower owned by Southern California Edison. So far, the company has maintained that an analysis of its equipment showed “no interruptions or electrical or operational anomalies until more than one hour after the reported start time of the fire.” A Washington Post investigation found that the Palisades Fire could have arisen from the remnants of a fire that burned on New Year’s Eve and later reignited.
On Thursday morning, Edward Nordskog, a retired fire investigator from the Los Angeles Sheriff’s Department, told me it was unlikely they had even begun looking into the root of the biggest and most destructive of the fires in the Pacific Palisades. “They don't start an investigation until it's safe to go into the area where the fire started, and it just hasn't been safe until probably today,” he said.
It can take years to determine the cause of a fire. Investigators did not pinpoint the cause of the Thomas Fire until March 2019, more than a year after it started.
But Nordskog doesn’t think it will take very long this time. It’s easier to narrow down the possibilities for an urban fire because there are typically both witnesses and surveillance footage, he told me. He said the most common causes of wildfires in Los Angeles are power lines and fires started by unhoused people. Fires can also be sparked by vehicles or equipment.
At more than 40,000 acres burned total, these fires are unlikely to make the charts for the largest in California history. But because they are burning in urban, densely populated, and expensive areas, they could be some of the most devastating. With an estimated 9,000 structures damaged as of Friday morning, the Eaton and Palisades fires are likely to make the list for most destructive wildfire events in the state.
And they will certainly be at the top for costliest. The Palisades Fire has already been declared a likely contender for the most expensive wildfire in U.S. history. It has destroyed more than 5,000 structures in some of the most expensive zip codes in the country. Between that and the Eaton Fire, AccuWeather estimates the damages could reach $57 billion.
While we don’t know the root causes of the ignitions, several factors came together to create perfect fire conditions in Southern California this week.
First, there are the Santa Ana winds, an annual phenomenon in Southern California in which very dry, high-pressure air gets trapped in the Great Basin and escapes westward through mountain passes to lower-pressure areas along the coast. Most of the time, the wind in Los Angeles blows eastward from the ocean, but during a Santa Ana event, it reverses direction, picking up speed as it rushes toward the sea.
Jon Keeley, a research scientist with the U.S. Geological Survey and an adjunct professor at the University of California, Los Angeles, told me that Santa Ana winds typically blow at maybe 30 to 40 miles per hour, while the winds this week hit upwards of 60 to 70 miles per hour. “More severe than is normal, but not unique,” he said. “We had similar severe winds in 2017 with the Thomas Fire.”
Second, Southern California is currently in the midst of extreme drought. Winter is typically a rainier season, but Los Angeles has seen less than half an inch of rain since July. That means all the shrubland vegetation in the area is bone-dry. Again, Keeley said, this is unusual but not unique. Some years are simply drier than others.
These fires were also not a question of fuel management, Keeley told me. “The fuels are not really the issue in these big fires. It's the extreme winds,” he said. “You can do prescription burning in chaparral and have essentially no impact on Santa Ana wind-driven fires.” As far as he can tell, based on information from CalFire, the Eaton Fire started on an urban street.
While it’s likely that climate change played a role in amplifying the drought, it’s hard to say how big a factor it was. Patrick Brown, a climate scientist at the Breakthrough Institute and adjunct professor at Johns Hopkins University, published a long post on X outlining the factors contributing to the fires, including a chart of historical winter rainfall in Los Angeles that shows oscillations between wet and dry years over the past eight decades.
But climate change is expected to make dry years drier and wet years wetter, creating a “hydroclimate whiplash,” as Daniel Swain, a pre-eminent expert on climate change and weather in California, puts it. In a thread on Bluesky, Swain wrote that “in 2024, Southern California experienced an exceptional episode of wet-to-dry hydroclimate whiplash.” Last year’s rainy winter fostered abundant plant growth, and the subsequent dryness primed the vegetation for fire.
Editor’s note: This story was last updated on Monday, January 13, at 10:00 a.m. ET.
On diesel backup generators, Chinese rare earths, and geothermal milestones
Current conditions: A polar vortex is sending Arctic air across the Upper Midwest and Northeast, bringing more than a foot of snow to parts of Michigan • In the Pacific Northwest, an atmospheric river is set to bring rain showers on the coast and snow inland • The death toll from flooding across Southeast Asia has surpassed 1,300.
The Department of Transportation is poised to significantly weaken fuel efficiency requirements for tens of millions of new cars and light trucks, President Donald Trump announced Wednesday. Heatmap's Robinson Meyer explained: “The United States essentially has two ways to regulate pollution from cars and light trucks: It can limit greenhouse gas emissions from new cars and trucks, and it can require the fuel economy from new vehicles to get a little better every year. Trump is pulling screws and wires out of both of these systems.” Flanked by auto executives in the Oval Office, Trump announced that new vehicles in 2031 would only need to average 34.5 miles per gallon, down from the 50 miles per gallon goal the Biden administration set. While carmakers publicly cheered the move, executives “privately fretted” to The New York Times “that they are being buffeted by conflicting federal policies” after spending billions of dollars to prepare to manufacture electric vehicles.
The administration claimed the rollback would save Americans $109 billion over five years and shave $1,000 off the average cost of a new car. But as Rob noted in August, the administration’s fight against tailpipe emissions rules could actually end up raising the price of gasoline.

Secretary of Energy Chris Wright pitched tapping into backup generators at data centers, hospitals, and factories to augment the supply of power on the grid. Speaking at the North American Gas Forum on Tuesday, Wright said the generators — most of which run on diesel, natural gas, or fuels such as propane — could contribute roughly 35 gigawatts of electricity. “We have 35 gigawatts of backup generators that are sitting there today, and you can’t turn them on. That’s just nuts. Emissions rules or whatever … people, come on,” Wright said, according to E&E News. “If we just turn those generators on for a few hours a year, we’ve expanded the capacity of our grid by 35 gigawatts. That’s massive.”
In a post on X, Aaron Bryant, an energy markets analyst at the law firm White & Case, called the proposal “shortsighted at best,” since the generators expose load growth to some measure of commodity risk, and “unworkable at worst,” because zoning ordinances, air pollution limits, and noise restrictions may prohibit use of the generators.
The National Petroleum Council, an advisory panel at the Energy Department, submitted its recommendations Wednesday for how to reform federal permitting rules. Among the proposals was an endorsement of an idea to bar federal agencies from yanking already-granted permits. Democrats in Congress put forward the concept to prevent the Trump administration from reversing approvals for offshore turbines and other renewable projects targeted by the White House.
The proposal marks a significant step within the executive branch, given that Trump himself is “the biggest wild card in permitting reform,” as Heatmap’s Jael Holzman wrote last month. But legislation is moving in Congress. In the House, the SPEED Act overwhelmingly won a committee vote last month. Now Arkansas Senator Tom Cotton, a Republican, has introduced a new bill in the Senate with its own House version.
Following a summit between Trump and Chinese President Xi Jinping in October, Beijing agreed to overhaul its licensing regime for approving exports of rare earths to allow for streamlined permits to sell the metals overseas. At least three Chinese manufacturers of rare earth magnets have now secured new licenses to speed up exports to some customers, Reuters reported. It’s a sign of easing tensions between Washington and Beijing, offering some reprieve from the Chinese export restrictions that threatened to choke off the U.S. supply of key metals. But it’s still tenuous. China could ratchet up restrictions again, and the U.S. is still looking to increase domestic production of critical minerals to counter the leverage the People’s Republic wields through its near monopoly on the metals.
If there’s one thing Tim Latimer, the chief executive of the next-generation geothermal company Fervo Energy, wants to see in any permitting reform, it’s measures to make building new transmission lines easier. “The biggest threat to American global competitiveness, and it does not matter if your priorities are climate change, affordability, the AI race, national security or all of the above, is our country’s complete inability to build and upgrade transmission at any meaningful scale,” Latimer wrote in a post on X. Fervo is working on building the nation’s first full-scale next-generation geothermal plant in Utah, and running new transmission lines out to remote parts of the desert where it’s often best to drill for hot rocks is costly.
Fervo isn’t the only geothermal company making news. On Thursday morning, Zanskar, a geothermal startup that uses modern prospecting methods to find new conventional resources, announced that it had made the biggest “blind” discovery in the U.S. in more than 30 years. A “blind” find is a geothermal system with no visible surface signs, such as vents or geysers, of what lies below. While companies such as Fervo aim to use fracking technology to create reservoirs in hot rocks where there are no natural underground water formations to tap into, Zanskar is betting that using artificial intelligence to locate new conventional resources can result in faster, cheaper geothermal plants than next-generation technology can yield.
Here’s a little exclusive for you to end on: I got a copy of a letter signed by dozens of pro-nuclear advocates calling on New York state and local officials to kickstart an effort to rebuild the Indian Point nuclear plant just north of New York City. Describing the “forced premature closure” of the plant as “a major setback for New York,” the letter said the plant could be restored, noting that rising demand for clean, firm electricity has spurred utilities in Michigan, Iowa, and Pennsylvania to embark on historic restarts of decommissioned reactors. “Recommissioning Indian Point would stabilize electricity prices and deliver one of the fastest and largest returns of clean power available anywhere in the country,” the letter reads.
The Trump administration has started to weaken the rules requiring cars and trucks to get more fuel-efficient every year.
In a press event on Wednesday in the Oval Office, flanked by advisors and some of the country’s top auto executives, President Trump declared that the old rules “forced automakers to build cars using expensive technologies that drove up costs, drove up prices, and made the car much worse.”
He said that the rules were part of the “green new scam” and that ditching them would save consumers some $1,000 every year. That framing casts the rollback as part of the president’s seeming pivot to affordability, which began after Democrats trounced Republicans in the November off-cycle elections.
That pivot remains belated and at least a little half-hearted: On Wednesday, Trump made no mention of dropping the auto tariffs that are raising imported car prices by perhaps $5,000 per vehicle, according to Cox Automotive. Ditching the fuel economy rules, too, could increase demand for gasoline and thus raise prices at the pump — although they remain fairly low right now, with the national average below $3 a gallon.
What’s more interesting — and worrying — is that the rollback fits into the administration’s broader war on innovation in the American car and light-duty truck sector.
The United States essentially has two ways to regulate pollution from cars and light trucks: It can limit greenhouse gas emissions from new cars and trucks, and it can require the fuel economy from new vehicles to get a little better every year.
Trump is pulling screws and wires out of both of these systems. In the first category, he’s begun to unwind the Environmental Protection Agency’s limits on carbon pollution from cars and light-duty trucks, which he termed an “EV mandate.” (The Biden-era rules sought to require that about half of new car sales be electric by 2030, although hybrids could help meet that standard.) Trump is also trying to keep the EPA from ever regulating anything to do with carbon pollution again by going after the agency’s “Endangerment Finding” — a scientific assessment that greenhouse gases are dangerous to human wellbeing.
That’s only half of the president’s war on air pollution rules, though. Since the oil crises of the 1970s, the National Highway Traffic Safety Administration has regulated fuel economy for new vehicles under the Corporate Average Fuel Economy, or CAFE, standards. When these rules are binding, the agency can require new cars and trucks sold in the U.S. to get a little more fuel-efficient every year. The idea is that these rules help limit the country’s gasoline consumption, thus keeping a lid on oil prices and letting the whole economy run more efficiently.
President Trump’s signature tax law, the One Big Beautiful Bill Act, already eliminated the fines that automakers have to pay when they fail to meet the standard. That change, pushed by Senator Ted Cruz of Texas, effectively rendered the regulation toothless. But now Trump is weakening the rules just for good measure. (At the press conference on Wednesday, Cruz stood behind the president — and next to Jim Farley, the CEO of Ford.)
Under the new Trump proposal, automakers would need to achieve only an average of 34.5 miles per gallon in 2031. Under Biden’s proposal, they needed to hit 50 miles per gallon that year.
Those numbers, I should add, are somewhat deceptive — because of how CAFE standards are calculated, the headline number runs 20% to 30% higher than a real-world fuel economy number. In essence, that means the new Trump-era rules will come out to a real-world miles-per-gallon figure in the mid-to-high 20s. That will give automakers ample regulatory room to sell more inefficient, gas-guzzling sport utility vehicles and pickups, which remain more profitable than electric vehicles.
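For a rough sense of the arithmetic, here is a minimal sketch that assumes only the 20% to 30% gap described above (the actual CAFE compliance formula is more involved):

```python
# Rough illustration only: deflating a headline CAFE figure to an
# approximate real-world fuel economy number, assuming the headline
# runs 20% to 30% above real-world performance (the gap described
# above). The actual CAFE compliance formula is more involved.

def approx_real_world_mpg(headline_mpg: float, gap: float) -> float:
    """Deflate a headline CAFE number by a given gap (e.g., 0.25 for 25%)."""
    return headline_mpg / (1 + gap)

for label, headline in [("Trump proposal, 2031", 34.5), ("Biden target, 2031", 50.0)]:
    low = approx_real_world_mpg(headline, 0.30)
    high = approx_real_world_mpg(headline, 0.20)
    print(f"{label}: roughly {low:.0f} to {high:.0f} mpg in the real world")

# Trump proposal, 2031: roughly 27 to 29 mpg in the real world
# Biden target, 2031: roughly 38 to 42 mpg in the real world
```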
Which is not ideal for air pollution or the energy transition. But the real risk for the American automaking industry is not that Ford might churn out a few extra Escapes over the next several years. It’s that the Trump proposal would eliminate the ability of automakers to trade compliance credits to meet the rules. These credit markets — which allow manufacturers of gas guzzlers to redeem themselves by buying credits generated by cleaner cars — have been a valuable revenue source for newer automakers like Tesla, Lucid, and Rivian. The Trump proposal would cut off that revenue — and with it, one of the few remaining ways that automakers are cross-subsidizing EV innovation in the United States.
During his campaign, President Trump said that he wanted the “cleanest air.” That promise is looking as incorrect as his pledge to cut electricity costs in half within a year.
How will America’s largest grid deal with the influx of electricity demand? It has until the end of the year to figure things out.
As America’s largest electricity market was deliberating over how to reform the interconnection of data centers, its independent market monitor threw a regulatory grenade into the mix. Just before the Thanksgiving holiday, the monitor filed a complaint with federal regulators saying that PJM Interconnection, which spans from Washington, D.C. to Ohio, should simply stop connecting new large data centers that it doesn’t have the capacity to serve reliably.
The complaint is just the latest development in a months-long debate involving the electricity market, power producers, utilities, elected officials, environmental activists, and consumer advocates over how to connect the deluge of data centers in PJM’s 13-state territory without further increasing consumer electricity prices.
The system has been pushed into crisis by skyrocketing prices in PJM’s capacity auctions, in which generators get paid to ensure they’re available when demand spikes. Those prices have been fueled by high-octane demand projections, with PJM’s summer peak forecast to jump from 154 gigawatts to 210 gigawatts within a decade. The 2034-35 forecast jumped 17% in just a year.
Over the past two capacity auctions, actual and forecast data center growth has been responsible for over $16.6 billion in new costs, according to PJM’s independent market monitor; by contrast, the previous year’s auction generated a mere $2.2 billion. This has translated directly into higher retail electricity prices, including 20% increases in some parts of PJM’s territory, like New Jersey. It has also generated concerns about the reliability of the whole system.
PJM wants to reform how data centers interconnect before the next capacity auction in June, but its members committee was unable to come to an agreement on a recommendation to PJM’s board during a November meeting. There were a dozen proposals, including one from the monitor; like all the others, it failed to garner the necessary two-thirds majority vote to be adopted formally.
So the monitor took its ideas straight to the top.
The market monitor’s complaint to the Federal Energy Regulatory Commission tracks closely with its plan at the November meeting. “PJM is currently proposing to allow the interconnection of large new data center loads that it cannot serve reliably and that will require load curtailments (black outs) of the data centers or of other customers at times. That result is not consistent with the basic responsibility of PJM to maintain a reliable grid and is therefore not just and reasonable,” the filing said. “Interconnecting large new data center loads when adequate capacity is not available is not providing reliable service.”
A PJM spokesperson told me, “We are still reviewing the complaint and will reserve comment at this time.”
But can its board still get a plan to FERC and avoid another blowout capacity auction?
“PJM is going to make a filing in December, no matter what. They have to get these rules in place to get to that next capacity auction in June,” Jon Gordon, policy director at Advanced Energy United, told me. “That’s what this has been about from the get-go. Nothing is going to stop PJM from filing something.”
The PJM spokesperson confirmed to me that “the board intends to act on large load additions to the system and is expected to provide an indication of its next steps over the next few weeks.” But especially after the membership’s failure to make a unified recommendation, what that proposal will be remains unclear. That has been a source of agita for the organization’s many stakeholders.
“The absence of an affirmative advisory recommendation from the Members Committee creates uncertainty as to what reforms PJM’s Board of Managers may submit to the Federal Energy Regulatory Commission (FERC), and when stakeholders can expect that submission,” analysts at ClearView Energy Partners wrote in a note to clients. In spite of PJM’s commitments, they warned that the process could “slip into January,” which would give FERC just enough time to process the submission before the next capacity auction.
One idea did attract a majority vote from PJM’s membership: Southern Maryland Electric Cooperative’s, which largely echoed the PJM board’s own plan with some amendments. That suggestion called for a “Price Responsive Demand” system, in which electricity customers would agree to reduce their usage when wholesale prices spike. The system would be voluntary, unlike an earlier PJM proposal, which foresaw forcing large customers to curtail their power. “The load elects to not take on a capacity obligation, therefore does not pay for capacity, and is required to reduce demand during stressed system conditions,” PJM explained in an update. The Southern Maryland plan tweaks the PRD system to adjust its pricing mechanism, but largely aligns with what PJM’s staff put forward.
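To make that trade-off concrete, here is a minimal toy sketch in Python, under loose assumptions rather than PJM’s actual tariff rules: a PRD participant forgoes capacity payments and charges, and in exchange must curtail a committed amount whenever the system is stressed or prices spike past its trigger level.

```python
# Toy sketch only, not PJM's actual tariff logic: it illustrates the
# Price Responsive Demand trade-off described above. A participating
# load takes on no capacity obligation (and pays no capacity charges),
# but must reduce demand when the system is stressed or when wholesale
# prices cross the level at which it committed to respond.

from dataclasses import dataclass

@dataclass
class PRDLoad:
    name: str
    trigger_price: float           # $/MWh level at which the load committed to respond
    committed_reduction_mw: float  # demand reduction it promised to deliver

    def required_curtailment_mw(self, wholesale_price: float, emergency: bool) -> float:
        """How much load must be shed in this interval under the toy rule."""
        if emergency or wholesale_price >= self.trigger_price:
            return self.committed_reduction_mw
        return 0.0

# Hypothetical example: a large data center enrolled as PRD.
dc = PRDLoad("hypothetical data center", trigger_price=1000.0, committed_reduction_mw=300.0)
print(dc.required_curtailment_mw(wholesale_price=45.0, emergency=False))    # 0.0, normal conditions
print(dc.required_curtailment_mw(wholesale_price=1800.0, emergency=False))  # 300.0, price spike
```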
“There’s almost no real difference between the PJM proposal and that Southern Maryland proposal,” Gordon told me.
That might please restive stakeholders, or at least be something PJM’s board could go forward with knowing that the balance of its voting membership agreed with something similar.
“We maintain our view that a final proposal could resemble the proposed solution package from PJM staff,” the ClearView note said. “We also think the Board could propose reforms to PJM’s PRD program. Indeed, as noted above, SMECO’s revisions to the service gained majority support.”
The PJM plan also included relatively uncontroversial reforms to load forecasting to cut down on duplicated requests and better share information, and an “expedited interconnection track” on which new, large-scale generation could be fast-tracked if it were signed off on by a state government “to expedite consideration of permitting and siting.”
Gordon said that the market monitor’s complaint could be read as the organization “desperately trying to get FERC to weigh in” on its side, even if PJM is more likely to go with something like its own staff-authored submission.
“The key aspect of the market monitor’s proposal was that PJM should not allow a data center to interconnect until there was enough generation to supply them,” Gordon explained. During the meeting preceding the vote, “PJM said they didn’t think they had the authority to deny someone interconnection.”
This dispute over whether the electricity system has an obligation to serve all customers has been the existential question making the debate about how to serve data centers extra angsty.
But PJM looks to be trying to sidestep that big question and nibble around the edges of reform.
“Everybody is really conflicted here,” Gordon told me. “They’re all about protecting consumers. They don’t want to see any more increases, obviously, and they want to keep the lights on. Of course, they also want data center developers in their states. It’s really hard to have all three.”