Ask any climate wonk what’s holding back clean energy in the U.S. and you’re likely to get the same answer — not enough power lines. But what if the problem isn’t the number of power lines, but rather the outdated metal wires they’re made of?
Restringing transmission lines with more advanced wires, a process known as “reconductoring,” has the potential to double the amount of electricity our existing transmission system can handle, for less than half the price of building new lines. That’s the main finding of a recently published working paper from researchers at the University of California, Berkeley, and Gridlab, an energy consulting firm.
There are a few reasons that something as boring and seemingly ubiquitous as power lines is so crucial to the energy transition. Electrifying our cars and homes will increase demand for electricity, and much of the system is already too congested to integrate new wind and solar power plants. Plus, there just aren’t enough lines that run from the sunniest, windiest places to the places where most people actually live.
To realize the emission reduction potential of the clean energy subsidies in the Inflation Reduction Act, we have to more than double the rate of transmission expansion, according to research from Princeton University’s REPEAT Project. Clean energy projects already face major delays and are often hit with exorbitant bills to connect to the grid. A study from Lawrence Berkeley National Laboratory called “Queued Up” found that at the end of 2022, there were more than 10,000 power plant and energy storage projects waiting for permission to connect to the grid — enough to double electricity production in the country. Some 95% of them were zero-carbon resources.
The main problem is permitting. Establishing rights-of-way for new power lines requires extensive environmental review and invites vicious local opposition. People don’t want to look at more wires strung across the landscape. They worry the eyesore will decrease their property value, or that the construction will hurt local ecosystems. New power lines often take upwards of 10 years to plan, permit, and build.
But it’s possible to avoid this time-consuming process, at least in many cases, by simply reconductoring lines along existing rights-of-way. Most of our existing power lines have a steel core surrounded by strands of aluminum. Advanced conductors replace the steel with a lighter but stronger core made of a composite material, such as carbon fiber. This subtle shift in materials and design enables the line to operate at higher temperatures, with less sag, significantly increasing the amount of power it can carry.
Advanced conductors cost two to four times more than conventional power lines — but upgrading an existing line to use advanced conductors can be less than half what a new power line would cost because it eliminates much of the construction spending and fees from permitting for new rights-of-way, the Berkeley study found.
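The arithmetic behind that finding is easy to sketch. The cost breakdown below is purely illustrative (the dollar figures are invented, not taken from the Berkeley study), but it shows how a conductor that costs three times as much can still cut the total bill by more than half when structures, land acquisition, and permitting dominate a new line's cost:

```python
# Illustrative arithmetic only: these per-mile dollar figures are
# hypothetical, not from the Berkeley/GridLab working paper.

def project_cost(conductor, structures, row_and_permitting):
    """Total transmission project cost per mile, in dollars."""
    return conductor + structures + row_and_permitting

# New line: conventional conductor plus new towers, land, and permitting.
new_line = project_cost(
    conductor=100_000, structures=800_000, row_and_permitting=1_000_000
)

# Reconductoring: pricier advanced conductor (here 3x the conventional
# wire), existing towers largely reused, no new right-of-way to permit.
reconductor = project_cost(
    conductor=300_000, structures=200_000, row_and_permitting=0
)

print(new_line, reconductor, reconductor / new_line)
```

With these made-up inputs, the reconductoring project comes in well under half the cost of the new line even though its wire costs triple, because the wire itself is a small slice of the total.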
“The most compelling, exciting thing is that it only requires a maintenance permit,” Duncan Callaway, an associate professor of energy and resources at Berkeley and one of the paper’s authors, said while presenting the research over Zoom last week.
The paper highlights a 2016 project in southeastern Texas. Due to rapid population growth in the area, the local utility, American Electric Power, was seeing higher demand for electricity at peak times than it was prepared for, leading to blackouts. It needed to come up with a solution, fast, and decided that reconductoring 240 miles of its transmission lines would take less time than permitting new ones. The project ended up finishing ahead of schedule and under budget, at a cost of $900,000 per mile. By comparison, the 3,600 miles of new lines built under Texas’ Competitive Renewable Energy Zone program, which were built to connect wind-rich areas to population centers, cost more than double, at an average of $1.9 million per mile.
Callaway and his co-authors also plugged their findings into a power system expansion model — basically a computer program that maps out the most cost-effective mix of technologies to meet regional electric power demand. They fed the model a scenario where the only option for transmission was to build new lines at their slow, historical rate, as well as a scenario where there was also an option to reconductor along existing rights-of-way. The second scenario resulted in nearly four times as much transmission capacity by 2035, enabling the country to achieve a more than 90% clean electric grid by that date.
There are cases where new power lines are needed — for example, to establish a new route to access a high-quality renewable resource, Emilia Chojkiewicz, another author of the study, told me in an email. But she said it nearly always makes sense to consider reconductoring given the potential to double capacity and do so much more quickly. “Unfortunately,” she added, “current transmission planning practices do not tend to incentivize or even consider reconductoring.”
This all seems so ridiculously easy that it raises the question: Why aren’t utilities already rushing to do it? During the webinar last week, Chojkiewicz and her co-authors said part of the problem is just a lack of awareness of and comfort with the technology. But the bigger issue is that utilities are not incentivized to look for cheaper, more efficient solutions like reconductoring because they profit off capital spending.
To change this, they suggested that the Federal Energy Regulatory Commission, which oversees interstate transmission, and state public service commissions, which regulate utilities at the state level, mandate the consideration of reconductoring in transmission and resource planning processes and require that the benefits advanced conductors provide be properly valued. The Department of Energy could also consider instituting a national conductor efficiency standard, so that all new wires installed, whether along existing rights-of-way or new routes, achieve a minimum level of performance.
Reconductoring isn’t the only no-brainer alternative to building new power lines. Another study from the clean energy think tank RMI published last week illustrates the opportunity with even cheaper tweaks called “grid enhancing technologies.” One option is to install sensors that collect data on wind speed, temperature, and other factors that affect power lines in real time, called dynamic line ratings. These sensors allow utilities to safely increase the amount of power transmitted when weather conditions permit it. There are also power flow controls that can redirect power away from congested lines so that it can be transmitted elsewhere rather than wasted.
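To make the dynamic-line-rating idea concrete, here is a deliberately simplified sketch. Real ratings come from detailed conductor thermal models such as the IEEE 738 standard; the function and coefficients below are invented for illustration only:

```python
# Toy illustration of a dynamic line rating. This is NOT the IEEE 738
# thermal model utilities actually use; the coefficients are made up.

def dynamic_rating(static_rating_amps, wind_speed_ms, ambient_temp_c):
    """Scale a line's static rating up when weather cools the conductor."""
    # More wind carries heat away from the wire (capped boost).
    wind_boost = min(wind_speed_ms * 0.05, 0.30)
    # Cooler air leaves more thermal headroom below the sag limit.
    temp_boost = max((25 - ambient_temp_c) * 0.01, 0)
    return static_rating_amps * (1 + wind_boost + temp_boost)

# A breezy, cool day lets the same wire safely carry more current
# than its conservative static rating assumes.
print(dynamic_rating(1000, wind_speed_ms=6, ambient_temp_c=10))
```

The point of the sensors is exactly this: instead of assuming worst-case weather all year, the rating tracks the conditions the line actually experiences.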
RMI found that in the PJM interconnection — a section of the grid in the eastern U.S. that is so congested the grid operator has frozen new applications to connect to it — these grid enhancing technologies could open up more than 6 gigawatts of new capacity for wind, solar, and storage projects in just three years. For reference, nearly 300 gigawatts’ worth of energy projects were waiting for permission to connect in PJM at the end of 2022.
The cost savings are not just theoretical. In 2018, the PJM grid operator determined that a wind farm expansion in Illinois was going to require $100 million of grid upgrades — including building new lines and reconductoring existing ones — over a timeline of about three years before it would be able to connect. The developer countered that the needed upgrades could be achieved through power flow controls, which could be installed for a cost of just $12 million in less than half the time. PJM approved the idea, and the project is currently underway.
Congress is still debating how to reform permitting processes. While that remains a necessary step, it’s becoming increasingly clear that there’s a host of other outside-the-box solutions that can be deployed much more quickly in the meantime. The IRA may have convinced the environmental movement that building new stuff was worth it, but there are still a lot of cases where the smarter choice is to renovate.
Editor’s note: This story has been updated to correct the cost of adding power flow controls to the PJM interconnection.
Kettle offers parametric insurance and says that it can cover just about any home — as long as the owner can afford the premium.
Los Angeles is on fire, and it’s possible that much of the city could burn to the ground. This would be a disaster for California’s already wobbly home insurance market and the residents who rely on it. Kettle Insurance, a fintech startup focused on wildfire insurance for Californians, thinks that it can offer a better solution.
The company, founded in 2020, has thousands of customers across California, and L.A. County is its largest market. These huge fires will, in some sense, “be a good test, not just for the industry, but for the Kettle model,” Brian Espie, the company’s chief underwriting officer, told me. What it’s offering is known as “parametric” insurance and reinsurance (essentially insurance for the insurers themselves). While traditional insurance claims can take years to fully resolve — as some victims of the devastating 2018 Camp Fire know all too well — Kettle gives policyholders 60 days to submit a notice of loss, after which the company has 15 days to validate the claim and issue payment. There is no deductible.
As Espie explained, Kettle’s AI-powered risk assessment model is able to make more accurate and granular calculations, taking into account forward-looking, climate change-fueled challenges such as out-of-the-norm weather events, which couldn’t be predicted by looking at past weather patterns alone (e.g. wildfires in January, when historically L.A. is wet). Traditionally, California insurers have only been able to rely upon historical datasets to set their premiums, though that rule changed last year and never applied to parametric insurers in the first place.
“We’ve got about 70 different inputs from global satellite data and real estate ground level datasets that are combining to predict wildfire ignition and spread, and then also structural vulnerability,” Espie told me. “In total, we’re pulling from about 130 terabytes of data and then simulating millions of fires — so using technology that, frankly, wouldn’t have been possible 10 or maybe five years ago, because either the data didn’t exist, or it just wasn’t computationally possible to run a model like we are today.”
As of writing, it’s estimated that more than 2,000 structures have burned in Los Angeles. Whenever a fire encroaches on a parcel of Kettle-insured land, the owner immediately qualifies for a payout. Unlike most other parametric insurance plans, which pay a predetermined amount based on metrics such as the water level during a flood or the temperature during a heat wave regardless of damages, Kettle does require policyholders to submit damage estimates. The company told me that’s usually pretty simple: If a house burns, it’s almost certain that the losses will be equivalent to or exceed the policy limit, which can be up to $10 million. While the company can always audit a property to prevent insurance fraud, there are no claims adjusters or other third parties involved, thus expediting the process and eliminating much of the back-and-forth wrangling residents often go through with their insurance companies.
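The claims flow described above (a 60-day notice window, a 15-day validation window, no deductible, and losses covered up to the policy limit) can be sketched as simple logic. The "Claim" class and its fields are hypothetical illustrations, not Kettle's actual system:

```python
# Sketch of the claims timeline described in the article. The class,
# field names, and dates are illustrative, not Kettle's real API.
from dataclasses import dataclass
from datetime import date, timedelta

NOTICE_WINDOW = timedelta(days=60)   # policyholder files notice of loss
PAYMENT_WINDOW = timedelta(days=15)  # insurer then validates and pays

@dataclass
class Claim:
    fire_date: date
    notice_date: date
    estimated_loss: float
    policy_limit: float  # up to $10 million, per the article

    def notice_on_time(self) -> bool:
        return self.notice_date <= self.fire_date + NOTICE_WINDOW

    def payout(self) -> float:
        # No deductible: the loss is covered up to the policy limit.
        if not self.notice_on_time():
            return 0.0
        return min(self.estimated_loss, self.policy_limit)

# A total loss on a $3 million policy: payout is capped at the limit.
claim = Claim(date(2025, 1, 7), date(2025, 2, 1), 4_000_000, 3_000_000)
print(claim.payout())
```

The absence of a deductible and of third-party adjusters is what compresses the timeline: once the notice is validated, the payout is a straightforward min of loss and limit.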
So how can Kettle afford to do all this while other insurers are exiting the California market altogether or pulling back in fire-prone regions? “We like to say that we can put a price on anything with our model,” Espie told me. “But I will say there are parts of the state that our model sees as burning every 10 to 15 years, and premiums may be just practically too expensive for insurance in those areas.” Kettle could also be an option for homeowners whose existing insurance comes with a very high wildfire deductible, Espie explained, as buying Kettle’s no-deductible plan in addition to their regular plan could actually save them money were a fire to occur.
But just because an area has traditionally been considered risky doesn’t mean that Kettle’s premiums will necessarily be exorbitant. The company’s CEO, Isaac Espinoza, told me that Kettle’s advanced modeling allows it to drill down on the risk to specific properties rather than just general regions. “We view ourselves as insuring the uninsurable,” Espinoza said. “Other insurers just blanket say, we don’t want to touch it. We don’t touch anything in the area. We might say, ’Hey, that’s not too bad.’”
Espie told me that the wildly destructive fires in 2017 and 2018 “gave people a wake up call that maybe some of the traditional catastrophe models out there just weren’t keeping up with science and natural hazards in the face of climate change.” He thinks these latest blazes could represent a similar turning point for the industry. “This provides an opportunity for us to prove out that models built with AI and machine learning like ours can be more predictive of wildfire risk in the changing climate, where we’re getting 100 mile per hour winds in January.”
Everyone knows the story of Mrs. O’Leary’s cow, the one that allegedly knocked over a lantern in 1871 and burned down 2,100 acres of downtown Chicago. While the wildfires raging in Los Angeles County have already far exceeded that legendary bovine’s total attributed damage — at the time of this writing, on Thursday morning, five fires have burned more than 27,000 acres — the losses were concentrated, at least initially, in the secluded neighborhoods and idyllic suburbs in the hills above the city.
On Wednesday, that started to change. Evacuation maps have since extended into the gridded streets of downtown Santa Monica and Pasadena, and a new fire has started north of Beverly Hills, moving quickly toward an internationally recognizable street: Hollywood Boulevard. The two biggest fires, Palisades and Eaton, remain 0% contained, and high winds have stymied firefighting efforts, all leading to an exceedingly grim question: Exactly how much of Los Angeles could burn? Could all of it?
“I hate to be doom and gloom, but if those winds kept up … it’s not unfathomable to think that the fires would continue to push into L.A. — into the city,” Riva Duncan, a former wildland firefighter and fire management specialist who now serves as the executive secretary of Grassroots Wildland Firefighters, an advocacy group, told me.
When a fire is burning in the chaparral of the hills, it’s one thing. But once a big fire catches in a neighborhood, it’s a different story. Houses, with their wood frames, gas lines, and cheap modern furniture, might as well be Duraflame. Embers from one burning house then leap to the next and alight in a clogged gutter or on shrubs planted too close to vinyl siding. “That’s what happened with the Great Chicago Fire. When the winds push fires like that, it’s pushing the embers from one house to the others,” Duncan said. “It’s a really horrible situation, but it’s not unfathomable to think about that [happening in L.A.] — but people need to be thinking about that, and I know the firefighters are thinking about that.”
Once flames engulf a block, the fire will “overpower” the capabilities of firefighters, Arnaud Trouvé, the chair of the Department of Fire Protection Engineering at the University of Maryland, told me in an email. If firefighters can’t gain a foothold, the fire will continue to spread “until a change in driving conditions,” such as the winds weakening to the point that the fire isn’t igniting new fuel, or the fire running out of fuel entirely when it reaches something like an expansive parking lot or the ocean.
This waiting game sometimes leads to the impression that firefighters are standing around, not doing anything. But “what I know they’re doing is they’re looking ahead to places where maybe there’s a park, or some kind of green space, or a shopping center with big parking lots — they’re looking for those places where they could make a stand,” Duncan told me. If an entire city block is already on fire, “they’re not going to waste precious water there.”
Urban firefighting is a different beast than wildland firefighting, but Duncan noted that Forest Service, CAL FIRE, and L.A. County firefighters are used to complex mixed environments. “This is their backyard, and they know how to fight fire there.”
“I can guarantee you, many of them haven’t slept 48 hours,” she went on. “They’re grabbing food where they can; they’re taking 15-minute naps. They’re in this really horrible smoke — there are toxins that come off burning vehicles and burning homes, and wildland firefighters don’t wear breathing apparatus to protect the airways. I know they all have horrible headaches right now and are puking. I remember those days.”
If there’s a sliver of good news, it’s that the biggest fire, Palisades, can’t burn any further to the west, the direction the wind is blowing — there lies the ocean — meaning its spread south into Santa Monica toward Venice and Culver City or Beverly Hills is slower than it would be if the winds shifted. The westward-moving Santa Ana winds, however, could conceivably fan the Eaton fire deeper into eastern Los Angeles if conditions don’t let up soon. “In many open fires, the most important factor is the wind,” Trouvé explained, “and the fire will continue spreading until the wind speed becomes moderate-to-low.”
Though the wind died down a bit on Wednesday night, conditions are expected to deteriorate again Thursday evening, and the red flag warning won’t expire until Friday. And “there are additional winds coming next week,” Kristen Allison, a fire management specialist with the Southern California Geographic Area Coordination Center, told me Wednesday. “It’s going to be a long duration — and we’re not seeing any rain anytime soon.”
Editor’s note: Firefighting crews made “big gains” overnight against the Sunset fire, which threatened famous landmarks like the TCL Chinese Theatre and the Dolby Theatre, which will host the Academy Awards in March. Most of the mandatory evacuation notices remaining in Hollywood on Thursday morning were out of precaution, the Los Angeles Times reported. Meanwhile, the Palisades and Eaton fires have burned a combined 27,834 acres, destroyed 2,000 structures, killed at least five people, and remain unchecked as the winds pick up again. This piece was last updated on January 9 at 10:30 a.m. ET.
On greenhouse gases, LA’s fires, and the growing costs of natural disasters
Current conditions: Winter storm Cora is expected to disrupt more than 5,000 U.S. flights • Britain’s grid operator is asking power plants for more electricity as temperatures plummet • Parts of Australia could reach 120 degrees Fahrenheit in the coming days because the monsoon, which usually appears sometime in December, has yet to show up.
The fire emergency in Los Angeles continues this morning, with at least five blazes raging in different parts of the nation’s second-most-populous city. The largest, known as the Palisades fire, has charred more than 17,000 acres near Malibu and is now the most destructive fire in the county’s history. The Eaton fire near Altadena and Pasadena has grown to 10,600 acres. Both are 0% contained. Another fire ignited in Hollywood but is reportedly being contained. At least five people have died, more than 2,000 structures have been destroyed or damaged, 130,000 people are under evacuation warnings, and more than 300,000 customers are without power. Wind speeds have come down from the 100 mph gusts reported yesterday, but “high winds and low relative humidity will continue critical fire weather conditions in southern California through Friday,” the National Weather Service said.
As the scale of this disaster comes into focus, the finger-pointing has begun. President-elect Donald Trump blamed California Gov. Gavin Newsom, suggesting his wildlife protections have restricted the city’s water access. Many people slammed the city’s mayor for cutting the fire budget. Some suspect power lines are the source of the blazes, implicating major utility companies. And of course, underlying it all is human-caused climate change, which researchers warn is increasing the frequency and severity of wildfires. “The big culprit we’re suspecting is a warming climate that’s making it easier to burn fuels when conditions are just right,” said University of Colorado fire scientist Jennifer Balch.
America’s greenhouse gas emissions were down in 2024 compared to 2023, but not by much, according to the Rhodium Group’s annual report, released this morning. The preliminary estimates suggest emissions fell by just 0.2% last year. In other words, they were basically flat. That’s good news in the sense that emissions didn’t rise, even as the economy grew by an estimated 2.7%. But it’s also a little worrying given that in 2023, emissions dropped by 3.3%.
Rhodium Group, EPA
The transportation, power, and buildings sectors all saw upticks in emissions last year. But there are some bright spots in the report. Emissions fell across the industrial sector (down 1.8%) and oil and gas sector (down 3.7%). Solar and wind power generation surpassed coal for the first time, and coal production fell by 12% to its lowest level in decades, resulting in fewer industrial methane emissions. Still, “the modest 2024 decline underscores the urgency of accelerating decarbonization in all sectors,” Rhodium’s report concluded. “To meet its Paris Agreement target of a 50-52% reduction in emissions by 2030, the U.S. must sustain an ambitious 7.6% annual drop in emissions from 2025 to 2030, a level the U.S. has not seen outside of a recession in recent memory.”
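Rhodium's 7.6% figure can be sanity-checked with compound-decline arithmetic. The starting point used below (2024 emissions roughly 20% below 2005 levels) is an assumption for illustration; the report's exact baseline may differ slightly:

```python
# Back-of-the-envelope check on the required annual emissions decline.
# Assumption (not from the report's exact tables): 2024 emissions sit
# at about 80% of 2005 levels.
level_2024 = 0.80   # 2024 emissions as a fraction of 2005 emissions
target_2030 = 0.50  # Paris target: 50% below 2005 levels
years = 6           # 2025 through 2030

# Solve level_2024 * (1 - r)**years == target_2030 for the annual rate r.
r = 1 - (target_2030 / level_2024) ** (1 / years)
print(f"required annual decline: {r:.1%}")
```

Under these assumptions the required rate comes out near 7.5% a year, in line with Rhodium's 7.6%, and an order of magnitude faster than 2024's 0.2% dip.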
Insured losses from natural disasters topped $140 billion last year, up significantly from $106 billion in 2023, according to Munich Re, the world’s largest reinsurer. That makes 2024 the third most expensive year in terms of insured losses since 1980. Weather disasters, and especially major U.S. hurricanes, accounted for a large chunk ($47 billion) of these costs: Hurricanes Helene and Milton were the most devastating natural disasters of 2024. “Climate change is taking the gloves off,” the insurer said. “Hardly any other year has made the consequences of global warming so clear.”
Munich Re
A new study found that a quarter of all the world’s freshwater animals are facing a high risk of extinction due to pollution, farming, and dams. The research, published in the journal Nature, explained that freshwater sources – like rivers, lakes, marshes, and swamps – support over 10% of all known species, including fish, shrimps, and frogs. All these creatures support “essential ecosystem services,” including climate change mitigation and flood control. The report studied some 23,000 animals and found about 24% of the species were at high risk of extinction. The researchers said there “is urgency to act quickly to address threats to prevent further species declines and losses.”
A recent oil and gas lease sale in Alaska’s Arctic National Wildlife Refuge got zero bids, the Interior Department announced yesterday. This was the second sale – mandated by Congress under the 2017 Tax Act – to generate little interest. “The lack of interest from oil companies in development in the Arctic National Wildlife Refuge reflects what we and they have known all along – there are some places too special and sacred to put at risk with oil and gas drilling,” said Acting Deputy Secretary Laura Daniel-Davis. President-elect Donald Trump has promised to open more drilling in the refuge, calling it “the biggest find anywhere in the world, as big as Saudi Arabia.”
“Like it or not, addressing climate change requires the help of the wealthy – not just a small number of megadonors to environmental organizations, but the rich as a class. The more they understand that their money will not insulate them from the effects of a warming planet, the more likely they are to be allies in the climate fight, and vital ones at that.” –Paul Waldman writing for Heatmap