The rapid increase in demand for artificial intelligence is creating a seemingly vexing national dilemma: How can we meet the vast energy demands of a breakthrough industry without compromising our energy goals?
If that challenge sounds familiar, that’s because it is. The U.S. has a long history of rising to the electricity demands of innovative new industries. Our energy needs grew far more quickly in the four decades following World War II than anything we are facing today. More recently, we have squared off against the energy requirements of new clean technologies that require significant energy to produce — most notably hydrogen.
[Chart courtesy of Rhodium Group]
The lesson we have learned time and again is that it is possible to scale technological innovation in a way that also scales energy innovation. Rather than accepting a zero-sum trade-off between innovation and our clean energy goals, we should focus on policies that leverage the growth of AI to scale the growth of clean energy.
At the core of this approach is the concept of additionality: Companies operating massive data centers — often referred to as “hyperscalers” — as well as utilities should have incentives to bring online new, additional clean energy to power new computing needs. That way, we leverage demand in one sector to scale up another. We drive innovation in key sectors that are critical to our nation’s competitiveness, we reward market leaders who are already moving in this direction with a stable, long-term regulatory framework for growth, and we stay on track to meet our nation’s climate commitments.
All of this is possible, but only if we take bold action now.
AI technologies have the potential to significantly boost America’s economic productivity and enhance our national security. AI also has the potential to accelerate the energy transition itself, from optimizing the electricity grid, to improving weather forecasting, to accelerating the discovery of new chemicals and materials that reduce reliance on fossil fuels. Powering AI, however, is itself incredibly energy intensive. Projections suggest that data centers could consume 9% of U.S. electricity generation by 2030, up from 4% today. Without a national policy response, this surge in energy demand risks increasing our long-term reliance on fossil fuels. By some estimates, around 20 gigawatts of additional natural gas generating capacity will come online by 2030, and coal plant retirements are already being delayed.
Avoiding this outcome will require creative focus on additionality. Hydrogen represents a particularly relevant case study here. It, too, is energy-intensive to produce — a single kilogram of hydrogen requires roughly double the average household’s daily electricity consumption. And while hydrogen holds great promise to decarbonize parts of our economy, it is not per se good for our clean energy goals. Indeed, today’s fossil fuel-driven methods of hydrogen production generate more emissions than the entire aviation sector. While we can make zero-emissions hydrogen by using clean electricity to split hydrogen from water, the source of that electricity matters a lot. Similar to data centers, if the power for hydrogen production comes from the existing electricity grid, then ramping up electrolytic production of hydrogen could significantly increase emissions by growing overall energy demand without cleaning the energy mix.
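To see where that comparison comes from, here is a back-of-the-envelope check (the figures are commonly cited approximations, not numbers from this article: about 50 to 55 kilowatt-hours of electricity per kilogram of electrolytic hydrogen, and about 29 kilowatt-hours per day for the average U.S. household):

$$
\frac{\approx 52\ \mathrm{kWh\ per\ kg\ H_2}}{\approx 29\ \mathrm{kWh\ per\ household\ per\ day}} \approx 1.8,
$$

or roughly double a typical household’s daily use.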
This challenge led to the development of an “additionality” framework for hydrogen. The Inflation Reduction Act offers generous subsidies to hydrogen producers, but to qualify, they must match their electricity consumption with additional (read: newly built) clean energy generation close enough to them that they can actually use it.
This approach, which is being refined in proposed guidance from the U.S. Treasury Department, is designed to make sure that hydrogen’s energy demand becomes a catalyst for investment in new clean electricity generation and decarbonization technologies. Industry leaders are already responding, stating their readiness to build over 50 gigawatts of clean electrolyzer projects because of the long-term certainty this framework provides.
While the scale and technology requirements are different, meeting AI’s energy needs presents a similar challenge. Powering data centers from the existing electricity grid mix means that more demand will create more emissions; even when data centers are drawing on clean electricity, if that energy is being diverted from existing sources rather than coming from new, additional clean electricity supply, the result is the same. Amazon’s recent $650 million investment in a data center campus next to an existing nuclear power plant in Pennsylvania illustrates the challenge: Diverting those clean electrons from Pennsylvania homes and businesses to the data center reduces Amazon’s reported emissions, but because it increases demand on the grid without building additional clean capacity, it creates a need for new capacity in the region that will likely be met by fossil fuels (while also shifting up to $140 million of additional costs per year onto local customers).
Neither hyperscalers nor utilities should be expected to resolve this complex tension on their own. As with hydrogen, it is in our national interest to find a path forward.
What we need, then, is a national solution to make sure that as we expand our AI capabilities, we also bring new clean energy online, strengthening our competitive position in both industries and forestalling the economic and ecological consequences of higher electricity prices and higher carbon emissions.
In short, we should adopt a National AI Additionality Framework.
Under this framework, for any significant data center project, companies would need to show how they are securing new, additional clean power from a zero-emissions generation source. They could do this either by building new “behind-the-meter” clean energy to power their operations directly, or by partnering with a utility to pay a specified rate to secure new grid-connected clean energy coming online.
If companies are unwilling or unable to secure dedicated additional clean energy capacity, they would pay a fee into a clean deployment fund at the Department of Energy that would go toward high-value investments to expand clean electricity capacity. These could range from research and deployment incentives for so-called “clean firm” electricity generation technologies like nuclear and geothermal, to investments in transmission capacity in highly congested areas, to expanding manufacturing capacity for supply-constrained electrical grid equipment like transformers, to cleaning up rural electric cooperatives that serve areas attractive to data centers. Given the variance in grid and transmission issues, the fund would explicitly approach its investment with a regional lens.
Several states operate similar systems: Under Massachusetts’ Renewable Portfolio Standard, utilities are required to supply a certain percentage of the electricity they sell from clean energy facilities or pay an “alternative compliance payment” for every megawatt-hour they are short of their obligation. Dollars collected from these payments go toward the development and expansion of clean energy projects and infrastructure in the state. Facing increasing capacity constraints on the PJM grid, Pennsylvania legislators are now exploring a state Baseload Energy Development Fund to provide low-interest grants and loans for new electricity generation facilities.
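As a rough sketch of how that kind of compliance mechanism works in practice (the obligation share and payment rate below are illustrative placeholders, not Massachusetts’ actual RPS parameters):

```python
# Illustrative RPS-style compliance calculation. The parameters are
# placeholders, not actual Massachusetts RPS figures.

def alternative_compliance_payment(load_mwh: float,
                                   clean_mwh_procured: float,
                                   obligation_share: float,
                                   acp_rate_per_mwh: float) -> float:
    """Payment owed for every clean megawatt-hour a supplier falls short."""
    obligation_mwh = load_mwh * obligation_share             # clean MWh required
    shortfall_mwh = max(0.0, obligation_mwh - clean_mwh_procured)
    return shortfall_mwh * acp_rate_per_mwh                   # dollars owed for the gap

# Hypothetical supplier: 10 TWh of retail sales, a 60% clean obligation,
# 5.5 TWh of qualifying clean energy procured, and a $60/MWh placeholder rate.
print(alternative_compliance_payment(10_000_000, 5_500_000, 0.60, 60.0))
# -> 30000000.0, i.e. a $30 million payment that flows into clean energy programs
```

A national AI additionality fee would work on the same logic: fall short of dedicated new clean supply, and the shortfall is priced and paid into a deployment fund.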
A national additionality framework should not only challenge the industry to scale innovation in a way that scales clean technology, it must also clear pathways to build clean energy at scale. We should establish a dedicated fast-track approval process to move these clean energy projects through federal, state, and local permitting and siting on an accelerated basis. This will help companies already investing in additional clean energy to move faster and more effectively – and make it more difficult for anyone to hide behind the excuse that building new clean energy capacity is too hard or too slow. Likewise, under this framework, utilities that stand in the way of progress should be held accountable and incentivized to adopt innovative new technologies and business models that enable them to move at historic speed.
For hyperscalers committed to net-zero goals, this national approach provides both an opportunity and a level playing field — an opportunity to deliver on those commitments in a genuine way, and a reliable long-term framework that will reward their investments to make that happen. This approach would also build public trust in corporate climate accountability and diminish the risk that those building data centers in the U.S. will be accused of greenwashing or of shifting the cost of development onto ratepayers and communities. The policy clarity of an additionality requirement can also encourage cutting-edge artificial intelligence technology to be built here in the United States. Moreover, it is a model that can be extended to address other sectors facing growing energy demand.
The good news is that many industry players are already moving in this direction. A new agreement between Google and a Nevada utility, for example, would allow Google to pay a higher rate for 24/7 clean electricity from a new geothermal project. In the Carolinas, Duke Energy announced its intent to explore a new clean tariff to support carbon-free energy generation for large customers like Google and Microsoft.
A national framework that builds on this progress is critical, though it will not be easy; it will require quick congressional action, executive leadership, and new models of state and local partnership. But we have a unique opportunity to build a strange-bedfellows coalition to get it done – across big tech, climate tech, environmentalists, permitting reform advocates, and those invested in America’s national security and technology leadership. Together, we can use this framework to turn a vexing trade-off into an opportunity. We can ensure that the hundreds of billions of dollars invested in building an industry of the future actually accelerate the energy transition, all while strengthening the U.S.’s position in innovating cutting-edge AI and clean energy technology.
Kettle offers parametric insurance and says that it can cover just about any home — as long as the owner can afford the premium.
Los Angeles is on fire, and it’s possible that much of the city could burn to the ground. This would be a disaster for California’s already wobbly home insurance market and the residents who rely on it. Kettle Insurance, a fintech startup focused on wildfire insurance for Californians, thinks that it can offer a better solution.
The company, founded in 2020, has thousands of customers across California, and L.A. County is its largest market. These huge fires will, in some sense, “be a good test, not just for the industry, but for the Kettle model,” Brian Espie, the company’s chief underwriting officer, told me. What it’s offering is known as “parametric” insurance and reinsurance (essentially insurance for the insurers themselves). While traditional insurance claims can take years to fully resolve — as some victims of the devastating 2018 Camp Fire know all too well — Kettle gives policyholders 60 days to submit a notice of loss, after which the company has 15 days to validate the claim and issue payment. There is no deductible.
As Espie explained, Kettle’s AI-powered risk assessment model can make more accurate and granular calculations, taking into account forward-looking, climate change-fueled challenges such as out-of-the-norm weather events, which couldn’t be predicted by looking at past weather patterns alone (e.g., wildfires in January, when L.A. is historically wet). Traditionally, California insurers have only been able to rely upon historical datasets to set their premiums, though that rule changed last year and never applied to parametric insurers in the first place.
“We’ve got about 70 different inputs from global satellite data and real estate ground level datasets that are combining to predict wildfire ignition and spread, and then also structural vulnerability,” Espie told me. “In total, we’re pulling from about 130 terabytes of data and then simulating millions of fires — so using technology that, frankly, wouldn’t have been possible 10 or maybe five years ago, because either the data didn’t exist, or it just wasn’t computationally possible to run a model like we are today.”
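In spirit, simulation-based pricing of this kind boils down to running many possible fire years for a property and setting a premium from the expected loss. The toy sketch below illustrates only that general structure; the ignition probability, damage distribution, and expense loading are invented placeholders, and none of it reflects Kettle’s actual model or data.

```python
import random

def simulated_annual_loss(ignition_prob: float, policy_limit: float) -> float:
    """One simulated year for a single property: no fire, or a fire with some damage."""
    if random.random() > ignition_prob:
        return 0.0
    # Structure fires tend toward total loss, so skew damage toward the policy limit.
    damage_fraction = random.betavariate(5, 1)
    return damage_fraction * policy_limit

def indicative_premium(ignition_prob: float, policy_limit: float,
                       n_years: int = 1_000_000, loading: float = 1.5) -> float:
    """Average simulated loss, grossed up by a placeholder expense/risk loading."""
    total_loss = sum(simulated_annual_loss(ignition_prob, policy_limit)
                     for _ in range(n_years))
    return (total_loss / n_years) * loading

# Hypothetical parcel: a 1-in-200 annual ignition chance and a $1 million limit.
print(f"${indicative_premium(0.005, 1_000_000):,.0f} per year")
```

The hard part, of course, is not the simulation loop but estimating the ignition and spread probabilities, which is where the satellite and ground-level data come in.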
As of writing, it’s estimated that more than 2,000 structures have burned in Los Angeles. Whenever a fire encroaches on a parcel of Kettle-insured land, the owner immediately qualifies for a payout. Unlike most other parametric insurance plans, which pay a predetermined amount based on metrics such as the water level during a flood or the temperature during a heat wave regardless of damages, Kettle does require policyholders to submit damage estimates. The company told me that’s usually pretty simple: If a house burns, it’s almost certain that the losses will be equivalent to or exceed the policy limit, which can be up to $10 million. While the company can always audit a property to prevent insurance fraud, there are no claims adjusters or other third parties involved, thus expediting the process and eliminating much of the back-and-forth wrangling residents often go through with their insurance companies.
So how can Kettle afford to do all this while other insurers are exiting the California market altogether or pulling back in fire-prone regions? “We like to say that we can put a price on anything with our model,” Espie told me. “But I will say there are parts of the state that our model sees as burning every 10 to 15 years, and premiums may be just practically too expensive for insurance in those areas.” Kettle could also be an option for homeowners whose existing insurance comes with a very high wildfire deductible, Espie explained, as buying Kettle’s no-deductible plan in addition to their regular plan could actually save them money were a fire to occur.
But just because an area has traditionally been considered risky doesn’t mean that Kettle’s premiums will necessarily be exorbitant. The company’s CEO, Isaac Espinoza, told me that Kettle’s advanced modeling allows it to drill down on the risk to specific properties rather than just general regions. “We view ourselves as insuring the uninsurable,” Espinoza said. “Other insurers just blanket say, we don’t want to touch it. We don’t touch anything in the area. We might say, ‘Hey, that’s not too bad.’”
Espie told me that the wildly destructive fires in 2017 and 2018 “gave people a wake-up call that maybe some of the traditional catastrophe models out there just weren’t keeping up with science and natural hazards in the face of climate change.” He thinks these latest blazes could represent a similar turning point for the industry. “This provides an opportunity for us to prove out that models built with AI and machine learning like ours can be more predictive of wildfire risk in the changing climate, where we’re getting 100-mile-per-hour winds in January.”
Everyone knows the story of Mrs. O’Leary’s cow, the one that allegedly knocked over a lantern in 1871 and burned down 2,100 acres of downtown Chicago. While the wildfires raging in Los Angeles County have already far exceeded that legendary bovine’s total attributed damage — at the time of this writing, on Thursday morning, five fires have burned more than 27,000 acres — the losses had been concentrated, at least initially, in the secluded neighborhoods and idyllic suburbs in the hills above the city.
On Wednesday, that started to change. Evacuation maps have since extended into the gridded streets of downtown Santa Monica and Pasadena, and a new fire has started north of Beverly Hills, moving quickly toward an internationally recognizable street: Hollywood Boulevard. The two biggest fires, Palisades and Eaton, remain 0% contained, and high winds have stymied firefighting efforts, all leading to an exceedingly grim question: Exactly how much of Los Angeles could burn? Could all of it?
“I hate to be doom and gloom, but if those winds kept up … it’s not unfathomable to think that the fires would continue to push into L.A. — into the city,” Riva Duncan, a former wildland firefighter and fire management specialist who now serves as the executive secretary of Grassroots Wildland Firefighters, an advocacy group, told me.
When a fire is burning in the chaparral of the hills, it’s one thing. But once a big fire catches in a neighborhood, it’s a different story. Houses, with their wood frames, gas lines, and cheap modern furniture, might as well be Duraflame. Embers from one burning house then leap to the next and alight in a clogged gutter or on shrubs planted too close to vinyl siding. “That’s what happened with the Great Chicago Fire. When the winds push fires like that, it’s pushing the embers from one house to the others,” Duncan said. “It’s a really horrible situation, but it’s not unfathomable to think about that [happening in L.A.] — but people need to be thinking about that, and I know the firefighters are thinking about that.”
Once flames engulf a block, the fire will “overpower” the capabilities of firefighters, Arnaud Trouvé, the chair of the Department of Fire Protection Engineering at the University of Maryland, told me in an email. If firefighters can’t gain a foothold, the fire will continue to spread “until a change in driving conditions,” such as the winds weakening to the point that the fire isn’t igniting new fuel, or its fuel source running out entirely when it reaches something like an expansive parking lot or the ocean.
This waiting game sometimes leads to the impression that firefighters are standing around, not doing anything. But “what I know they’re doing is they’re looking ahead to places where maybe there’s a park, or some kind of green space, or a shopping center with big parking lots — they’re looking for those places where they could make a stand,” Duncan told me. If an entire city block is already on fire, “they’re not going to waste precious water there.”
Urban firefighting is a different beast than wildland firefighting, but Duncan noted that Forest Service, Cal Fire, and L.A. County firefighters are used to complex mixed environments. “This is their backyard, and they know how to fight fire there.”
“I can guarantee you, many of them haven’t slept 48 hours,” she went on. “They’re grabbing food where they can; they’re taking 15-minute naps. They’re in this really horrible smoke — there are toxins that come off burning vehicles and burning homes, and wildland firefighters don’t wear breathing apparatus to protect the airways. I know they all have horrible headaches right now and are puking. I remember those days.”
If there’s a sliver of good news, it’s that the biggest fire, Palisades, can’t burn any further to the west, the direction the wind is blowing — there lies the ocean — meaning its spread south into Santa Monica toward Venice and Culver City or Beverly Hills is slower than it would be if the winds shifted. The westward-moving Santa Ana winds, however, could conceivably fan the Eaton fire deeper into eastern Los Angeles if conditions don’t let up soon. “In many open fires, the most important factor is the wind,” Trouvé explained, “and the fire will continue spreading until the wind speed becomes moderate-to-low.”
Though the wind died down a bit on Wednesday night, conditions are expected to deteriorate again Thursday evening, and the red flag warning won’t expire until Friday. And “there are additional winds coming next week,” Kristen Allison, a fire management specialist with the Southern California Geographic Area Coordination Center, told me Wednesday. “It’s going to be a long duration — and we’re not seeing any rain anytime soon.”
Editor’s note: Firefighting crews made “big gains” overnight against the Sunset fire, which threatened famous landmarks like the TCL Chinese Theatre and the Dolby Theatre, which will host the Academy Awards in March. Most of the mandatory evacuation notices remaining in Hollywood on Thursday morning were precautionary, the Los Angeles Times reported. Meanwhile, the Palisades and Eaton fires have burned a combined 27,834 acres, destroyed 2,000 structures, killed at least five people, and remain unchecked as the winds pick up again. This piece was last updated on January 9 at 10:30 a.m. ET.
On greenhouse gases, LA’s fires, and the growing costs of natural disasters
Current conditions: Winter storm Cora is expected to disrupt more than 5,000 U.S. flights • Britain’s grid operator is asking power plants for more electricity as temperatures plummet • Parts of Australia could reach 120 degrees Fahrenheit in the coming days because the monsoon, which usually appears sometime in December, has yet to show up.
The fire emergency in Los Angeles continues this morning, with at least five blazes raging in different parts of the nation’s second-most-populous city. The largest, known as the Palisades fire, has charred more than 17,000 acres near Malibu and is now the most destructive fire in the county’s history. The Eaton fire near Altadena and Pasadena has grown to 10,600 acres. Both are 0% contained. Another fire ignited in Hollywood but is reportedly being contained. At least five people have died, more than 2,000 structures have been destroyed or damaged, 130,000 people are under evacuation warnings, and more than 300,000 customers are without power. Wind speeds have come down from the 100 mph gusts reported yesterday, but “high winds and low relative humidity will continue critical fire weather conditions in southern California through Friday,” the National Weather Service said.
As the scale of this disaster comes into focus, the finger-pointing has begun. President-elect Donald Trump blamed California Gov. Gavin Newsom, suggesting his wildlife protections have restricted the city’s water access. Many people slammed the city’s mayor for cutting the fire budget. Some suspect power lines are the source of the blazes, implicating major utility companies. And of course, underlying it all, is human-caused climate change, which researchers warn is increasing the frequency and severity of wildfires. “The big culprit we’re suspecting is a warming climate that’s making it easier to burn fuels when conditions are just right,” said University of Colorado fire scientist Jennifer Balch.
America’s greenhouse gas emissions were down in 2024 compared to 2023, but not by much, according to the Rhodium Group’s annual report, released this morning. The preliminary estimates suggest emissions fell by just 0.2% last year. In other words, they were basically flat. That’s good news in the sense that emissions didn’t rise, even as the economy grew by an estimated 2.7%. But it’s also a little worrying given that in 2023, emissions dropped by 3.3%.
[Chart: Rhodium Group, EPA]
The transportation, power, and buildings sectors all saw upticks in emissions last year. But there are some bright spots in the report. Emissions fell across the industrial sector (down 1.8%) and oil and gas sector (down 3.7%). Solar and wind power generation surpassed coal for the first time, and coal production fell by 12% to its lowest level in decades, resulting in fewer industrial methane emissions. Still, “the modest 2024 decline underscores the urgency of accelerating decarbonization in all sectors,” Rhodium’s report concluded. “To meet its Paris Agreement target of a 50-52% reduction in emissions by 2030, the U.S. must sustain an ambitious 7.6% annual drop in emissions from 2025 to 2030, a level the U.S. has not seen outside of a recession in recent memory.”
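As a quick check on that arithmetic (the starting point of roughly 20% below 2005 levels in 2024 is my assumption, in line with Rhodium’s published estimates, not a number stated above): six straight annual declines of 7.6% compound to

$$
(1 - 0.076)^{6} \approx 0.62, \qquad 0.80 \times 0.62 \approx 0.50,
$$

which would leave 2030 emissions at roughly half their 2005 level, the bottom of the 50-52% target range.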
Insured losses from natural disasters topped $140 billion last year, up significantly from $106 billion in 2023, according to Munich Re, the world’s largest reinsurer. That makes 2024 the third most expensive year in terms of insured losses since 1980. Weather disasters, and especially major U.S. hurricanes, accounted for a large chunk ($47 billion) of these costs: Hurricanes Helene and Milton were the most devastating natural disasters of 2024. “Climate change is taking the gloves off,” the company said. “Hardly any other year has made the consequences of global warming so clear.”
[Chart: Munich Re]
A new study found that a quarter of all the world’s freshwater animals are facing a high risk of extinction due to pollution, farming, and dams. The research, published in the journal Nature, explained that freshwater sources – like rivers, lakes, marshes, and swamps – support over 10% of all known species, including fish, shrimps, and frogs. All these creatures provide “essential ecosystem services,” including climate change mitigation and flood control. The report assessed some 23,000 species and found about 24% of them were at high risk of extinction. The researchers said there “is urgency to act quickly to address threats to prevent further species declines and losses.”
A recent oil and gas lease sale in Alaska’s Arctic National Wildlife Refuge got zero bids, the Interior Department announced yesterday. This was the second sale – mandated by Congress under the 2017 Tax Act – to generate little interest. “The lack of interest from oil companies in development in the Arctic National Wildlife Refuge reflects what we and they have known all along – there are some places too special and sacred to put at risk with oil and gas drilling,” said Acting Deputy Secretary Laura Daniel-Davis. President-elect Donald Trump has promised to open more drilling in the refuge, calling it “the biggest find anywhere in the world, as big as Saudi Arabia.”
“Like it or not, addressing climate change requires the help of the wealthy – not just a small number of megadonors to environmental organizations, but the rich as a class. The more they understand that their money will not insulate them from the effects of a warming planet, the more likely they are to be allies in the climate fight, and vital ones at that.” –Paul Waldman writing for Heatmap