Ideas

America Needs an Energy Policy for AI

Additionality isn’t just for hydrogen.

The rapid increase in demand for artificial intelligence is creating a seemingly vexing national dilemma: How can we meet the vast energy demands of a breakthrough industry without compromising our energy goals?

If that challenge sounds familiar, that’s because it is. The U.S. has a long history of rising to the electricity demands of innovative new industries. Our electricity needs grew far more quickly in the four decades following World War II than they are projected to grow today. More recently, we have squared off against the energy requirements of new clean technologies that require significant energy to produce — most notably hydrogen.

[Chart: Electricity Demand Since the 1950s. Courtesy of Rhodium Group]

The lesson we have learned time and again is that it is possible to scale technological innovation in a way that also scales energy innovation. Rather than accepting a zero-sum trade-off between innovation and our clean energy goals, we should focus on policies that leverage the growth of AI to scale the growth of clean energy.

At the core of this approach is the concept of additionality: Companies operating massive data centers — often referred to as “hyperscalers” — as well as utilities should have incentives to bring online new, additional clean energy to power new computing needs. That way, we leverage demand in one sector to scale up another. We drive innovation in key sectors that are critical to our nation’s competitiveness, we reward market leaders who are already moving in this direction with a stable, long-term regulatory framework for growth, and we stay on track to meet our nation’s climate commitments.

All of this is possible, but only if we take bold action now.

AI technologies have the potential to significantly boost America’s economic productivity and enhance our national security. AI also has the potential to accelerate the energy transition itself, from optimizing the electricity grid, to improving weather forecasting, to accelerating the discovery of chemical and materials breakthroughs that reduce reliance on fossil fuels. Powering AI, however, is itself incredibly energy-intensive. Projections suggest that data centers could consume 9% of U.S. electricity generation by 2030, up from 4% today. Without a national policy response, this surge in energy demand risks increasing our long-term reliance on fossil fuels. By some estimates, around 20 gigawatts of additional natural gas generating capacity will come online by 2030, and coal plant retirements are already being delayed.
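To put those percentages in rough perspective, here is a back-of-envelope sketch in Python. The total generation figure and the gas capacity factor are illustrative assumptions, not numbers from the sources above:

```python
# Back-of-envelope check on the data center demand figures cited above.
# Assumed values (not from the article): total U.S. generation of roughly
# 4,200 TWh/year and a ~60% capacity factor for new gas plants.

US_GENERATION_TWH = 4200   # approximate annual U.S. electricity generation

share_today = 0.04         # data centers' share of generation today
share_2030 = 0.09          # projected share by 2030

demand_today = US_GENERATION_TWH * share_today
demand_2030 = US_GENERATION_TWH * share_2030
growth = demand_2030 - demand_today

# 20 GW of new gas capacity at an assumed 60% capacity factor
gas_twh = 20 * 0.60 * 8760 / 1000  # GW * capacity factor * hours/yr -> TWh

print(f"Data center demand today: ~{demand_today:.0f} TWh/yr")
print(f"Projected demand in 2030: ~{demand_2030:.0f} TWh/yr (+{growth:.0f} TWh)")
print(f"20 GW of gas at 60% CF supplies: ~{gas_twh:.0f} TWh/yr")
```

Under those assumptions, new gas alone would cover only about half of the projected growth, which is why the question of where the rest comes from matters so much.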

Avoiding this outcome will require a creative focus on additionality. Hydrogen represents a particularly relevant case study here. It, too, is energy-intensive to produce — a single kilogram of hydrogen requires roughly double the average household’s daily electricity consumption. And while hydrogen holds great promise to decarbonize parts of our economy, it is not per se good for our clean energy goals. Indeed, today’s fossil fuel-driven methods of hydrogen production generate more emissions than the entire aviation sector. We can make zero-emissions hydrogen by using clean electricity to split hydrogen from water, but the source of that electricity matters a lot. As with data centers, if the power for hydrogen production comes from the existing electricity grid, ramping up electrolytic production could significantly increase emissions by growing overall energy demand without cleaning up the energy mix.
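That household comparison is easy to sanity-check, using an assumed ~50 kilowatt-hours of electricity per kilogram of electrolytic hydrogen and roughly 29 kilowatt-hours of average daily U.S. household consumption (both are approximations supplied here for illustration, not figures from the article):

```python
# Rough check of the hydrogen-vs-household comparison. Both constants are
# approximate, illustrative values (not from the article).

KWH_PER_KG_H2 = 50          # typical electricity input for electrolysis, kWh/kg
HOUSEHOLD_KWH_PER_DAY = 29  # approximate average U.S. household daily use

ratio = KWH_PER_KG_H2 / HOUSEHOLD_KWH_PER_DAY
print(f"1 kg of electrolytic hydrogen ~ {ratio:.1f} days of household electricity")
# -> roughly 1.7x, i.e. close to "double" a household's daily consumption
```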

This challenge led to the development of an “additionality” framework for hydrogen. The Inflation Reduction Act offers generous subsidies to hydrogen producers, but to qualify, they must match their electricity consumption with additional (read: newly built) clean energy generation close enough to them that they can actually use it.

This approach, which is being refined in proposed guidance from the U.S. Treasury Department, is designed to make sure that hydrogen’s energy demand becomes a catalyst for investment in new clean electricity generation and decarbonization technologies. Industry leaders are already responding, stating their readiness to build over 50 gigawatts of clean electrolyzer projects because of the long-term certainty this framework provides.
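As a rough illustration, here is a minimal sketch of how a qualification test along these lines could be structured. The three-part shape (newly built generation, deliverability, matched consumption) follows the framework described above, but every field name, function, and threshold is a hypothetical assumption, not the actual Treasury rules:

```python
from dataclasses import dataclass
from datetime import date

# Minimal sketch of an additionality qualification check, loosely modeled on
# the hydrogen framework described above. All names, fields, and thresholds
# are illustrative assumptions.

@dataclass
class CleanPowerContract:
    generator_online: date   # when the clean generator entered service
    same_region: bool        # power is deliverable to the consuming facility
    matched_mwh: float       # clean MWh procured in the compliance period

def qualifies_as_additional(contract: CleanPowerContract,
                            facility_start: date,
                            consumed_mwh: float) -> bool:
    # "Additional": the generator is newly built (here, no more than three
    # years before the facility began operating -- an assumed cutoff).
    cutoff = facility_start.replace(year=facility_start.year - 3)
    is_new = contract.generator_online >= cutoff
    # Deliverable: the power comes from the same grid region.
    deliverable = contract.same_region
    # Matched: procured clean energy covers the facility's consumption.
    matched = contract.matched_mwh >= consumed_mwh
    return is_new and deliverable and matched

# Example: a facility online since March 2024 matching 1.2 TWh of consumption.
contract = CleanPowerContract(date(2023, 6, 1), True, 1_300_000)
print(qualifies_as_additional(contract, date(2024, 3, 1), 1_200_000))  # True
```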

While the scale and technology requirements are different, meeting AI’s energy needs presents a similar challenge. Powering data centers from the existing electricity grid mix means that more demand will create more emissions; even when data centers draw on clean electricity, if that energy is diverted from existing sources rather than coming from new, additional clean supply, the result is the same. Amazon’s recent $650 million investment in a data center campus next to an existing nuclear power plant in Pennsylvania illustrates the challenge: Diverting those clean electrons from Pennsylvania homes and businesses to the data center reduces Amazon’s reported emissions, but because it increases demand on the grid without building additional clean capacity, it creates a need for new generation in the region that will likely be met by fossil fuels (while also shifting up to $140 million in additional costs per year onto local customers).

Neither hyperscalers nor utilities should be expected to resolve this complex tension on their own. As with hydrogen, it is in our national interest to find a path forward.

What we need, then, is a national solution to make sure that as we expand our AI capabilities, we also bring new clean energy online, strengthening our competitive position in both industries and forestalling the economic and ecological consequences of higher electricity prices and higher carbon emissions.

In short, we should adopt a National AI Additionality Framework.

Under this framework, for any significant data center project, companies would need to show how they are securing new, additional clean power from a zero-emissions generation source. They could do this either by building new “behind-the-meter” clean energy to power their operations directly, or by partnering with a utility to pay a specified rate to secure new grid-connected clean energy coming online.

If companies are unwilling or unable to secure dedicated additional clean energy capacity, they would pay a fee into a clean deployment fund at the Department of Energy that would go toward high-value investments to expand clean electricity capacity. These could range from research and deployment incentives for so-called “clean firm” electricity generation technologies like nuclear and geothermal, to investments in transmission capacity in highly congested areas, to expanding manufacturing capacity for supply-constrained electrical grid equipment like transformers, to cleaning up rural electric cooperatives that serve areas attractive to data centers. Given the variance in grid and transmission issues from place to place, the fund would explicitly approach its investments with a regional lens.
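A minimal sketch of that compliance logic might look like the following, where the fee rate and all names are placeholders rather than figures proposed here:

```python
# Hypothetical sketch of the proposed framework's compliance paths. The fee
# rate is a placeholder, not a figure proposed in the article.

FEE_PER_MWH = 40.0  # illustrative fee paid into the DOE clean fund, $/MWh

def annual_compliance(consumed_mwh: float,
                      behind_meter_mwh: float,
                      utility_contracted_mwh: float) -> dict:
    """Return how a data center's demand is covered and any fee owed."""
    covered = behind_meter_mwh + utility_contracted_mwh
    shortfall = max(0.0, consumed_mwh - covered)
    return {
        "covered_mwh": min(covered, consumed_mwh),
        "shortfall_mwh": shortfall,
        "fund_payment_usd": shortfall * FEE_PER_MWH,
    }

# Example: a campus using 1 TWh/yr, half covered by new clean supply.
print(annual_compliance(1_000_000, 200_000, 300_000))
# -> 500,000 MWh shortfall and a $20,000,000 payment into the fund
```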

Several states operate similar systems: Under Massachusetts’ Renewable Portfolio Standard, utilities are required to source a certain percentage of the electricity they sell from clean energy facilities or pay an “alternative compliance payment” for every megawatt-hour they fall short of their obligation. Dollars collected from these payments go toward the development and expansion of clean energy projects and infrastructure in the state. Facing increasing capacity constraints on the PJM grid, Pennsylvania legislators are now exploring a state Baseload Energy Development Fund to provide low-interest grants and loans for new electricity generation facilities.

A national additionality framework should not only challenge the industry to scale innovation in a way that scales clean technology, it must also clear pathways to build clean energy at scale. We should establish a dedicated fast-track approval process to move these clean energy projects through federal, state, and local permitting and siting on an accelerated basis. This will help companies already investing in additional clean energy move faster and more effectively, and it will make it more difficult for anyone to hide behind the excuse that building new clean energy capacity is too hard or too slow. Likewise, under this framework, utilities that stand in the way of progress should be held accountable and incentivized to adopt innovative new technologies and business models that enable them to move at historic speed.

For hyperscalers committed to net-zero goals, this national approach provides both an opportunity and a level playing field — an opportunity to deliver on those commitments in a genuine way, and a reliable long-term framework that will reward their investments to make that happen. This approach would also build public trust in corporate climate accountability and diminish the risk that companies building data centers in the U.S. will be accused of greenwashing or of shifting the costs of development onto ratepayers and communities. The policy clarity of an additionality requirement can also encourage cutting-edge artificial intelligence technology to be built here in the United States. Moreover, it is a model that can be extended to other sectors facing growing energy demand.

The good news is that many industry players are already moving in this direction. A new agreement between Google and a Nevada utility, for example, would allow Google to pay a higher rate for 24/7 clean electricity from a new geothermal project. In the Carolinas, Duke Energy announced its intent to explore a new clean tariff to support carbon-free energy generation for large customers like Google and Microsoft.

A national framework that builds on this progress is critical, though it will not be easy; it will require quick Congressional action, executive leadership, and new models of state and local partnership. But we have a unique opportunity to build a strange-bedfellows coalition to get it done, spanning big tech, climate tech, environmentalists, permitting reform advocates, and those invested in America’s national security and technology leadership. Together, we can turn a vexing trade-off into an opportunity. We can ensure that the hundreds of billions of dollars invested in building an industry of the future actually accelerate the energy transition, all while strengthening the U.S.’s position in cutting-edge AI and clean energy technology.
