
America Needs an Energy Policy for AI

Additionality isn’t just for hydrogen.


The rapid increase in demand for artificial intelligence is creating a seemingly vexing national dilemma: How can we meet the vast energy demands of a breakthrough industry without compromising our energy goals?

If that challenge sounds familiar, that’s because it is. The U.S. has a long history of rising to the electricity demands of innovative new industries. Our energy needs grew far more quickly in the four decades following World War II than they are projected to grow today. More recently, we have squared off against the energy requirements of new clean technologies that are themselves energy-intensive to produce — most notably hydrogen.

Chart: Electricity Demand Since the 1950s (courtesy of Rhodium Group)

The lesson we have learned time and again is that it is possible to scale technological innovation in a way that also scales energy innovation. Rather than accepting a zero-sum trade-off between innovation and our clean energy goals, we should focus on policies that leverage the growth of AI to scale the growth of clean energy.

At the core of this approach is the concept of additionality: Companies operating massive data centers — often referred to as “hyperscalers” — as well as utilities should have incentives to bring online new, additional clean energy to power new computing needs. That way, we leverage demand in one sector to scale up another. We drive innovation in key sectors that are critical to our nation’s competitiveness, we reward market leaders who are already moving in this direction with a stable, long-term regulatory framework for growth, and we stay on track to meet our nation’s climate commitments.

All of this is possible, but only if we take bold action now.

AI technologies have the potential to significantly boost America’s economic productivity and enhance our national security. AI also has the potential to accelerate the energy transition itself, from optimizing the electricity grid, to improving weather forecasting, to accelerating the discovery of chemicals and material breakthroughs that reduce reliance on fossil fuels. Powering AI, however, is itself incredibly energy intensive. Projections suggest that data centers could consume 9% of U.S. electricity generation by 2030, up from 4% today. Without a national policy response, this surge in energy demand risks increasing our long-term reliance on fossil fuels. By some estimates, around 20 gigawatts of additional natural gas generating capacity will come online by 2030, and coal plant retirements are already being delayed.

Avoiding this outcome will require creative focus on additionality. Hydrogen represents a particularly relevant case study here. It, too, is energy-intensive to produce — a single kilogram of hydrogen requires roughly double an average household’s daily electricity consumption. And while hydrogen holds great promise to decarbonize parts of our economy, hydrogen is not per se good for our clean energy goals. Indeed, today’s fossil fuel-driven methods of hydrogen production generate more emissions than the entire aviation sector. While we can make zero-emissions hydrogen by using clean electricity to split hydrogen from water, the source of that electricity matters a lot. As with data centers, if the power for hydrogen production comes from the existing electricity grid, then ramping up electrolytic production of hydrogen could significantly increase emissions by growing overall energy demand without cleaning the energy mix.
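The rough arithmetic behind that comparison can be sketched as follows. The figures are assumptions, not from the article: electrolysis takes roughly 50 to 55 kilowatt-hours of electricity per kilogram of hydrogen, and an average U.S. household uses about 10,500 kilowatt-hours per year.

```python
# Back-of-envelope check (assumed figures, not from the article):
# electrolysis: ~50-55 kWh of electricity per kg of hydrogen
# average U.S. household: ~10,500 kWh/year, i.e. ~29 kWh/day
KWH_PER_KG_H2 = 52.0                  # assumed electrolyzer energy intensity
HOUSEHOLD_KWH_PER_DAY = 10_500 / 365  # assumed average household usage

ratio = KWH_PER_KG_H2 / HOUSEHOLD_KWH_PER_DAY
print(f"1 kg of hydrogen takes about {ratio:.1f}x a household's daily electricity")
```

Under these assumptions the ratio lands near two, which is the sense in which one kilogram of hydrogen “costs” roughly two days of household electricity.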

This challenge led to the development of an “additionality” framework for hydrogen. The Inflation Reduction Act offers generous subsidies to hydrogen producers, but to qualify, they must match their electricity consumption with additional (read: newly built) clean energy generation close enough to them that they can actually use it.

This approach, which is being refined in proposed guidance from the U.S. Treasury Department, is designed to make sure that hydrogen’s energy demand becomes a catalyst for investment in new clean electricity generation and decarbonization technologies. Industry leaders are already responding, stating their readiness to build over 50 gigawatts of clean electrolyzer projects because of the long-term certainty this framework provides.

While the scale and technology requirements differ, meeting AI’s energy needs presents a similar challenge. Powering data centers from the existing electricity grid mix means that more demand will create more emissions. Even when data centers draw on clean electricity, if that energy is diverted from existing sources rather than coming from new, additional clean supply, the result is the same. Amazon’s recent $650 million investment in a data center campus next to an existing nuclear power plant in Pennsylvania illustrates the problem: Diverting those clean electrons from Pennsylvania homes and businesses to the data center reduces Amazon’s reported emissions, but by increasing demand on the grid without adding clean capacity, it creates a need for new generation in the region that will likely be met by fossil fuels (while also shifting up to $140 million per year in additional costs onto local customers).

Neither hyperscalers nor utilities should be expected to resolve this complex tension on their own. As with hydrogen, it is in our national interest to find a path forward.

What we need, then, is a national solution to ensure that as we expand our AI capabilities, we also bring new clean energy online, strengthening our competitive position in both industries and forestalling the economic and ecological consequences of higher electricity prices and higher carbon emissions.

In short, we should adopt a National AI Additionality Framework.

Under this framework, for any significant data center project, companies would need to show how they are securing new, additional clean power from a zero-emissions generation source. They could do this either by building new “behind-the-meter” clean energy to power their operations directly, or by partnering with a utility to pay a specified rate to secure new grid-connected clean energy coming online.

If companies are unwilling or unable to secure dedicated additional clean energy capacity, they would pay a fee into a clean deployment fund at the Department of Energy that would go toward high-value investments to expand clean electricity capacity. These could range from research and deployment incentives for so-called “clean firm” electricity generation technologies like nuclear and geothermal, to investments in transmission capacity in highly congested areas, to expanding manufacturing capacity for supply-constrained electrical grid equipment like transformers, to cleaning up rural electric cooperatives that serve areas attractive to data centers. Given the variance in grid and transmission issues, the fund would explicitly approach its investment with a regional lens.
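The compliance logic described above reduces to a simple residual calculation. The sketch below uses hypothetical numbers and a hypothetical fee rate — the article does not specify either — to show how a data center’s unmatched load would translate into a payment to the deployment fund.

```python
# Hypothetical compliance sketch for the framework described above.
# All numbers are illustrative; the article specifies no rates.
annual_load_mwh = 800_000   # data center's annual electricity consumption
new_clean_mwh = 600_000     # new, additional clean energy it secured
fee_per_mwh = 40.0          # assumed fund-contribution rate on the residual

# Any load not matched by additional clean generation owes the fee.
unmatched_mwh = max(0, annual_load_mwh - new_clean_mwh)
fund_payment = unmatched_mwh * fee_per_mwh
print(f"Unmatched load: {unmatched_mwh:,} MWh -> fund payment: ${fund_payment:,.0f}")
```

The design choice here mirrors the hydrogen rules: full matching means no fee, and the fee scales with the shortfall rather than operating as a flat penalty.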

Several states operate similar systems. Under Massachusetts’ Renewable Portfolio Standard, utilities must supply a certain percentage of the electricity they sell from clean energy facilities or pay an “alternative compliance payment” for every megawatt-hour they fall short of their obligation. Dollars collected from these payments go toward the development and expansion of clean energy projects and infrastructure in the state. Facing increasing capacity constraints on the PJM grid, Pennsylvania legislators are now exploring a state Baseload Energy Development Fund to provide low-interest grants and loans for new electricity generation facilities.
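The alternative compliance payment mechanism works the same way arithmetically. The sketch below uses hypothetical figures (the actual Massachusetts obligation percentage and ACP rate vary by year and compliance class): the payment owed is the obligation shortfall times the ACP rate.

```python
# Illustrative RPS shortfall calculation (all figures hypothetical):
retail_sales_mwh = 1_000_000  # electricity the utility sold this year
rps_fraction = 0.25           # assumed clean-energy obligation (25%)
clean_procured_mwh = 200_000  # qualifying clean energy actually procured
acp_rate_per_mwh = 60.0       # assumed alternative compliance payment rate

obligation_mwh = retail_sales_mwh * rps_fraction
shortfall_mwh = max(0, obligation_mwh - clean_procured_mwh)
payment = shortfall_mwh * acp_rate_per_mwh
print(f"Shortfall: {shortfall_mwh:,.0f} MWh -> ACP owed: ${payment:,.0f}")
```

Because the ACP rate effectively caps the price of compliance, it also sets a ceiling on what utilities will pay for clean energy certificates — a property any national fee would share.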

A national additionality framework should not only challenge the industry to scale innovation in a way that scales clean technology, it must also clear pathways to build clean energy at scale. We should establish a dedicated fast-track approval process to move these clean energy projects through federal, state, and local permitting and siting on an accelerated basis. This will help companies already investing in additional clean energy to move faster and more effectively, and make it harder for anyone to hide behind the excuse that building new clean energy capacity is too hard or too slow. Likewise, under this framework, utilities that stand in the way of progress should be held accountable and incentivized to adopt innovative new technologies and business models that enable them to move at historic speed.

For hyperscalers committed to net-zero goals, this national approach provides both an opportunity and a level playing field — an opportunity to deliver on those commitments in a genuine way, and a reliable long-term framework that will reward their investments to make that happen. This approach would also build public trust in corporate climate accountability and diminish the risk that those building data centers in the U.S. stand accused of greenwashing or shifting the cost of development onto ratepayers and communities. The policy clarity of an additionality requirement can also encourage cutting-edge artificial intelligence technology to be built here in the United States. Moreover, it is a model that can be extended to address other sectors facing growing energy demand.

The good news is that many industry players are already moving in this direction. A new agreement between Google and a Nevada utility, for example, would allow Google to pay a higher rate for 24/7 clean electricity from a new geothermal project. In the Carolinas, Duke Energy announced its intent to explore a new clean tariff to support carbon-free energy generation for large customers like Google and Microsoft.

A national framework that builds on this progress is critical, though it will not be easy; it will require quick Congressional action, executive leadership, and new models of state and local partnership. But we have a unique opportunity to build a strange-bedfellows coalition to get it done, spanning big tech, climate tech, environmentalists, permitting reform advocates, and those invested in America’s national security and technology leadership. This framework can turn a vexing trade-off into an opportunity. We can ensure that the hundreds of billions of dollars invested in building an industry of the future actually accelerate the energy transition, all while strengthening the U.S.’s position in cutting-edge AI and clean energy technology.
