Podcast

A Skeptic’s Take on AI and Energy Growth

Inside episode 10 of Shift Key.


Will the rise of machine learning and artificial intelligence break the climate system? In recent months, utilities and tech companies have argued that soaring use of AI will overwhelm electricity markets. Is that true — or is it a sales pitch meant to build more gas plants? And how much electricity do data centers and AI use today?

In this week’s episode, Rob and Jesse talk to Jonathan Koomey, an independent researcher, lecturer, and entrepreneur who studies the energy impacts of the internet and information technology. We discuss why AI may not break the electricity system and the long history of anxiety over computing’s energy use. Shift Key is hosted by Robinson Meyer, executive editor of Heatmap, and Jesse Jenkins, a Princeton professor of energy systems engineering.

Subscribe to “Shift Key” and find this episode on Apple Podcasts, Spotify, Amazon, or wherever you get your podcasts.

You can also add the show’s RSS feed to your podcast app to follow us directly.

Here is an excerpt from our conversation:

Robinson Meyer: Before we go any further — and I think you just hinted at your answer, here, but I want to tackle it directly — which is that I think people look at the hockey stick graphs for AI use, and they look at current energy use for AI, and they look at load growth data coming from the utilities, and they go, “Oh my gosh, AI is going to absolutely overrun our energy system. It’s going to cause emissions to shoot up,” because again, this is just extrapolating from what’s recent.

But of course, part of the whole AI mythos is like, once it starts, you can’t stop it. There is a story out there that, frankly, you see as much from folks who are worried about the climate as you do from AI boosters, which is that very soon, we’re going to be using a huge amount of energy on AI. And I want to ask you this directly: Should we be worried about AI, number one, overrunning the energy system? Or number two, AI causing a massive spike in carbon emissions that dooms us to, let’s say, pass 2.5 degrees Celsius and use up the rest of our carbon budget? Is that something you’re worried about? And just how do you think about this?

Jonathan Koomey: Everyone needs to calm the heck down. So we talked about the original baseline, right? So the baseline, data centers are 1% of the world's electricity. And maybe AI now is 0.1%, right? For Google, it’s 0.15%, whatever. But 10% of the 1% is AI.

So let’s say that doubles — let’s say that triples in the next few years, or even goes up fivefold. That gets to about half a percent. So I think it will pale in comparison to the other growth drivers that Jesse was talking about in electrification. Because if you think about light vehicles, if you electrified all light vehicles in the U.S., that’s like a 20% or 25% increase in electricity consumption. And if you did that over 20 years, that’s like 1-ish% per year. Right? So that, to me, is a very credible thing that’s likely to happen. And then when you add heat pumps and industrial electrification, a lot more.
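The arithmetic Koomey is doing here can be checked in a few lines. This is a minimal sketch using the rounded figures from the conversation (data centers at ~1% of world electricity, AI at ~10% of that); the specific numbers are his approximations, not precise measurements.

```python
# AI's share of world electricity: ~10% of the ~1% that data centers use.
ai_share = 0.01 * 0.10  # ~0.1% of total electricity

# Even a fivefold increase only reaches about half a percent.
for growth in (2, 3, 5):
    print(f"{growth}x growth -> AI at {ai_share * growth:.2%} of world electricity")

# Compare: electrifying all U.S. light vehicles adds roughly 20-25% to
# electricity demand; spread over ~20 years, that is about 1% per year.
ev_increase = 0.25
years = 20
annual_growth = (1 + ev_increase) ** (1 / years) - 1
print(f"EV-driven annual load growth: {annual_growth:.2%}")
```

The point of the comparison: even aggressive AI growth lands around 0.5% of global electricity, while vehicle electrification alone implies sustained load growth of roughly a percent per year.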

I think there will be local impacts. There will be some places where AI and data centers more generally will be important and will drive load growth, but it is not a national story. It is a local story. And so a place like Ireland that has, I think at last count 17%, 18% of its load from data centers, if that grows, that could give them real challenges. Same thing, Loudoun County in Virginia. But you really do have to separate the national story or the global story from the local story.

Jesse Jenkins: I think it was just about a week ago, Nvidia — which is the leading producer of the graphics processing units that have now become the main workhorse chips for generative AI computing — released their new best-in-class chip. And as they revealed that chip, they started, for the first time, it sounded like, to emphasize the energy efficiency improvements of the GPU. And the basic story the CEO told is that it would take about 73% less electricity and a shorter period of time to train AIs on this new chip than it did on their previous best-in-class chip. So that’s just one generation of GPU with nearly three-quarters reduction in the amount of energy consumed per ... I don’t know how you measure the units of large language model training, but per smarts trained into generative AI. So yeah, huge gains.

And one might say, well, can that continue forever? And I guess we should maybe get your thoughts on that. But it has continued at least for the last 10 to 20 years. And so there’s a lot of reason to believe that there’s continued gains to be made.

Koomey: Most people, when they think of efficiency, they think of Moore’s Law. They think of shrinking transistors. And anyone who follows this knows that every year or two, there’s another article about how Moore’s Law is ending, or slowing, or you know, it’s getting harder. And there’s no question about it, it’s absolutely getting harder and harder to shrink the transistors. But it turns out shrinking transistors is only one way to improve efficiency and performance. For a long time, the industry relied on that.

From the early days of microprocessors, starting in ’71, over time, they would ramp up the clock speed. And at the same time, they would ramp down the voltage of the chip. And that was called Dennard scaling. It allowed them to keep ramping up performance without getting to crazy levels of leakage current and heat and melting the chip and the whole thing. That worked for a long time, until the early 2000s. And then they hit the threshold voltage for silicon, which is like one volt. So once you hit that, you can no longer do that trick. And they needed new tricks.
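The trade-off Koomey describes falls out of the standard approximation for dynamic CMOS switching power, P ≈ C · V² · f. A rough illustration (the component values here are hypothetical, chosen only to show the scaling, and this is not a model from the episode):

```python
def dynamic_power(capacitance: float, voltage: float, frequency: float) -> float:
    """Approximate dynamic switching power of a CMOS chip in watts:
    P ~ C * V^2 * f."""
    return capacitance * voltage**2 * frequency

# Hypothetical baseline chip: 1 nF effective capacitance, 1.2 V, 2 GHz.
base = dynamic_power(capacitance=1e-9, voltage=1.2, frequency=2e9)

# Classic Dennard-era move: double the clock while cutting voltage ~30%.
scaled = dynamic_power(capacitance=1e-9, voltage=1.2 * 0.7, frequency=4e9)

print(f"power ratio: {scaled / base:.2f}")  # ~0.98: twice the clock, same power
```

Because power goes with the square of voltage, a 30% voltage drop nearly cancels a doubled clock — which is why hitting silicon’s ~1 volt threshold ended the trick.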

So what they did — and most of you who were around at that time will remember — was make this big shift to multiple cores on a chip. That was an innovation in hardware architecture that allowed them, for a time, to improve efficiency by going to software that could run on multiple cores, so you could multiprocess various activities. So that’s one way you can improve things. You can also work on the software — you can improve the efficiency of the software, you can improve the algorithms that you use.

So even if Moore’s Law shrinkage of transistors stops — which it hasn’t fully stopped, but even if it did — there are a lot of other things we can do. And AI in particular is relatively new. Basically, people threw a whole bunch of money at existing processors because there was this rush to deploy technology. But now, everyone’s stepping back and saying, well, look at the energy cost and the infrastructure cost. Is there a way to do this better? And sure, there definitely is, and Nvidia proved it in the presentation that you referred to.

This episode of Shift Key is sponsored by…

KORE Power provides the commercial, industrial, and utility markets with functional solutions that advance the clean energy transition worldwide. KORE Power's technology and manufacturing capabilities provide direct access to next generation battery cells, energy storage systems that scale to grid+, EV power & infrastructure, and intuitive asset management to unlock energy strategies across a myriad of applications. Explore more at korepower.com.

Watershed's climate data engine helps companies measure and reduce their emissions, turning the data they already have into an audit-ready carbon footprint backed by the latest climate science. Get the sustainability data you need in weeks, not months. Learn more at watershed.com.

Music for Shift Key is by Adam Kromelow.
