Podcast

A Skeptic’s Take on AI and Energy Growth

Inside episode 10 of Shift Key.


Will the rise of machine learning and artificial intelligence break the climate system? In recent months, utilities and tech companies have argued that soaring use of AI will overwhelm electricity markets. Is that true — or is it a sales pitch meant to build more gas plants? And how much electricity do data centers and AI use today?

In this week’s episode, Rob and Jesse talk to Jonathan Koomey, an independent researcher, lecturer, and entrepreneur who studies the energy impacts of the internet and information technology. We discuss why AI may not break the electricity system and the long history of anxiety over computing’s energy use. Shift Key is hosted by Robinson Meyer, executive editor of Heatmap, and Jesse Jenkins, a Princeton professor of energy systems engineering.

Subscribe to “Shift Key” and find this episode on Apple Podcasts, Spotify, Amazon, or wherever you get your podcasts.

You can also add the show’s RSS feed to your podcast app to follow us directly.

Here is an excerpt from our conversation:

Robinson Meyer: Before we go any further — and I think you just hinted at your answer, here, but I want to tackle it directly — which is that I think people look at the hockey stick graphs for AI use, and they look at current energy use for AI, and they look at load growth data coming from the utilities, and they go, “Oh my gosh, AI is going to absolutely overrun our energy system. It’s going to cause emissions to shoot up,” because again, this is just extrapolating from what’s recent.

But of course, part of the whole AI mythos is like, once it starts, you can’t stop it. There is a story out there that, frankly, you see as much from folks who are worried about the climate as you do from AI boosters, which is that very soon, we’re going to be using a huge amount of energy on AI. And I want to ask you this directly: Should we be worried about AI, number one, overrunning the energy system? Or number two, AI causing a massive spike in carbon emissions that dooms us to, let’s say, passing 2.5 degrees Celsius and using up the rest of our carbon budget? Is that something you’re worried about? And just how do you think about this?

Jonathan Koomey: Everyone needs to calm the heck down. So we talked about the original baseline, right? So the baseline, data centers are 1% of the world's electricity. And maybe AI now is 0.1%, right? For Google, it’s 0.15%, whatever. But 10% of the 1% is AI.

So let’s say that doubles — let’s say that triples in the next few years, or even goes up fivefold. That gets to about half a percent. So I think it will pale in comparison to the other growth drivers that Jesse was talking about in electrification. Because if you think about light vehicles, if you electrified all light vehicles in the U.S., that’s like a 20% or 25% increase in electricity consumption. And if you did that over 20 years, that’s like 1-ish% per year. Right? So that, to me, is a very credible thing that’s likely to happen. And then when you add heat pumps and industrial electrification, a lot more.
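To make that back-of-the-envelope arithmetic concrete, here is a rough sketch in Python. The percentages are the approximate figures quoted in the conversation, not precise estimates, and note that the AI shares are global while the light-vehicle comparison is for the U.S., as in the discussion above.

```python
# Rough sketch of the shares discussed above. All figures are the
# approximate numbers quoted in the conversation, not precise estimates.

data_center_share = 0.01        # data centers: ~1% of world electricity
ai_fraction_of_dc = 0.10        # roughly 10% of that is AI today
ai_share_today = data_center_share * ai_fraction_of_dc    # ~0.1%

# Even a fivefold increase in AI's share gets to about half a percent.
ai_share_5x = ai_share_today * 5                          # ~0.5%

# Compare: electrifying all U.S. light vehicles is roughly a 20-25%
# increase in electricity consumption; spread over 20 years, that is
# about 1% per year of load growth.
ev_increase = 0.225             # midpoint of the 20-25% range
ev_growth_per_year = ev_increase / 20

print(f"AI share today: {ai_share_today:.1%}; at 5x: {ai_share_5x:.1%}")
print(f"EV-driven U.S. load growth: ~{ev_growth_per_year:.1%} per year")
```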

I think there will be local impacts. There will be some places where AI and data centers more generally will be important and will drive load growth, but it is not a national story. It is a local story. And so a place like Ireland, which has, I think at last count, 17% or 18% of its load from data centers, if that grows, that could give them real challenges. Same thing with Loudoun County in Virginia. But you really do have to separate the national story or the global story from the local story.

Jesse Jenkins: I think it was just about a week ago that Nvidia, which is the leading producer of the graphics processing units that have now become the main workhorse chips for generative AI computing, released its new best-in-class chip. And as they revealed that chip, they — for the first time, it sounded like — started to emphasize the energy efficiency improvements of the GPU. And the basic story the CEO told is that it would take about 73% less electricity and a shorter period of time to train AIs on this new chip than it did on their previous best-in-class chip. So that’s just one generation of GPU with nearly a three-quarters reduction in the amount of energy consumed per ... I don’t know how you measure the units of large language model training, but per smarts trained into generative AI. So yeah, huge gains.

And one might say, well, can that continue forever? And I guess we should maybe get your thoughts on that. But it has continued at least for the last 10 to 20 years. And so there’s a lot of reason to believe that there are continued gains to be made.
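Purely as an illustration of what that quoted 73% figure implies, and, hypothetically, what it would mean if similar per-generation gains kept repeating (which is exactly the open question here), the arithmetic looks like this:

```python
# Illustrative arithmetic only, not a forecast. Assumes the ~73%
# per-generation reduction in training energy quoted above.

reduction_per_gen = 0.73
energy_ratio = 1 - reduction_per_gen      # energy per unit of training work
efficiency_gain = 1 / energy_ratio        # ~3.7x more training per kWh

print(f"One generation: roughly {efficiency_gain:.1f}x efficiency gain")

# Hypothetical compounding, if similar gains were to repeat:
for n in range(1, 4):
    print(f"After {n} generation(s): energy falls to {energy_ratio ** n:.1%} of baseline")
```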

Koomey: Most people, when they think of efficiency, they think of Moore’s Law. They think of shrinking transistors. And anyone who follows this knows that every year or two, there’s another article about how Moore’s Law is ending, or slowing, or you know, it’s getting harder. And there’s no question about it, it’s absolutely getting harder and harder to shrink the transistors. But it turns out shrinking transistors is only one way to improve efficiency and performance. For a long time, the industry relied on that.

From the early days of microprocessors, starting in ’71, over time, they would ramp up the clock speed. And at the same time, they would ramp down the voltage of the chip. And that was called Dennard scaling. It allowed them to keep ramping up performance without getting to crazy levels of leakage current and heat and melting the chip and the whole thing. That worked for a long time, until the early 2000s. And then they hit the threshold voltage for silicon, which is like one volt. So once you hit that, you can no longer do that trick. And they needed new tricks.
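For readers who want the mechanics behind this, the standard textbook relationship (not spelled out in the episode itself) is that a chip’s dynamic power scales roughly as P ≈ αCV²f. Here is a minimal sketch of how classical Dennard scaling kept power density flat, under the usual idealized assumptions:

```python
# Minimal sketch of classical Dennard scaling (idealized textbook model,
# not something stated in the episode). Dynamic power: P = a * C * V^2 * f.

def dynamic_power(capacitance, voltage, frequency, activity=1.0):
    """Switching power of a CMOS circuit, P = activity * C * V^2 * f."""
    return activity * capacitance * voltage**2 * frequency

k = 1.4  # one classical process shrink (~0.7x linear feature size)
base = dynamic_power(capacitance=1.0, voltage=1.0, frequency=1.0)

# Shrinking by k cuts C and V by ~1/k while f rises by ~k, so power per
# transistor falls as ~1/k^2; with ~k^2 more transistors per unit area,
# power density stays roughly constant. Once V hits its floor, the trick ends.
scaled = dynamic_power(capacitance=1.0 / k, voltage=1.0 / k, frequency=k)

print(f"Power per transistor after one shrink: {scaled / base:.2f}x (1/k^2 = {1 / k**2:.2f})")
```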

So what they did, as most of you who were around at that time will remember, was shift to multiple cores on a chip. That was an innovation in hardware architecture that allowed them, for a time, to improve efficiency by going to software that could run on multiple cores, so you could multiprocess various activities. So that’s one way you can improve things. You can also work on the software — you can improve the efficiency of the software, you can improve the algorithms that you use.

So even if Moore’s Law shrinkage of transistors stops, which it hasn’t fully stopped, there are a lot of other things we can do. And AI in particular is relatively new. Basically, people threw a whole bunch of money at existing processors because there was this rush to deploy technology. But now, everyone’s stepping back and saying, well, look at the energy cost and the infrastructure cost. Is there a way to do this better? And sure, there definitely is, and Nvidia proved it in the presentation that you referred to.

This episode of Shift Key is sponsored by…

KORE Power provides the commercial, industrial, and utility markets with functional solutions that advance the clean energy transition worldwide. KORE Power's technology and manufacturing capabilities provide direct access to next generation battery cells, energy storage systems that scale to grid+, EV power & infrastructure, and intuitive asset management to unlock energy strategies across a myriad of applications. Explore more at korepower.com.

Watershed's climate data engine helps companies measure and reduce their emissions, turning the data they already have into an audit-ready carbon footprint backed by the latest climate science. Get the sustainability data you need in weeks, not months. Learn more at watershed.com.

Music for Shift Key is by Adam Kromelow.
