Podcast

A Skeptic’s Take on AI and Energy Growth

Inside episode 10 of Shift Key.


Will the rise of machine learning and artificial intelligence break the climate system? In recent months, utilities and tech companies have argued that soaring use of AI will overwhelm electricity markets. Is that true — or is it a sales pitch meant to build more gas plants? And how much electricity do data centers and AI use today?

In this week’s episode, Rob and Jesse talk to Jonathan Koomey, an independent researcher, lecturer, and entrepreneur who studies the energy impacts of the internet and information technology. We discuss why AI may not break the electricity system and the long history of anxiety over computing’s energy use. Shift Key is hosted by Robinson Meyer, executive editor of Heatmap, and Jesse Jenkins, a Princeton professor of energy systems engineering.

Subscribe to “Shift Key” and find this episode on Apple Podcasts, Spotify, Amazon, or wherever you get your podcasts.

You can also add the show’s RSS feed to your podcast app to follow us directly.

Here is an excerpt from our conversation:

Robinson Meyer: Before we go any further — and I think you just hinted at your answer, here, but I want to tackle it directly — which is that I think people look at the hockey stick graphs for AI use, and they look at current energy use for AI, and they look at load growth data coming from the utilities, and they go, “Oh my gosh, AI is going to absolutely overrun our energy system. It’s going to cause emissions to shoot up,” because again, this is just extrapolating from what’s recent.

But of course, part of the whole AI mythos is like, once it starts, you can’t stop it. There is a story out there that, frankly, you see as much from folks who are worried about the climate as you do from AI boosters, which is that very soon, we’re going to be using a huge amount of energy on AI. And I want to ask you this directly: Should we be worried about AI, number one, overrunning the energy system? Or number two, AI causing a massive spike in carbon emissions that dooms us to, let's say, pass 2.5C and use up the rest of our carbon budget? Is that something you're worried about? And just how do you think about this?

Jonathan Koomey: Everyone needs to calm the heck down. So we talked about the original baseline, right? The baseline is that data centers are 1% of the world's electricity. And maybe AI now is 0.1%, right? For Google, it’s 0.15%, whatever. But 10% of that 1% is AI.

So let’s say that doubles — let’s say that triples in the next few years, or even goes up fivefold. That gets to about half a percent. So I think it will pale in comparison to the other growth drivers that Jesse was talking about in electrification. Because if you think about light vehicles, if you electrified all light vehicles in the U.S., that’s like a 20% or 25% increase in electricity consumption. And if you did that over 20 years, that’s like 1-ish% per year. Right? So that, to me, is a very credible thing that’s likely to happen. And then when you add heat pumps, you add industrial electrification, a lot more.
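
If you want to check Koomey's back-of-the-envelope arithmetic yourself, here is a minimal sketch in Python. The shares, multipliers, and the 20-25% light-vehicle figure are the rough numbers quoted in the conversation, not precise measurements.

```python
# Rough reproduction of the back-of-the-envelope math above.
# All figures are the approximate values quoted in the conversation.

data_center_share = 0.01                 # data centers: ~1% of world electricity
ai_share = 0.10 * data_center_share      # AI: ~10% of that, i.e. ~0.1%

for multiplier in (2, 3, 5):
    print(f"AI load x{multiplier}: ~{ai_share * multiplier:.1%} of world electricity")

# Compare with electrifying all U.S. light vehicles: a ~20-25% increase in
# U.S. electricity use, spread over roughly 20 years.
ev_increase = 0.225                      # midpoint of the 20-25% range
years = 20
annual_growth = (1 + ev_increase) ** (1 / years) - 1
print(f"EV electrification: ~{annual_growth:.1%} per year for {years} years")
```

Scaling today's rough 0.1% AI share fivefold lands around half a percent of world electricity, while full light-vehicle electrification works out to roughly 1% annual load growth on its own, which is the comparison Koomey is drawing.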

I think there will be local impacts. There will be some places where AI and data centers more generally will be important and will drive load growth, but it is not a national story. It is a local story. And so take a place like Ireland, which has, I think at last count, 17% or 18% of its load from data centers. If that grows, that could give them real challenges. Same thing with Loudoun County in Virginia. But you really do have to separate the national story or the global story from the local story.

Jesse Jenkins: I think it was just about a week ago that Nvidia, the leading producer of the graphics processing units that have now become the main workhorse chips for generative AI computing, released their new best-in-class chip. And as they revealed that chip, they — for the first time, it sounded like — started to emphasize the energy efficiency improvements of the GPU. And the basic story the CEO told is that it would take about 73% less electricity and a shorter period of time to train AIs on this new chip than it did on their previous best-in-class chip. So that’s just one generation of GPU with nearly a three-quarters reduction in the amount of energy consumed per ... I don't know how you measure the units of large language model training, but per smarts trained into generative AI. So yeah, huge gains.

And one might say, well, can that continue forever? And I guess we should maybe get your thoughts on that. But it has continued at least for the last 10 to 20 years. And so there’s a lot of reason to believe that there are continued gains to be made.
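
To see why per-generation efficiency gains matter so much, here is an illustrative compounding calculation. The 73% figure is Nvidia's claim for a single generation as described above; whether anything like it can be sustained over future generations is exactly the open question Jesse raises, so the numbers below are an assumption, not a prediction.

```python
# Illustrative only: how a per-generation reduction in training energy compounds.
# The 73% figure is Nvidia's one-generation claim discussed above; carrying it
# forward to later generations is an assumption made purely for illustration.

reduction_per_generation = 0.73
energy = 1.0   # energy to train a fixed workload, normalized to the previous chip
for generation in range(1, 4):
    energy *= 1 - reduction_per_generation
    print(f"after generation {generation}: {energy:.1%} of the original energy")
```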

Koomey: Most people, when they think of efficiency, they think of Moore’s Law. They think of shrinking transistors. And anyone who follows this knows that every year or two, there’s another article about how Moore’s Law is ending, or slowing, or you know, it’s getting harder. And there’s no question about it, it’s absolutely getting harder and harder to shrink the transistors. But it turns out shrinking transistors is only one way to improve efficiency and performance. For a long time, the industry relied on that.

From the early days of microprocessors, starting in ’71, over time, they would ramp up the clock speed. And at the same time, they would ramp down the voltage of the chip. And that was called Dennard scaling. It allowed them to keep ramping up performance without getting to crazy levels of leakage current and heat and melting the chip and the whole thing. That worked for a long time, until the early 2000s. And then they hit the threshold voltage for silicon, which is like one volt. So once you hit that, you can no longer do that trick. And they needed new tricks.
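
For readers who want the relation behind what Koomey is describing: dynamic power in a CMOS chip scales roughly with capacitance times the square of the supply voltage times the clock frequency, which is why dropping the voltage paid for higher clock speeds until the roughly one-volt threshold was reached. Here is a minimal numerical sketch; the component values are purely illustrative, not taken from any real chip.

```python
# Classic dynamic-power relation for CMOS: P ~ C * V^2 * f.
# Component values below are illustrative, not measurements of a real chip.

def dynamic_power(capacitance: float, voltage: float, frequency_ghz: float) -> float:
    return capacitance * voltage ** 2 * frequency_ghz

baseline = dynamic_power(capacitance=1.0, voltage=1.2, frequency_ghz=2.0)

# Dennard-style move: shrink (lower capacitance), drop the voltage, raise the clock.
scaled = dynamic_power(capacitance=0.7, voltage=1.0, frequency_ghz=3.0)
print(f"power relative to baseline: {scaled / baseline:.2f}x")

# Once the supply voltage nears the ~1 V threshold for silicon, it can't keep
# falling, so any further clock increase shows up directly as more power.
```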

So what they did, as most of you who were around at that time will remember, was shift to multiple cores on a chip. That was an innovation in hardware architecture that allowed them, for a time, to improve efficiency by going to software that could run on multiple cores, so you could multiprocess various activities. So that’s one way you can improve things. You can also work on the software — you can improve the efficiency of the software, you can improve the algorithms that you use.
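
Multiple cores only improve things if the software can actually use them, which is the standard Amdahl's-law point. Koomey doesn't cite the formula here, but a quick sketch shows the shape of the constraint; the 90% parallel fraction is an arbitrary example value.

```python
# Amdahl's-law bound on multi-core speedup (not a figure cited in the episode;
# a standard illustration of why the software side matters as much as the cores).

def speedup(parallel_fraction: float, cores: int) -> float:
    serial_fraction = 1 - parallel_fraction
    return 1 / (serial_fraction + parallel_fraction / cores)

for cores in (2, 4, 8, 16):
    print(f"{cores:>2} cores, 90% parallelizable work: {speedup(0.90, cores):.2f}x")
```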

So even if Moore's Law shrinkage of transistors stops, and it hasn't fully stopped, there are a lot of other things we can do. And AI in particular is relatively new. Basically, people threw a whole bunch of money at existing processors because there was this rush to deploy technology. But now, everyone’s stepping back and saying, well, look at the energy cost and the infrastructure cost. Is there a way to do this better? And sure, there definitely is, and Nvidia proved it in their presentation that you referred to.

This episode of Shift Key is sponsored by…

KORE Power provides the commercial, industrial, and utility markets with functional solutions that advance the clean energy transition worldwide. KORE Power's technology and manufacturing capabilities provide direct access to next generation battery cells, energy storage systems that scale to grid+, EV power & infrastructure, and intuitive asset management to unlock energy strategies across a myriad of applications. Explore more at korepower.com.

Watershed's climate data engine helps companies measure and reduce their emissions, turning the data they already have into an audit-ready carbon footprint backed by the latest climate science. Get the sustainability data you need in weeks, not months. Learn more at watershed.com.

Music for Shift Key is by Adam Kromelow.

Robinson Meyer

Robinson is the founding executive editor of Heatmap. He was previously a staff writer at The Atlantic, where he covered climate change, energy, and technology.

Jesse D. Jenkins

Jesse D. Jenkins is an assistant professor and expert in energy systems engineering and policy at Princeton University where he leads the REPEAT Project, which provides regular, timely environmental and economic evaluation of federal energy and climate policies as they’re proposed and enacted.
