
Podcast

A Skeptic’s Take on AI and Energy Growth

Inside episode 10 of Shift Key.

Power lines.
Heatmap Illustration/Getty Images

Will the rise of machine learning and artificial intelligence break the climate system? In recent months, utilities and tech companies have argued that soaring use of AI will overwhelm electricity markets. Is that true — or is it a sales pitch meant to build more gas plants? And how much electricity do data centers and AI use today?

In this week’s episode, Rob and Jesse talk to Jonathan Koomey, an independent researcher, lecturer, and entrepreneur who studies the energy impacts of the internet and information technology. We discuss why AI may not break the electricity system and the long history of anxiety over computing’s energy use. Shift Key is hosted by Robinson Meyer, executive editor of Heatmap, and Jesse Jenkins, a Princeton professor of energy systems engineering.

Subscribe to “Shift Key” and find this episode on Apple Podcasts, Spotify, Amazon, or wherever you get your podcasts.

You can also add the show’s RSS feed to your podcast app to follow us directly.

Here is an excerpt from our conversation:

Robinson Meyer: Before we go any further — and I think you just hinted at your answer here, but I want to tackle it directly — I think people look at the hockey stick graphs for AI use, and they look at current energy use for AI, and they look at load growth data coming from the utilities, and they go, “Oh my gosh, AI is going to absolutely overrun our energy system. It’s going to cause emissions to shoot up,” because again, this is just extrapolating from what’s recent.

But of course, part of the whole AI mythos is, like, once it starts, you can’t stop it. There is a story out there that, frankly, you see as much from folks who are worried about the climate as you do from AI boosters, which is that very soon, we're going to be using a huge amount of energy on AI. And I want to ask you this directly: Should we be worried about AI, number one, overrunning the energy system? Or, number two, AI causing a massive spike in carbon emissions that dooms us to, let's say, pass 2.5 degrees Celsius and uses up the rest of our carbon budget? Is that something you're worried about? And just how do you think about this?

Jonathan Koomey: Everyone needs to calm the heck down. So we talked about the original baseline, right? The baseline is that data centers are about 1% of the world's electricity. And maybe AI now is 0.1%, right? For Google, it’s 0.15%, whatever. But roughly 10% of that 1% is AI.

So let’s say that doubles — let’s say that triples in the next few years, or even goes up fivefold. That gets to about half a percent. So I think it will pale in comparison to the other growth drivers that Jesse was talking about in electrification. Because if you think about light vehicles, if you electrified all light vehicles in the U.S., that’s like a 20% or 25% increase in electricity consumption. And if you did that over 20 years, that’s like 1-ish% per year, right? So that, to me, is a very credible thing that’s likely to happen. And then when you add heat pumps and industrial electrification, a lot more.
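To make those round numbers concrete, here is a minimal back-of-the-envelope sketch in Python using only the figures Koomey cites in this exchange; the shares and growth multiples are his illustrative numbers, not independent estimates.

```python
# Back-of-envelope sketch of the shares Koomey describes (illustrative
# numbers taken from the conversation, not a formal estimate).

data_center_share = 0.01   # data centers are roughly 1% of world electricity
ai_fraction_of_dc = 0.10   # AI is roughly 10% of that, i.e. ~0.1% overall
ai_share_today = data_center_share * ai_fraction_of_dc

for growth in (2, 3, 5):   # doubling, tripling, or growing fivefold
    print(f"AI at {growth}x today's level: {ai_share_today * growth:.1%} of world electricity")

# Compare with electrifying all U.S. light vehicles: a roughly 20-25% increase
# in U.S. electricity use spread over ~20 years is about 1% growth per year.
ev_increase = 0.225        # midpoint of the 20-25% range
years = 20
print(f"Light-vehicle electrification: ~{ev_increase / years:.1%} per year")
```

Even the fivefold case lands around half a percent of world electricity, which is the comparison Koomey draws against the steadier growth from vehicle and building electrification.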

I think there will be local impacts. There will be some places where AI, and data centers more generally, will be important and will drive load growth, but it is not a national story. It is a local story. And so take a place like Ireland, which at last count has, I think, 17% or 18% of its load coming from data centers: if that grows, that could give it real challenges. Same thing for Loudoun County in Virginia. But you really do have to separate the national story, or the global story, from the local story.

Jesse Jenkins: I think it was just about a week ago that Nvidia, the leading producer of the graphics processing units that have now become the main workhorse chips for generative AI computing, released its new best-in-class chip. And as they revealed that chip, they started — for the first time, it sounded like — to emphasize the energy efficiency improvements of the GPU. And the basic story the CEO told is that it would take about 73% less electricity, and a shorter period of time, to train AIs on this new chip than it did on their previous best-in-class chip. So that’s just one generation of GPU with nearly a three-quarters reduction in the amount of energy consumed per ... I don't know how you measure the units of large language model training, but per smarts trained into generative AI. So yeah, huge gains.

And one might say, well, can that continue forever? And I guess we should maybe get your thoughts on that. But it has continued at least for the last 10 to 20 years, and so there’s a lot of reason to believe there are continued gains to be made.
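As a quick sanity check on that figure, a 73% cut in energy per training run works out to roughly a 3.7x efficiency gain in a single GPU generation. The short sketch below simply restates Jenkins's quoted number as arithmetic; it is not a verified chip specification.

```python
# Restating the "73% less electricity" claim as an efficiency multiple.
# The 0.73 figure is just the number quoted in the conversation.
reduction = 0.73
efficiency_multiple = 1 / (1 - reduction)  # same work done with 73% less energy
print(f"Implied efficiency gain in one GPU generation: ~{efficiency_multiple:.1f}x")
```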

Koomey: Most people, when they think of efficiency, they think of Moore’s Law. They think of shrinking transistors. And anyone who follows this knows that every year or two, there’s another article about how Moore’s Law is ending, or slowing, or you know, it’s getting harder. And there’s no question about it, it’s absolutely getting harder and harder to shrink the transistors. But it turns out shrinking transistors is only one way to improve efficiency and performance. For a long time, the industry relied on that.

From the early days of microprocessors, starting in ’71, over time, they would ramp up the clock speed. And at the same time, they would ramp down the voltage of the chip. That was called Dennard scaling. It allowed them to keep ramping up performance without getting to crazy levels of leakage current and heat and melting the chip and the whole thing. That worked for a long time, until the early 2000s. And then they hit the threshold voltage for silicon, which is like one volt. So once you hit that, you can no longer do that trick. And they needed new tricks.
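A toy illustration of that dynamic, assuming the standard approximation that a chip's switching power scales as capacitance times voltage squared times clock frequency; the numbers below are made up for illustration, not real chip parameters.

```python
# Dynamic switching power is commonly approximated as P ~ C * V^2 * f.
def dynamic_power(c, v, f):
    return c * v**2 * f

# While supply voltage could still be lowered with each transistor shrink,
# clock frequency could rise without power blowing up:
p_old = dynamic_power(c=1.0, v=3.3, f=1.0)
p_new = dynamic_power(c=0.7, v=2.3, f=1.5)   # smaller, lower-voltage, faster chip
print(f"Power ratio with voltage scaling: {p_new / p_old:.2f}")   # comes out well under 1

# Once voltage hits the roughly one-volt floor Koomey mentions, cranking the
# clock raises power (and heat) roughly in proportion, so the old trick stops working:
p_base = dynamic_power(c=0.7, v=1.0, f=1.5)
p_stuck = dynamic_power(c=0.7, v=1.0, f=3.0)
print(f"Power ratio once voltage is stuck: {p_stuck / p_base:.2f}")  # ~2x for 2x the clock
```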

So what they did, as most of you who were around at that time will remember, was shift to multiple cores on a chip. That was an innovation in hardware architecture that allowed them, for a time, to improve efficiency by going to software that could run on multiple cores, so you could multiprocess various activities. So that’s one way you can improve things. You can also work on the software: you can improve the efficiency of the software, you can improve the algorithms that you use.

So Moore’s Law shrinkage of transistors hasn’t fully stopped, but even if it did, there are a lot of other things we can do. And AI in particular is relatively new. Basically, people threw a whole bunch of money at existing processors because there was this rush to deploy technology. But now, everyone’s stepping back and saying, well, look at the energy cost and the infrastructure cost. Is there a way to do this better? And sure, there definitely is, and Nvidia proved it in the presentation that you referred to.

This episode of Shift Key is sponsored by…

KORE Power provides the commercial, industrial, and utility markets with functional solutions that advance the clean energy transition worldwide. KORE Power's technology and manufacturing capabilities provide direct access to next generation battery cells, energy storage systems that scale to grid+, EV power & infrastructure, and intuitive asset management to unlock energy strategies across a myriad of applications. Explore more at korepower.com.

Watershed's climate data engine helps companies measure and reduce their emissions, turning the data they already have into an audit-ready carbon footprint backed by the latest climate science. Get the sustainability data you need in weeks, not months. Learn more at watershed.com.

Music for Shift Key is by Adam Kromelow.


Robinson Meyer

Robinson is the founding executive editor of Heatmap. He was previously a staff writer at The Atlantic, where he covered climate change, energy, and technology.
