
Podcast

A Skeptic’s Take on AI and Energy Growth

Inside episode 10 of Shift Key.


Will the rise of machine learning and artificial intelligence break the climate system? In recent months, utilities and tech companies have argued that soaring use of AI will overwhelm electricity markets. Is that true — or is it a sales pitch meant to build more gas plants? And how much electricity do data centers and AI use today?

In this week’s episode, Rob and Jesse talk to Jonathan Koomey, an independent researcher, lecturer, and entrepreneur who studies the energy impacts of the internet and information technology. We discuss why AI may not break the electricity system and the long history of anxiety over computing’s energy use. Shift Key is hosted by Robinson Meyer, executive editor of Heatmap, and Jesse Jenkins, a Princeton professor of energy systems engineering.

Subscribe to “Shift Key” and find this episode on Apple Podcasts, Spotify, Amazon, or wherever you get your podcasts.

You can also add the show’s RSS feed to your podcast app to follow us directly.

Here is an excerpt from our conversation:

Robinson Meyer: Before we go any further — and I think you just hinted at your answer, here, but I want to tackle it directly — which is that I think people look at the hockey stick graphs for AI use, and they look at current energy use for AI, and they look at load growth data coming from the utilities, and they go, “Oh my gosh, AI is going to absolutely overrun our energy system. It’s going to cause emissions to shoot up,” because again, this is just extrapolating from what’s recent.

But of course, part of the whole AI mythos is, like, once it starts, you can’t stop it. There is a story out there that, frankly, you see as much from folks who are worried about the climate as you do from AI boosters, which is that very soon, we’re going to be using a huge amount of energy on AI. And I want to ask you this directly: Should we be worried about AI, number one, overrunning the energy system? Or number two, AI causing a massive spike in carbon emissions that dooms us to, let’s say, pass 2.5 degrees Celsius and uses up the rest of our carbon budget? Is that something you’re worried about? And just how do you think about this?

Jonathan Koomey: Everyone needs to calm the heck down. So we talked about the original baseline, right? So the baseline, data centers are 1% of the world's electricity. And maybe AI now is 0.1%, right? For Google, it’s 0.15%, whatever. But 10% of the 1% is AI.

So let’s say that doubles — let’s say that triples in the next few years, or even goes up fivefold. That gets to about half a percent. So I think it will pale in comparison to the other growth drivers that Jesse was talking about in electrification. Because if you think about light vehicles, if you electrified all light vehicles in the U.S., that’s like a 20% or 25% increase in electricity consumption. And if you did that over 20 years, that’s like 1-ish% per year. Right? That, to me, is a very credible thing that’s likely to happen. And then when you add heat pumps, you add industrial electrification, a lot more.
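Koomey’s back-of-the-envelope numbers can be sanity-checked in a few lines. The inputs below — data centers at roughly 1% of world electricity, AI at roughly 10% of that, a hypothetical fivefold increase, and a 25% demand bump from electrifying U.S. light vehicles over 20 years — are taken from the conversation; the code just does the multiplication and compounding.

```python
# Sanity check of the load-growth arithmetic from the conversation above.

# AI today: roughly 10% of data centers' ~1% share of world electricity.
ai_share_today = 0.01 * 0.10            # ~0.1% of world electricity
ai_share_fivefold = ai_share_today * 5  # the hypothetical fivefold increase
print(f"AI at 5x today's usage: {ai_share_fivefold:.1%} of world electricity")
# → AI at 5x today's usage: 0.5% of world electricity

# Electrifying all U.S. light vehicles: ~20-25% more electricity demand.
# Spread over 20 years, that compounds to roughly 1% growth per year.
total_increase = 0.25
years = 20
annual_growth = (1 + total_increase) ** (1 / years) - 1
print(f"Annualized EV-driven load growth: {annual_growth:.2%} per year")
# → Annualized EV-driven load growth: 1.12% per year
```

Even a fivefold jump in AI’s share lands at about half a percent of world electricity, which is the comparison Koomey is drawing against the roughly 1%-per-year growth from vehicle electrification alone.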

I think there will be local impacts. There will be some places where AI and data centers more generally will be important and will drive load growth, but it is not a national story. It is a local story. And so a place like Ireland that gets, I think at last count, 17% or 18% of its load from data centers — if that grows, that could give them real challenges. Same thing with Loudoun County in Virginia. But you really do have to separate the national story or the global story from the local story.

Jesse Jenkins: I think it was just about a week ago that Nvidia, which is the leading producer of the graphics processing units that have now become the main workhorse chips for generative AI computing, released their new best-in-class chip. And as they revealed that chip, they — for the first time, it sounded like — started to emphasize the energy efficiency improvements of the GPU. And the basic story the CEO told is that it would take about 73% less electricity and a shorter period of time to train AIs on this new chip than it did on their previous best-in-class chip. So that’s just one generation of GPU with nearly a three-quarters reduction in the amount of energy consumed per ... I don’t know how you measure the units of large language model training, but per smarts trained into generative AI. So yeah, huge gains.

And one might say, well, can that continue forever? And I guess we should maybe get your thoughts on that. But it has continued at least for the last 10 to 20 years. And so there’s a lot of reason to believe that there’s continued gains to be made.
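As a rough illustration of why per-generation gains like this matter: Nvidia’s claim implies the new chip needs about 0.27x the energy of its predecessor for the same training job. The compounding below is purely hypothetical — nothing guarantees the 73% figure repeats in future generations — but it shows how quickly such gains would stack up if anything like that rate continued.

```python
# Hypothetical compounding of a 73%-per-generation energy reduction.
# The single-generation figure is Nvidia's claim from the keynote;
# assuming it repeats is an illustration, not a prediction.
energy_ratio = 1 - 0.73  # new chip uses ~0.27x the energy per training run

for generation in range(1, 4):
    remaining = energy_ratio ** generation
    print(f"After {generation} generation(s): {remaining:.1%} of original energy")
# → After 1 generation(s): 27.0% of original energy
# → After 2 generation(s): 7.3% of original energy
# → After 3 generation(s): 2.0% of original energy
```

Three such generations would cut the energy for a fixed training workload by a factor of about fifty, which is the kind of efficiency trend Koomey argues has historically offset demand growth.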

Koomey: Most people, when they think of efficiency, they think of Moore’s Law. They think of shrinking transistors. And anyone who follows this knows that every year or two, there’s another article about how Moore’s Law is ending, or slowing, or you know, it’s getting harder. And there’s no question about it, it’s absolutely getting harder and harder to shrink the transistors. But it turns out shrinking transistors is only one way to improve efficiency and performance. For a long time, the industry relied on that.

From the early days of microprocessors, starting in ’71, over time, they would ramp up the clock speed. And at the same time, they would ramp down the voltage of the chip. And that was called Dennard scaling. It allowed them to keep ramping up performance without getting to crazy levels of leakage current and heat and melting the chip and the whole thing. That worked for a long time, until the early 2000s. And then they hit the threshold voltage for silicon, which is like one volt. So once you hit that, you can no longer do that trick. And they needed new tricks.
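The mechanism Koomey describes can be summarized with the standard dynamic-power relation for CMOS chips, P ∝ C·V²·f: power scales with capacitance, the square of voltage, and clock frequency. The sketch below uses the textbook ~0.7x-per-generation scaling factors as illustrative inputs, not measured values, to show why lowering voltage let designers raise clock speed at roughly constant power density.

```python
# Illustrative sketch of Dennard scaling. Dynamic power in a CMOS chip
# goes roughly as P ~ C * V^2 * f. The 0.7x factors below are the
# classic per-generation scaling values, used here for illustration only.

def dynamic_power(capacitance: float, voltage: float, frequency: float) -> float:
    """Dynamic switching power, in arbitrary units."""
    return capacitance * voltage**2 * frequency

# One process generation: features shrink ~0.7x, so per-transistor
# capacitance and voltage drop ~0.7x while frequency can rise ~1.4x.
p_old = dynamic_power(capacitance=1.0, voltage=1.0, frequency=1.0)
p_new = dynamic_power(capacitance=0.7, voltage=0.7, frequency=1.4)

# Per-transistor power falls to ~0.48x; with ~2x the transistors per
# unit area, total power density stays roughly constant.
print(f"Per-transistor power ratio: {p_new / p_old:.2f}")
# → Per-transistor power ratio: 0.48
```

Once the supply voltage neared the silicon threshold of about one volt, as Koomey notes, the V² term could no longer shrink, so further frequency increases raised power directly, which is what forced the industry to look for new tricks.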

So what they did, as most of you who were around at that time will remember, was make this big shift to multiple cores on a chip. That was an innovation in hardware architecture that allowed them, for a time, to improve efficiency by going to software that could run on multiple cores, so you could multiprocess various activities. So that’s one way you can improve things. You can also work on the software — you can improve the efficiency of the software, you can improve the algorithms that you use.

So even if Moore’s Law shrinkage of transistors stops, which it hasn’t fully done, there are a lot of other things we can do. And AI in particular is relatively new. Basically, people threw a whole bunch of money at existing processors because there was this rush to deploy technology. But now, everyone’s stepping back and saying, well, look at the energy cost and the infrastructure cost. Is there a way to do this better? And sure, there definitely is, and Nvidia proved it in the presentation that you referred to.

This episode of Shift Key is sponsored by…

KORE Power provides the commercial, industrial, and utility markets with functional solutions that advance the clean energy transition worldwide. KORE Power's technology and manufacturing capabilities provide direct access to next generation battery cells, energy storage systems that scale to grid+, EV power & infrastructure, and intuitive asset management to unlock energy strategies across a myriad of applications. Explore more at korepower.com.

Watershed's climate data engine helps companies measure and reduce their emissions, turning the data they already have into an audit-ready carbon footprint backed by the latest climate science. Get the sustainability data you need in weeks, not months. Learn more at watershed.com.

Music for Shift Key is by Adam Kromelow.


Much of the world is once again asking whether fossil fuels are as reliable as they thought — not because power plants are tripping off or wellheads are freezing up, but because terawatt-hours’ worth of energy are currently stuck outside the Strait of Hormuz in oil tankers and liquefied natural gas carriers.

The current crisis in many ways echoes the 2022 energy cataclysm, kicked off when Russia invaded Ukraine. Then, oil, gas, and commodity prices immediately spiked across the globe, forcing Europe to reorient its energy supplies away from Russian gas and leaving developing countries in a state of energy poverty as they could not afford to import suddenly dear fuels.

Climate Tech

Funding Friday: Tom Steyer Makes a Real Estate Play

On Galvanize’s latest fund strategy and more of the week’s big money moves.


This week brings encouraging news for companies on land and offshore, from the Netherlands to East Africa. First up — and in spite of a federal administration that appears to be actively hostile toward residential and commercial electrification and energy efficiency measures — California gubernatorial candidate Tom Steyer’s investment firm Galvanize just closed a fund devoted to decarbonizing real estate. Elsewhere, we have a Dutch startup pursuing a novel approach to clean heat production, a former Tesla exec rolling out electric motorbikes in East Africa, and an offshore wind developer planning to pair its floating platform with underwater data centers.

Galvanize Raises $370 Million Fund for Energy-Resilient Real Estate

With electricity costs on the rise and war in Iran pushing energy prices further upward, energy efficiency measures are looking more prudent — and more profitable — than ever. Against this backdrop, the asset manager and venture firm Galvanize announced the close of its first real estate fund, bringing in $370 million as the firm looks to make commercial buildings cleaner and better able to weather price fluctuations in global energy markets.

Q&A

How to Sell Rural America on Data Centers

A conversation with Center for Rural Innovation founder and Vermont native Matt Dunne.


This week’s conversation is with Matt Dunne, founder of the nonprofit Center for Rural Innovation, which focuses on technology, social responsibility, and empowering small, economically depressed communities.

Dunne was born and raised in Vermont, where he still lives today. He was a state legislator in the Green Mountain State for many years. I first became familiar with his name when I was in college at the state’s public university, reporting on his candidacy for the Democratic gubernatorial nomination in 2016. Dunne ultimately lost a tight race to Sue Minter, who then lost to current governor Phil Scott, a Republican.
