Sparks
Air Quality Data for the Rich
Wealth bias shows up in the strangest places — including, according to new research, PurpleAir sensor data.
$100 billion. That’s how much the U.S. should be spending per year by 2050 to achieve net zero, according to a new Rhodium Group report.
Money seems to be pouring into the field of carbon removal from every direction. Every other week there’s an announcement about a new project. Multimillion-dollar carbon removal procurement deals are on the rise. The Department of Energy is rolling out grants as part of its $3.5 billion “direct air capture” hubs program and also funding research and development. Some carbon removal companies can even start claiming a $130 tax credit for every ton of CO2 they suck up and store underground.
The federal government alone spends just under $1 billion per year on carbon removal research, development, and deployment. According to a new report from the Rhodium Group, however, the U.S. is going to have to spend a lot more — roughly $100 billion per year by 2050 — if carbon dioxide removal, or CDR, is ever going to become a viable climate solution.
“The current level of policy support is nowhere near what's needed for CDR to play the role that people say it needs to play in solving climate change,” Jonathan Larsen, one of the authors, told me. “We wanted to reset the policy conversation with that in mind.”
Carbon removal is what’s implied by the “net” in net-zero — a way to compensate for whatever polluting activities are going to take longer to replace with clean solutions. It will be impossible to achieve net-zero emissions by 2050, either at the national or global level, without removing carbon from the atmosphere. But how much carbon removal will we need, and how do we make sure we’re ready to deploy it?
These questions are, in a sense, unique to the field. When we talk about cutting carbon emissions from buildings or transportation, experts are relatively confident in the set of solutions and the scale of the task — they know how many buildings and cars there are and can make reasonable estimates of growth rates.
But carbon removal is a moving target. We know how much we’re removing today — roughly 5 million metric tons, mostly from nature-based solutions like planting trees. Based on current policies, Rhodium estimates we could scale that up to about 50 million metric tons by 2035. But figuring out how much we need depends entirely on how successful we are at decarbonizing everything else. Even if we know we need to electrify all our cars, for example, no one can say whether that will happen by 2050, or at least not with any meaningful degree of certainty.
The Rhodium Group report attempts to narrow the range of this uncertainty so that policymakers can better attack the problem. The authors looked at a handful of different decarbonization roadmaps for the U.S. and found that the minimum amount of carbon removal needed to compensate for residual emissions in 2050 is 1 gigaton, which is the same as one billion metric tons, or a 20x increase from where current policies will get us. It's also equal to about 20% of the carbon that the U.S. emitted last year. “There's a very likely scenario where we need a lot more than that,” said Larsen. “There's scenarios where we need less. But most of the studies out there say at least a gigaton.”
Even if it’s only a rough estimate, landing on a number is useful, he told me. Rhodium Group spends a lot of time answering questions about, for example, what some new policy means for achieving Biden’s goal of cutting emissions in half by 2030. “I don't know if we’d get those questions if there wasn't a 50% target to shoot for,” he said. “So I think this way, people can be like, what does this next wave of policy support for CDR do for getting the U.S. on track for a gigaton?”
The level of investment it will take to get there is also highly uncertain. The authors did a quick back-of-the-envelope calculation to land on $100 billion by 2050: We need to be removing a minimum of one billion tons by then, and the Department of Energy has a goal to bring the cost of carbon removal down to $100 per ton.
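That back-of-the-envelope calculation can be written out explicitly. A minimal sketch, using only the two figures cited in the report (a minimum of one billion tons removed per year by 2050, and DOE’s target cost of $100 per ton):

```python
# Rhodium's rough 2050 spending estimate, reproduced from the article's figures.
tons_removed_per_year = 1_000_000_000  # 1 gigaton = 1 billion metric tons
target_cost_per_ton = 100              # DOE's cost goal, in dollars per ton

annual_spend = tons_removed_per_year * target_cost_per_ton
print(f"${annual_spend / 1e9:.0f} billion per year")  # -> $100 billion per year
```

If the DOE cost target isn’t met, the same arithmetic scales linearly: at $200 per ton, the bill doubles.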
The meat of the new report focuses on how to bridge the gap between the roughly $1 billion we spend today and $100 billion, which starts, according to the authors, with treating carbon removal as a public service. It’s not like other climate solutions such as wind turbines or heat pumps, they write, which can rely on private markets to provide predictable demand or to stimulate innovation. “There are very few pathways one can envision where the private sector is going to both scale and deliver those tons,” Larsen told me. Voluntary carbon removal purchases by companies could play a role, he said, but they will not be big enough to get to a gigaton.
Rhodium recommends expanding and extending many of the federal policy programs that already exist — by, for example, providing more R&D funding, doing more government procurement, handing out more loan guarantees, and creating more “hubs” centered on approaches besides direct air capture, like enhanced weathering or biomass burial. Right now, the tax credit for capturing carbon from the air and burying it underground can only be claimed for 12 years, and projects have to start construction by 2032. The authors call for extending the claim period and pushing back the construction-start deadline. They also recommend expanding the program to apply to a wider range of carbon removal methods.
A common criticism of government support for carbon removal is that policy makers will over-rely on it. If we aim to do 1 gigaton of carbon removal, does that mean we won’t cut emissions as much as we could have? What happens if, for whatever reason, we can’t achieve the 1 gigaton?
Larsen disagreed with that framing. For one, it’s easy to turn it around: If we don’t scale up the capacity to remove carbon, and we also don’t eliminate emissions by mid-century, we’re not even going to have the option to halt climate change at that point.
But also, decarbonization shouldn’t stop in 2050, he said. If we can achieve that 1 gigaton of annual removal and then keep cutting emissions from remaining sources, we could eventually get to net-negative emissions — even without more CDR. In other words, if we reach a point where we’re removing more than we’re emitting, we could start to reverse global warming, not just stop it.
“I know that's, like, sci-fi,” he told me. “But that's ultimately where we as a species have to go and that’s why setting a target here of at least a gigaton, to me, does not take away the need to reduce elsewhere.”
Will the rise of machine learning and artificial intelligence break the climate system? In recent months, utilities and tech companies have argued that soaring use of AI will overwhelm electricity markets. Is that true — or is it a sales pitch meant to build more gas plants? And how much electricity do data centers and AI use today?
In this week’s episode, Rob and Jesse talk to Jonathan Koomey, an independent researcher, lecturer, and entrepreneur who studies the energy impacts of the internet and information technology. We discuss why AI may not break the electricity system and the long history of anxiety over computing’s energy use. Shift Key is hosted by Robinson Meyer, executive editor of Heatmap, and Jesse Jenkins, a Princeton professor of energy systems engineering.
Subscribe to “Shift Key” and find this episode on Apple Podcasts, Spotify, Amazon, or wherever you get your podcasts.
You can also add the show’s RSS feed to your podcast app to follow us directly.
Here is an excerpt from our conversation:
Robinson Meyer: Before we go any further — and I think you just hinted at your answer here — I want to tackle this directly: I think people look at the hockey-stick graphs for AI use, and they look at current energy use for AI, and they look at load growth data coming from the utilities, and they go, “Oh my gosh, AI is going to absolutely overrun our energy system. It’s going to cause emissions to shoot up,” because again, this is just extrapolating from what’s recent.
But of course, part of the whole AI mythos is that once it starts, you can’t stop it. There is a story out there that, frankly, you see as much from folks who are worried about the climate as you do from AI boosters, which is that very soon, we’re going to be using a huge amount of energy on AI. And I want to ask you this directly: Should we be worried about, number one, AI overrunning the energy system? Or number two, AI causing a massive spike in carbon emissions that dooms us to, let’s say, blow past 2.5 degrees Celsius and use up the rest of our carbon budget? Is that something you’re worried about? And just how do you think about this?
Jonathan Koomey: Everyone needs to calm the heck down. So we talked about the original baseline, right? So the baseline, data centers are 1% of the world's electricity. And maybe AI now is 0.1%, right? For Google, it’s 0.15%, whatever. But 10% of the 1% is AI.
So let’s say that doubles — let’s say that triples in the next few years, or even goes up fivefold. That gets to about half a percent. So I think it will pale in comparison to the other growth drivers that Jesse was talking about in electrification. Because if you think about light vehicles, if you electrified all light vehicles in the U.S., that’s like a 20% or 25% increase in electricity consumption. And if you did that over 20 years, that’s like 1-ish% per year. Right? So that's, that to me is a very credible thing that’s likely to happen. And then when you add heat pumps, you add industrial electrification, a lot more.
I think there will be local impacts. There will be some places where AI and data centers more generally will be important and will drive load growth, but it is not a national story. It is a local story. And so a place like Ireland that has, I think at last count 17%, 18% of its load from data centers, if that grows, that could give them real challenges. Same thing, Loudoun County in Virginia. But you really do have to separate the national story or the global story from the local story.
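Koomey’s comparison can be reproduced with a few lines of arithmetic. All figures below are the rough, rounded numbers quoted in the conversation, not measured data:

```python
# Ballpark shares of world electricity, as quoted in the conversation.
data_center_share = 0.01   # data centers: ~1% of world electricity
ai_fraction_of_dc = 0.10   # AI: ~10% of the data-center share, i.e. ~0.1% overall

ai_share_today = data_center_share * ai_fraction_of_dc
ai_share_5x = ai_share_today * 5  # even a fivefold jump lands around half a percent
print(f"AI today: {ai_share_today:.1%}; after 5x growth: {ai_share_5x:.1%}")

# Versus electrifying all U.S. light vehicles: a 20-25% increase in U.S.
# electricity use spread over 20 years works out to roughly 1% per year.
ev_increase = 0.22  # assumed midpoint of the 20-25% range quoted
per_year = (1 + ev_increase) ** (1 / 20) - 1
print(f"EV electrification: ~{per_year:.1%} per year over 20 years")
```

The point of the sketch is the contrast: even aggressive AI growth adds tenths of a percent globally, while routine electrification adds about a point of demand growth every year.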
Jesse Jenkins: I think it was just about a week ago that Nvidia, which is the leading producer of the graphics processing units that have become the main workhorse chips for generative AI computing, released their new best-in-class chip. And as they revealed that chip, they — for the first time, it sounded like — started to emphasize the energy efficiency improvements of the GPU. And the basic story the CEO told is that it would take about 73% less electricity and a shorter period of time to train AIs on this new chip than it did on their previous best-in-class chip. So that’s just one generation of GPU with a nearly three-quarters reduction in the amount of energy consumed per ... I don’t know how you measure the units of large language model training, but per smarts trained into generative AI. So yeah, huge gains.
And one might say, well, can that continue forever? And I guess we should maybe get your thoughts on that. But it has continued at least for the last 10 to 20 years. And so there’s a lot of reason to believe that there’s continued gains to be made.
Koomey: Most people, when they think of efficiency, they think of Moore’s Law. They think of shrinking transistors. And anyone who follows this knows that every year or two, there’s another article about how Moore’s Law is ending, or slowing, or you know, it’s getting harder. And there’s no question about it, it’s absolutely getting harder and harder to shrink the transistors. But it turns out shrinking transistors is only one way to improve efficiency and performance. For a long time, the industry relied on that.
From the early days of microprocessors, starting in ’71, they would ramp up the clock speed over time. And at the same time, they would ramp down the voltage of the chip. That was called Dennard scaling. It allowed them to keep ramping up performance without getting to crazy levels of leakage current and heat and melting the chip and the whole thing. That worked for a long time, until the early 2000s. And then they hit the threshold voltage for silicon, which is like one volt. So once you hit that, you can no longer do that trick. And they needed new tricks.
So what they did — most of you who were around at that time will remember — was shift to multiple cores on a chip. That was an innovation in hardware architecture that allowed them, for a time, to improve efficiency by going to software that could run on multiple cores, so you could multiprocess various activities. So that’s one way you can improve things. You can also work on the software — you can improve the efficiency of the software, you can improve the algorithms that you use.
So even if Moore’s Law shrinkage of transistors stops — and it hasn’t fully stopped — there are a lot of other things we can do. And AI in particular is relatively new. Basically, people threw a whole bunch of money at existing processors because there was a rush to deploy the technology. But now, everyone’s stepping back and saying, well, look at the energy cost and the infrastructure cost. Is there a way to do this better? And sure, there definitely is, and Nvidia proved it in the presentation that you referred to.
This episode of Shift Key is sponsored by…
KORE Power provides the commercial, industrial, and utility markets with functional solutions that advance the clean energy transition worldwide. KORE Power's technology and manufacturing capabilities provide direct access to next generation battery cells, energy storage systems that scale to grid+, EV power & infrastructure, and intuitive asset management to unlock energy strategies across a myriad of applications. Explore more at korepower.com.
Watershed's climate data engine helps companies measure and reduce their emissions, turning the data they already have into an audit-ready carbon footprint backed by the latest climate science. Get the sustainability data you need in weeks, not months. Learn more at watershed.com.
Music for Shift Key is by Adam Kromelow.
© 2024 Heatmap News Inc. All Rights Reserved.