Think of all the stuff you use electricity for that you didn't 20 or 25 years ago — all those devices, maybe even your car — and yet electricity use has barely budged this century. In 2000, the country used about 4 million gigawatt-hours of electricity, according to the International Energy Agency; in 2022, it used about 4.5 million GWh, an annual growth rate of about 0.5%.
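(For the curious, here's the back-of-envelope math behind that half-percent figure, using the approximate IEA numbers above:)

```python
# Growth rate implied by the IEA figures cited in the text:
# ~4 million GWh in 2000 vs. ~4.5 million GWh in 2022.
start_gwh = 4_000_000
end_gwh = 4_500_000
years = 22

# Compound annual growth rate: (end/start)^(1/years) - 1
cagr = (end_gwh / start_gwh) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.2%}")  # about 0.5% per year
```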
In some ways, the purpose of current U.S. climate policy is to reverse this trend. Only about a fifth of all energy produced in the United States is electrical. Removing carbon emissions from transportation, heating and industry will require first converting all of those industries from running on combusted hydrocarbons to running on electricity — while at the same time, of course, working to make electricity generation carbon-free.
All that is to say, we’re definitely going to be using more electricity. Today, if you ask any utility, electricity market organization, or anyone working on energy generation and transmission, they’ll tell you we’re in for an era of load growth.
“For a long period of time, we could balance out additional demand with efficiency improvements,” Xan Fishman, energy policy director at the Bipartisan Policy Center, told me. “Recent forecasts are showing we’re going to need a lot more electricity.”
When Grid Strategies LLC looked at documents grid planners filed with federal regulators, it found that their aggregate five-year load growth forecasts had gone up from 2.6% in 2022 to 4.7% last year, while their forecast for peak demand, i.e., the maximum amount of power grids plan to be able to provide, had shot up by 18 GW. That’s the equivalent of about 35 gas-fired power plants running at full blast.
In New England, for example, ISO-NE is forecasting 2.4% annual growth over the next 10 years, with winter peak demand growing by 3% per year, driven largely by the electrification of transportation and heating; that, in turn, reflects the aggressive decarbonization mandates of the region’s constituent states.
Not all of the demand growth we’re currently seeing comes from electrifying our existing energy consumption. New sources of demand are popping up all over the grid, and where they come from new industrial uses, they show how the Biden administration’s combined climate and industrial policy raises the bar for itself. As a result of domestic content requirements for tax subsidies and explicit subsidies for certain kinds of non-energy manufacturing (namely semiconductors), manufacturing construction has shot up in the past few years. And these new plants require huge amounts of electricity.
When PJM Interconnection, the 13-state East Coast and Midwest electricity market, was making its load forecast, it specifically called out Intel’s CHIPS Act-funded facility under construction outside Columbus, Ohio; the electrification of New Jersey ports funded by the Inflation Reduction Act; and planned data centers in Maryland and Virginia as notable examples of new load. For AEP, the utility serving Columbus, the forecast peak summer load in 2030 has gone from about 23.5 GW to 26 GW, compared to around 21 GW in 2023. Dominion, the utility serving Virginia and the booming Loudoun County data center complex, forecasts annual load growth of around 5% over the next decade.
To get a sense of how tremendous that is, when the energy system researchers with Princeton University’s REPEAT project wanted to project how much electricity consumption would have to increase annually to reach net zero by 2050, it turned out to be “only” 2.4%. Virginia is planning load growth at twice that rate just to feed electrons to its data centers.
“When you’re talking about a data center or a three-shift, seven-day-a-week manufacturing process, that’s far less manageable” than, say, electric cars, David Porter, vice president of electrification and sustainable energy strategy at the Electric Power Research Institute, told me. EVs can be powered at specific times based on demand for electricity across the grid, or by a distributed energy resource like residential solar and batteries. To power energy-hungry manufacturing processes, though, requires the kind of consistency that only fossil fuels and nuclear (or naturally limited renewables like hydropower) have historically been able to provide.
There’s no better example of the tension between electrification and emission reductions than in Georgia, where the state’s main utility, Georgia Power, has said that its estimate for load growth between 2023 and 2031 has jumped from less than 400 megawatts to 6,600 megawatts, a roughly seventeenfold increase. The utility attributed the forecasting hike to “rapid economic expansion and an unprecedented increase in the demand for energy to the state,” including electric vehicle and battery manufacturing facilities, whose products the Biden administration has done much to boost demand for and whose construction in the United States it has worked to encourage.
The utility also said that to serve this load growth, it would have to add new renewable resources, acquire power from other utilities and generators, and build new gas power plants, which immediately raised the ire and suspicion of green groups. The Sierra Club described the request as “shocking.”
But proponents of climate action shouldn’t necessarily despair at this new load, Fishman told me. “It’s really easy to decarbonize if you stop building stuff,” he said. “But [Americans] would likely keep buying stuff, and that stuff would be built elsewhere, quite likely with greater emissions intensity.”
In other words, “a resurgence of American manufacturing might lead to more U.S. emissions than in a scenario where we aren’t increasing our manufacturing base,” Fishman told me, but it’s “highly likely to reduce global emissions.” That’s because even now, U.S. electricity is cleaner than electricity in, for example, China, which is still heavily reliant on coal. (According to the IEA, 63% of China's electricity comes from coal burning, compared to 20% in the United States.)
Data centers, meanwhile, are expected to account for 6% of total electricity demand in the U.S. by 2026, according to the IEA, up from about 4% in 2022. And the AI ones will eat up even more: A ChatGPT query is about nine times as energy intensive as a Google search, according to the IEA. If generative artificial intelligence grows at anywhere near the rate that its proponents expect, it will lead to hefty increases in electricity demand, both from manufacturing the chips needed to power the systems and the electricity to power them. One example is Silicon Valley Power, a utility serving, well, Silicon Valley, which forecast load to double by 2035, “primarily” due to data centers’ demand for electricity.
But there may be some reason for skepticism about these load growth projections from data centers, Jon Koomey, a veteran information technology and energy researcher, told me. The particularly energy intensive large language models may not win out as a business, which would slow the growth in data center electricity demand, he said. And even if data centers continue to grow, they could also get far more efficient in how they use electricity — and might just end up using less than what they ask for from utilities.
“You don’t want to get caught short,” Koomey said, explaining why requests for power will be biased on the high end. “There’s an incentive for everyone to request more.”
But still, it’s no surprise that the companies at the heart of the data center boom — Google, Microsoft, and OpenAI — have shown an interest in finding ways to match that constant electricity demand with non-carbon-emitting power. Their facilities need to be powered 24/7, which existing renewable sources largely struggle to provide. (It’s neither windy nor sunny 100% of the time.) This has led to a flurry of investment and dealmaking by these companies to develop and procure “clean firm” resources. Google has a deal with Fervo, the enhanced geothermal startup, to purchase power generated by its operation in Nevada, while Microsoft signed an agreement with Constellation to purchase nuclear-generated electricity for its Virginia data centers to complement its existing renewable power. Silicon Valley Power also said in its planning documents that it’s looking to acquire more geothermal resources. And OpenAI’s Sam Altman has invested in a fusion company.
“If we want to grow our manufacturing base we need the energy to make that work, we need to get that energy to those new manufacturing plants,” Fishman said. “It would be bad if we had a bunch of companies who said, ‘We want to build a factory,’ and can’t because they don’t get enough electricity.”
Instead of rocket fuel, they’re burning biomass.
Arbor Energy might have the flashiest origin story in cleantech.
After the company’s CEO, Brad Hartwig, left SpaceX in 2018, he attempted to craft the ideal resume for a future astronaut, his dream career. He joined the California Air National Guard, worked as a test pilot at the now-defunct electric aviation startup Kitty Hawk, and participated in volunteer search and rescue missions in the Bay Area, which gave him a front row seat to the devastating effects of wildfires in Northern California.
That experience changed everything. “I decided I actually really like planet Earth,” Hartwig told me, “and I wanted to focus my career instead on preserving it, rather than trying to leave it.” So he rallied a bunch of his former rocket engineer colleagues to repurpose technology they pioneered at SpaceX to build a biomass-fueled, carbon negative power source that’s supposedly about ten times smaller, twice as efficient, and eventually, one-third the cost of the industry standard for this type of plant.
Take that, all you founders humble-bragging about starting in a dingy garage.
“It’s not new science, per se,” Hartwig told me. The goal of this type of tech, called bioenergy with carbon capture and storage, is to combine biomass-based energy generation with carbon dioxide removal to achieve net negative emissions. Sounds like a dream, but actually producing power or heat from this process has so far proven too expensive to really make sense. There are only a few so-called BECCS facilities operating in the U.S. today, and they’re all just ethanol fuel refineries with carbon capture and storage technology tacked on.
But the advances in 3D printing and computer modeling that allowed the SpaceX team to build an increasingly simple and cheap rocket engine have allowed Arbor to move quickly into this new market, Hartwig explained. “A lot of the technology that we had really pioneered over the last decade — in reactor design, combustion devices, turbo machinery, all for rocket propulsion — all that technology has really quite immediate application in this space of biomass conversion and power generation.”
Arbor’s method is poised to be a whole lot sleeker and cheaper than the BECCS plants of today, enabling both more carbon sequestration and actual electricity production, all by utilizing what Hartwig fondly refers to as a “vegetarian rocket engine.” Because there’s no air in space, astronauts have to bring pure oxygen onboard, which the rocket engines use to burn fuel and propel themselves into the stratosphere and beyond. Arbor simply subs out the rocket fuel for biomass. When that biomass is combusted with pure oxygen, the resulting exhaust consists of just CO2 and water. As the exhaust cools, the water condenses out, and what’s left is a stream of pure carbon dioxide that’s ready to be injected deep underground for permanent storage. All of the energy required to operate Arbor’s system is generated by the biomass combustion itself.
“Arbor is the first to bring forward a technology that can provide clean baseload energy in a very compact form,” Clea Kolster, a partner and head of science at Lowercarbon Capital, told me. Lowercarbon is an investor in Arbor, alongside other climate tech-focused venture capital firms including Gigascale Capital and Voyager Ventures, but the company has not yet disclosed how much it’s raised.
Last month, Arbor signed a deal with Microsoft to deliver 25,000 tons of permanent carbon dioxide removal to the tech giant starting in 2027, when the startup’s first commercial project is expected to come online. As a part of the deal, Arbor will also generate 5 megawatts of clean electricity, enough to power about 4,000 U.S. homes. And just a few days ago, the Department of Energy announced that Arbor is one of 11 projects to receive a combined total of $58.5 million to help develop the domestic carbon removal industry.
Arbor’s current plan is to source biomass from forestry waste, much of which is generated by forest thinning operations intended to prevent destructive wildfires. Hartwig told me that for every ton of organic waste, Arbor can produce about one megawatt hour of electricity, which is in line with current efficiency standards, plus about 1.8 tons of carbon removal. “We look at being as efficient, if not a little more efficient than a traditional bioenergy power plant that does not have carbon capture on it,” he explained.
The company’s carbon removal price targets are also extremely competitive — in the $50 to $100 per ton range, Hartwig said. Compare that to something like direct air capture, which today exceeds $600 per ton, or enhanced rock weathering, which is usually upwards of $300 per ton. “The power and carbon removal they can offer comes at prices that meet nearly unlimited demand,” Mike Schroepfer, the founder of Gigascale Capital and former CTO of Meta, told me via email. Arbor benefits from the fact that the electricity it produces and sells can help offset the cost of the carbon removal, and vice versa. So if the company succeeds in hitting its cost and efficiency targets, Hartwig said, this “quickly becomes a case for, why wouldn’t you just deploy these everywhere?”
Initial customers will likely be (no surprise here) the Microsofts, Googles and Metas of the world — hyperscalers with growing data center needs and ambitious emissions targets. “What Arbor unlocks is basically the ability for hyperscalers to stop needing to sacrifice their net zero goals for AI,” Kolster told me. And instead of languishing in the interminable grid interconnection queue, Hartwig said that providing power directly to customers could ensure rapid, early deployment. “We see it as being quicker to power behind-the-meter applications, because you don’t have to go through the process of connecting to the grid,” he told me. Long-term though, he said grid connection will be vital, since Arbor can provide baseload power whereas intermittent renewables cannot.
All of this could serve as a much cheaper alternative to, say, re-opening shuttered nuclear facilities, as Microsoft also recently committed to doing at Three Mile Island. “It’s great, we should be doing that,” Kolster said of the nuclear deal, “but there’s actually a limited pool of options to do that, and unfortunately, there is still community pushback.”
Currently, Arbor is working to build out its pilot plant in San Bernardino, California, which Hartwig told me will turn on this December. And by 2030, the company plans to have its first commercial plant operating at scale, generating 100 megawatts of electricity while removing nearly 2 megatons of CO2 every year. “To put it in perspective: In 2023, the U.S. added roughly 9 gigawatts of gas power to the grid, which generates 18 to 23 megatons of CO2 a year,” Schroepfer wrote to me. So having just one Arbor facility removing 2 megatons would make a real dent. The first plant will be located in Louisiana, where Arbor will also be working with an as-yet-unnamed partner to do the carbon storage.
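Those commercial-scale numbers are roughly consistent with the per-ton yields Hartwig cited earlier, about 1 megawatt-hour of electricity and 1.8 tons of removal per ton of waste. A back-of-envelope check, assuming the plant runs near full capacity year-round:

```python
# Does a 100 MW plant imply ~2 megatons per year of removal at Arbor's
# stated yields (~1 MWh and ~1.8 tons of CO2 removed per ton of waste)?
capacity_mw = 100
hours_per_year = 8760            # assumes near-continuous operation
mwh_per_ton_waste = 1.0          # Hartwig's figure
removal_per_ton_waste = 1.8      # tons of CO2 removed per ton of waste

annual_mwh = capacity_mw * hours_per_year          # 876,000 MWh
waste_tons = annual_mwh / mwh_per_ton_waste        # ~876,000 tons of biomass
removal_tons = waste_tons * removal_per_ton_waste  # ~1.6 million tons of CO2
print(f"Implied removal: {removal_tons / 1e6:.1f} Mt CO2/yr")
```

That lands around 1.6 megatons a year, in the same ballpark as the “nearly 2 megatons” target.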
The company’s carbon credits will be verified with the credit certification platform Isometric, which is also backed by Lowercarbon and thought to have the most stringent standards in the industry. Hartwig told me that Arbor worked hand-in-hand with Isometric to develop the protocol for “biogenic carbon capture and storage,” as the company is the first Isometric-approved supplier to use this standard.
But Hartwig also said that government support hasn’t yet caught up to the tech’s potential. While the Inflation Reduction Act provides direct air capture companies with $180 per ton of carbon dioxide removed, technology such as Arbor’s only qualifies for $85 per ton. It’s not nothing — more than the zero dollars enhanced rock weathering companies such as Lithos or bio-oil sequestration companies such as Charm are getting. “But at the same time, we’re treated the same as if we’re sequestering CO2 emissions from a natural gas plant or a coal plant,” Hartwig told me, as opposed to getting paid for actual CO2 removal.
“I think we are definitely going to need government procurement or involvement to actually hit one, five, 10 gigatons per year of carbon removal,” Hartwig said. Globally, scientists estimate that we’ll need up to 10 gigatons of annual CO2 removal by 2050 in order to limit global warming to 1.5 degrees Celsius. “Even at $100 per ton, 10 gigatons of carbon removal is still a pretty hefty price tag,” Hartwig told me. A $1 trillion price tag, to be exact. “We definitely need more players than just Microsoft.”
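The arithmetic behind that price tag is simple: 10 gigatons at $100 per ton comes to $1 trillion a year.

```python
# 10 gigatons of CO2 removal priced at $100 per ton
tons = 10 * 1e9          # 10 gigatons, in tons
price_per_ton = 100      # dollars, the optimistic end of cost projections
total_cost = tons * price_per_ton
print(f"${total_cost / 1e12:.0f} trillion per year")  # $1 trillion
```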
New research out today shows a 10-fold increase in the share of wildfire smoke deaths attributable to climate change from the 1960s to the 2010s.
If you are one of the more than 2 billion people on Earth who have inhaled wildfire smoke, then you know firsthand that it is nasty stuff. It makes your eyes sting and your throat sore and raw; breathe in smoke for long enough, and you might get a headache or start to wheeze. Maybe you’ll have an asthma attack and end up in the emergency room. Or maybe, in the days or weeks afterward, you’ll suffer from a stroke or heart attack that you wouldn’t have had otherwise.
Researchers are increasingly convinced that the tiny, inhalable particulate matter in wildfire smoke, known as PM2.5, contributes to thousands of excess deaths annually in the United States alone. But is it fair to link those deaths directly to climate change?
A new study published Monday in Nature Climate Change suggests that for a growing number of cases, the answer should be yes. Chae Yeon Park, a climate risk modeling researcher at Japan’s National Institute for Environmental Studies, looked with her colleagues at three fire-vegetation models to understand how hazardous emissions changed from 1960 to 2019, compared to a hypothetical control model that excluded historical climate change data. They found that while fewer than 669 deaths in the 1960s could be attributed to climate change globally, that number ballooned to 12,566 in the 2010s — roughly a 20-fold increase. The proportion of all global PM2.5 deaths attributable to climate change jumped 10-fold over the same period, from 1.2% in the 1960s to 12.8% in the 2010s.
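Both multipliers follow directly from the figures the study reports:

```python
# Checking the two multipliers against the study's cited figures
deaths_1960s = 669
deaths_2010s = 12_566
share_1960s = 1.2     # % of global PM2.5 deaths attributable to climate change
share_2010s = 12.8

death_ratio = deaths_2010s / deaths_1960s   # ~19x, "roughly a 20-fold increase"
share_ratio = share_2010s / share_1960s     # ~10.7x, the "10-fold" jump
print(f"Deaths: {death_ratio:.1f}x, share of deaths: {share_ratio:.1f}x")
```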
“It’s a timely and meaningful study that informs the public and the government about the dangers of wildfire smoke and how climate change is contributing to that,” Yiqun Ma, who researches the intersection of climate change, air pollution, and human health at the Yale School of Medicine, and who was not involved in the Nature study, told me.
The study found the highest climate change-attributable fire mortality values in South America, Australia, and Europe, where increases in heat and decreases in humidity were also the greatest. In southern South America, for example, the authors wrote that fire mortalities attributable to climate change increased from a model average of 35% to 71% between the 1960s and 2010s, “coinciding with decreased relative humidity,” which dries out fire fuels. For the same reason, an increase in relative humidity lowered fire mortality in other regions, such as South Asia. North America exhibited a less dramatic leap in climate-related smoke mortalities, with climate change’s contribution around 3.6% in the 1960s, “with a notable rise in the 2010s” to 18.8%, Park told me in an email.
While that’s alarming all on its own, Ma told me there was a possibility that Park’s findings might actually be too conservative. “They assume PM2.5 from wildfire sources and from other sources” — like from cars or power plants — “have the same toxicity,” she explained. “But in fact, in recent studies, people have found PM2.5 from fire sources can be more toxic than those from an urban background.” Another reason Ma suspected the study’s numbers might be an underestimate was because the researchers focused on only six diseases that have known links to PM2.5 exposure: chronic obstructive pulmonary disease, lung cancer, coronary heart disease, type 2 diabetes, stroke, and lower respiratory infection. “According to our previous findings [at the Yale School of Medicine], other diseases can also be influenced by wildfire smoke, such as mental disorders, depression, and anxiety, and they did not consider that part,” she told me.
Minghao Qiu, an assistant professor at Stony Brook University and one of the country’s leading researchers on wildfire smoke exposure and climate change, generally agreed with Park’s findings, but cautioned that there is “a lot of uncertainty in the underlying numbers,” in part because wildfire smoke exposure is intrinsically such a complicated thing to put firm numbers to. “It’s so difficult to model how climate influences wildfire because wildfire is such an idiosyncratic process and it’s so random,” he told me, adding, “In general, models are not great in terms of capturing wildfire.”
Despite these reservations, both Qiu and Ma emphasized the importance of studies like Park’s. “There are no really good solutions” to reduce wildfire PM2.5 exposure, Qiu pointed out; you can’t just “put a filter on a stack” as you (sort of) can with power plant emissions.
Even prescribed fires, often touted as an important wildfire mitigation technique, still produce smoke. Park’s team acknowledged that a whole suite of options would be needed to minimize future wildfire deaths, ranging from fire-resilient forest and urban planning to PM2.5 treatment advances in hospitals. And, of course, there is addressing the root cause of the increased mortality to begin with: our warming climate.
“To respond to these long-term changes,” Park told me, “it is crucial to gradually modify our system.”
On the COP16 biodiversity summit, Big Oil’s big plan, and sea level rise
Current conditions: Record rainfall triggered flooding in Roswell, New Mexico, that killed at least two people • Storm Ashley unleashed 80 mph winds across parts of the U.K. • A wildfire that broke out near Oakland, California, on Friday is now 85% contained.
Forecasters hadn’t expected Hurricane Oscar to develop into a hurricane at all, let alone in just 12 hours. But it did. The Category 1 storm made landfall in Cuba on Sunday, hours after passing over the Bahamas, bringing intense rain and strong winds. Up to a foot of rainfall was expected. Oscar struck while Cuba was struggling to recover from a large blackout that has left millions without power for four days. A second system, Tropical Storm Nadine, made landfall in Belize on Saturday with 60 mph winds and then quickly weakened. Both Oscar and Nadine developed in the Atlantic on the same day.
The COP16 biodiversity summit starts today in Cali, Colombia. Diplomats from 190 countries will try to come up with a plan to halt global biodiversity loss, aiming to protect 30% of land and sea areas and restore 30% of degraded ecosystems by 2030. Discussions will revolve around how to monitor nature degradation, hold countries accountable for their protection pledges, and pay for biodiversity efforts. There will also be a big push to get many more countries to publish national biodiversity strategies. “This COP is a test of how serious countries are about upholding their international commitments to stop the rapid loss of biodiversity,” said Crystal Davis, Global Director of Food, Land, and Water at the World Resources Institute. “The world has no shot at doing so without richer countries providing more financial support to developing countries — which contain most of the world’s biodiversity.”
A prominent group of oil and gas producers has developed a plan to roll back environmental rules put in place by President Biden, The Washington Post reported. The paper got its hands on confidential documents from the American Exploration and Production Council (AXPC), which represents some 30 producers. The documents include draft executive orders promoting fossil fuel production for a newly-elected President Trump to sign if he takes the White House in November, as well as a roadmap for dismantling many policies aimed at getting oil and gas producers to disclose and curb emissions. AXPC’s members, including ExxonMobil, ConocoPhillips, and Hess, account for about half of the oil and gas produced in the U.S., the Post reported.
A new report from the energy think tank Ember looks at how the uptake of electric vehicles and heat pumps in the U.K. is affecting oil and gas consumption. It found that last year the country had 1.5 million EVs on the road and 430,000 residential heat pumps, and that the reduction in fossil fuel use due to these technologies was equivalent to 14 million barrels of oil, or about what the U.K. imports over a two-week span. This effect will only strengthen as more and more EVs and heat pumps are powered by clean energy. The report also found that even though power demand is expected to rise, efficiency gains from electrification and decarbonization will more than make up for it, leading to an overall decline in energy use and fossil fuel consumption.
The world’s sea levels are projected to rise by more than 6 inches on average over the next 30 years if current trends continue, according to a new study published in the journal Nature. “Such rates would represent an evolving challenge for adaptation efforts,” the authors wrote. By examining satellite data, the researchers found that sea levels have risen by about 4 inches since 1993, and that they’re rising faster now than they were then. In 1993 the seas were rising by about 0.08 inches per year; last year they were rising at 0.17 inches per year. These are averages, of course, and some areas are seeing much more extreme changes. For example, areas around Miami, Florida, have already seen sea levels rise by 6 inches over the last 31 years.
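As a rough consistency check (assuming the rate climbed more or less steadily between those two endpoints), the cited rates imply close to 4 inches of cumulative rise over the satellite era:

```python
# If the rate of rise climbed steadily from ~0.08 in/yr (1993) to
# ~0.17 in/yr (2023), the average rate is roughly the midpoint.
rate_1993 = 0.08   # inches per year
rate_2023 = 0.17   # inches per year
years = 30

avg_rate = (rate_1993 + rate_2023) / 2   # ~0.125 in/yr
cumulative = avg_rate * years            # cumulative rise over the period
print(f"Cumulative rise since 1993: ~{cumulative:.1f} inches")
```

That comes out just under 4 inches, consistent with the satellite record the study describes.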
“As the climate crisis grows more urgent, restoring faith in government will be more important than ever.” –Paul Waldman writing for Heatmap about the profound implications of America becoming a low-trust society.