Ask any climate wonk what’s holding back clean energy in the U.S. and you’re likely to get the same answer — not enough power lines. But what if the problem isn’t the number of power lines, but rather the outdated metal wires they’re made of?
Restringing transmission lines with more advanced wires, a process known as “reconductoring,” has the potential to double the amount of electricity our existing transmission system can handle, for less than half the price of building new lines. That’s the main finding of a recently published working paper from researchers at the University of California, Berkeley, and Gridlab, an energy consulting firm.
There are a few reasons that something as boring and seemingly ubiquitous as power lines is so crucial to the energy transition. Electrifying our cars and homes will increase demand for electricity, and much of the system is already too congested to integrate new wind and solar power plants. Plus, there just aren’t enough lines that run from the sunniest, windiest places to the places where most people actually live.
To realize the emission reduction potential of the clean energy subsidies in the Inflation Reduction Act, we have to more than double the rate of transmission expansion, according to research from Princeton University’s Repeat Project. Clean energy projects already face major delays and are often hit with exorbitant bills to connect to the grid. A study from Lawrence Berkeley National Laboratory called “Queued Up” found that at the end of 2022, there were more than 10,000 power plant and energy storage projects waiting for permission to connect to the grid — enough to double electricity production in the country. Some 95% of them were zero-carbon resources.
The main problem is permitting. Establishing rights-of-way for new power lines requires extensive environmental review and invites vicious local opposition. People don’t want to look at more wires strung across the landscape. They worry the eyesore will decrease their property value, or that the construction will hurt local ecosystems. New power lines often take upwards of 10 years to plan, permit, and build.
But it’s possible to avoid this time-consuming process, at least in many cases, by simply reconductoring lines along existing rights-of-way. Most of our existing power lines have a steel core surrounded by strands of aluminum. Advanced conductors replace the steel with a lighter but stronger core made of a composite material, such as carbon fiber. This subtle shift in materials and design enables the line to operate at higher temperatures, with less sag, significantly increasing the amount of power it can carry.
Advanced conductors cost two to four times more than conventional conductors — but upgrading an existing line to use advanced conductors can cost less than half what a new power line would, because it eliminates much of the construction spending and the permitting fees for new rights-of-way, the Berkeley study found.
“The most compelling, exciting thing is that it only requires a maintenance permit,” Duncan Callaway, an associate professor of energy and resources at Berkeley and one of the authors said while presenting the research over Zoom last week.
The paper highlights a 2016 project in southeastern Texas. Due to rapid population growth in the area, the local utility, American Electric Power, was seeing higher demand for electricity at peak times than it was prepared for, leading to blackouts. It needed to come up with a solution, fast, and decided that reconductoring 240 miles of its transmission lines would take less time than permitting new ones. The project ended up finishing ahead of schedule and under budget, at a cost of $900,000 per mile. By comparison, the 3,600 miles of new lines built under Texas’ Competitive Renewable Energy Zone program, which were built to connect wind-rich areas to population centers, cost more than double, at an average of $1.9 million per mile.
Callaway and his co-authors also plugged their findings into a power system expansion model — basically a computer program that maps out the most cost-effective mix of technologies to meet regional electric power demand. They fed the model a scenario where the only option for transmission was to build new lines at their slow, historical rate, as well as a scenario where there was also an option to reconductor along existing rights-of-way. The second scenario resulted in nearly four times as much transmission capacity by 2035, enabling the country to achieve a more than 90% clean electric grid by that date.
There are cases where new power lines are needed — for example, to establish a new route to access a high-quality renewable resource, Emilia Chojkiewicz, another author of the study, told me in an email. But she said it nearly always makes sense to consider reconductoring given the potential to double capacity and do so much more quickly. “Unfortunately,” she added, “current transmission planning practices do not tend to incentivize or even consider reconductoring.”
This all seems so ridiculously easy that it raises the question: Why aren’t utilities already rushing to do it? During the webinar last week, Chojkiewicz and her co-authors said part of the problem is just a lack of awareness and comfort with the technology. But the bigger issue is that utilities are not incentivized to look for cheaper, more efficient solutions like reconductoring because they profit off capital spending.
To change this, they suggested that the Federal Energy Regulatory Commission, which oversees interstate transmission, and state public service commissions, which regulate utilities at the state level, mandate the consideration of reconductoring in transmission and resource planning processes, and to properly value the benefits that advanced conductors provide. The Department of Energy could also consider instituting a national conductor efficiency standard, so that all new wires installed, whether along existing rights-of-way or new routes, achieve a minimum level of performance.
Reconductoring isn’t the only no-brainer alternative to building new power lines. Another study from the clean energy think tank RMI published last week illustrates the opportunity with even cheaper tweaks called “grid enhancing technologies.” One option is to install sensors that collect data on wind speed, temperature, and other factors that affect power lines in real time, called dynamic line ratings. These sensors allow utilities to safely increase the amount of power transmitted when weather conditions permit it. There are also power flow controls that can redirect power away from congested lines so that it can be transmitted elsewhere rather than wasted.
RMI found that in the PJM interconnection — a section of the grid in the eastern U.S. that is so congested the grid operator has frozen new applications to connect to it — these grid enhancing technologies could open up more than 6 gigawatts of new capacity to wind, solar, and storage projects in just three years. For reference, nearly 300 gigawatts’ worth of energy projects were waiting for permission to connect in PJM at the end of 2022.
The cost savings are not just theoretical. In 2018, the PJM grid operator determined that a wind farm expansion in Illinois was going to require $100 million of grid upgrades — including building new lines and reconductoring existing ones — over a timeline of about three years before it would be able to connect. The developer countered that the needed upgrades could be achieved through power flow controls, which could be installed for a cost of just $12 million in less than half the time. PJM approved the idea, and the project is currently underway.
Congress is still debating how to reform permitting processes. But while that’s still a necessary step, it’s becoming increasingly clear that there’s a host of other outside-the-box solutions that can be deployed more quickly, in the near term. The IRA may have convinced the environmental movement that building new stuff was worth it, but there are still a lot of cases where the smarter choice is to renovate.
Editor’s note: This story has been updated to correct the cost of adding power flow controls to the PJM interconnection.
Instead of rocket fuel, they’re burning biomass.
Arbor Energy might have the flashiest origin story in cleantech.
After the company’s CEO, Brad Hartwig, left SpaceX in 2018, he attempted to craft the ideal resume for a future astronaut, his dream career. He joined the California Air National Guard, worked as a test pilot at the now-defunct electric aviation startup Kitty Hawk, and participated in volunteer search and rescue missions in the Bay Area, which gave him a front row seat to the devastating effects of wildfires in Northern California.
That experience changed everything. “I decided I actually really like planet Earth,” Hartwig told me, “and I wanted to focus my career instead on preserving it, rather than trying to leave it.” So he rallied a bunch of his former rocket engineer colleagues to repurpose technology they pioneered at SpaceX to build a biomass-fueled, carbon negative power source that’s supposedly about ten times smaller, twice as efficient, and eventually, one-third the cost of the industry standard for this type of plant.
Take that, all you founders humble-bragging about starting in a dingy garage.
“It’s not new science, per se,” Hartwig told me. The goal of this type of tech, called bioenergy with carbon capture and storage, is to combine biomass-based energy generation with carbon dioxide removal to achieve net negative emissions. Sounds like a dream, but actually producing power or heat from this process has so far proven too expensive to really make sense. There are only a few so-called BECCS facilities operating in the U.S. today, and they’re all just ethanol fuel refineries with carbon capture and storage technology tacked on.
But the advances in 3D printing and computer modeling that allowed the SpaceX team to build an increasingly simple and cheap rocket engine have allowed Arbor to move quickly into this new market, Hartwig explained. “A lot of the technology that we had really pioneered over the last decade — in reactor design, combustion devices, turbo machinery, all for rocket propulsion — all that technology has really quite immediate application in this space of biomass conversion and power generation.”
Arbor’s method is poised to be a whole lot sleeker and cheaper than the BECCS plants of today, enabling both more carbon sequestration and actual electricity production, all by utilizing what Hartwig fondly refers to as a “vegetarian rocket engine.” Because there’s no air in space, astronauts have to bring pure oxygen onboard, which the rocket engines use to burn fuel and propel themselves into the stratosphere and beyond. Arbor simply subs out the rocket fuel for biomass. When that biomass is combusted with pure oxygen, the resulting exhaust consists of just CO2 and water. As the exhaust cools, the water condenses out, and what’s left is a stream of pure carbon dioxide that’s ready to be injected deep underground for permanent storage. All of the energy required to operate Arbor’s system is generated by the biomass combustion itself.
“Arbor is the first to bring forward a technology that can provide clean baseload energy in a very compact form,” Clea Kolster, a partner and Head of Science at Lowercarbon Capital told me. Lowercarbon is an investor in Arbor, alongside other climate tech-focused venture capital firms including Gigascale Capital and Voyager Ventures, but the company has not yet disclosed how much it’s raised.
Last month, Arbor signed a deal with Microsoft to deliver 25,000 tons of permanent carbon dioxide removal to the tech giant starting in 2027, when the startup’s first commercial project is expected to come online. As a part of the deal, Arbor will also generate 5 megawatts of clean electricity, enough to power about 4,000 U.S. homes. And just a few days ago, the Department of Energy announced that Arbor is one of 11 projects to receive a combined total of $58.5 million to help develop the domestic carbon removal industry.
Arbor’s current plan is to source biomass from forestry waste, much of which is generated by forest thinning operations intended to prevent destructive wildfires. Hartwig told me that for every ton of organic waste, Arbor can produce about one megawatt hour of electricity, which is in line with current efficiency standards, plus about 1.8 tons of carbon removal. “We look at being as efficient, if not a little more efficient than a traditional bioenergy power plant that does not have carbon capture on it,” he explained.
The company’s carbon removal price targets are also extremely competitive — in the $50 to $100 per ton range, Hartwig said. Compare that to something like direct air capture, which today exceeds $600 per ton, or enhanced rock weathering, which is usually upwards of $300 per ton. “The power and carbon removal they can offer comes at prices that meet nearly unlimited demand,” Mike Schroepfer, the founder of Gigascale Capital and former CTO of Meta, told me via email. Arbor benefits from the fact that the electricity it produces and sells can help offset the cost of the carbon removal, and vice versa. So if the company succeeds in hitting its cost and efficiency targets, Hartwig said, this “quickly becomes a case for, why wouldn’t you just deploy these everywhere?”
Initial customers will likely be (no surprise here) the Microsofts, Googles and Metas of the world — hyperscalers with growing data center needs and ambitious emissions targets. “What Arbor unlocks is basically the ability for hyperscalers to stop needing to sacrifice their net zero goals for AI,” Kolster told me. And instead of languishing in the interminable grid interconnection queue, Hartwig said that providing power directly to customers could ensure rapid, early deployment. “We see it as being quicker to power behind-the-meter applications, because you don’t have to go through the process of connecting to the grid,” he told me. Long-term though, he said grid connection will be vital, since Arbor can provide baseload power whereas intermittent renewables cannot.
All of this could serve as a much cheaper alternative, to say, re-opening shuttered nuclear facilities, as Microsoft also recently committed to doing at Three Mile Island. “It’s great, we should be doing that,” Kolster said of this nuclear deal, “but there’s actually a limited pool of options to do that, and unfortunately, there is still community pushback.”
Currently, Arbor is working to build out its pilot plant in San Bernardino, California, which Hartwig told me will turn on this December. And by 2030, the company plans to have its first commercial plant operating at scale, generating 100 megawatts of electricity while removing nearly 2 megatons of CO2 every year. “To put it in perspective: In 2023, the U.S. added roughly 9 gigawatts of gas power to the grid, which generates 18 to 23 megatons of CO2 a year,” Schroepfer wrote to me. So having just one Arbor facility removing 2 megatons would make a real dent. The first plant will be located in Louisiana, where Arbor will also be working with an as-yet-unnamed partner to do the carbon storage.
The company’s carbon credits will be verified with the credit certification platform Isometric, which is also backed by Lowercarbon and thought to have the most stringent standards in the industry. Hartwig told me that Arbor worked hand-in-hand with Isometric to develop the protocol for “biogenic carbon capture and storage,” as the company is the first Isometric-approved supplier to use this standard.
But Hartwig also said that government support hasn’t yet caught up to the tech’s potential. While the Inflation Reduction Act provides direct air capture companies with $180 per ton of carbon dioxide removed, technology such as Arbor’s only qualifies for $85 per ton. It’s not nothing — more than the zero dollars enhanced rock weathering companies such as Lithos or bio-oil sequestration companies such as Charm are getting. “But at the same time, we’re treated the same as if we’re sequestering CO2 emissions from a natural gas plant or a coal plant,” Hartwig told me, as opposed to getting paid for actual CO2 removal.
“I think we are definitely going to need government procurement or involvement to actually hit one, five, 10 gigatons per year of carbon removal,” Hartwig said. Globally, scientists estimate that we’ll need up to 10 gigatons of annual CO2 removal by 2050 in order to limit global warming to 1.5 degrees Celsius. “Even at $100 per ton, 10 gigatons of carbon removal is still a pretty hefty price tag,” Hartwig told me. A $1 trillion price tag, to be exact. “We definitely need more players than just Microsoft.”
New research out today shows a 10-fold increase in smoke mortality related to climate change from the 1960s to the 2010s.
If you are one of the more than 2 billion people on Earth who have inhaled wildfire smoke, then you know firsthand that it is nasty stuff. It makes your eyes sting and your throat sore and raw; breathe in smoke for long enough, and you might get a headache or start to wheeze. Maybe you’ll have an asthma attack and end up in the emergency room. Or maybe, in the days or weeks afterward, you’ll suffer from a stroke or heart attack that you wouldn’t have had otherwise.
Researchers are increasingly convinced that the tiny, inhalable particulate matter in wildfire smoke, known as PM2.5, contributes to thousands of excess deaths annually in the United States alone. But is it fair to link those deaths directly to climate change?
A new study published Monday in Nature Climate Change suggests that for a growing number of cases, the answer should be yes. Chae Yeon Park, a climate risk modeling researcher at Japan’s National Institute for Environmental Studies, looked with her colleagues at three fire-vegetation models to understand how hazardous emissions changed from 1960 to 2019, compared to a hypothetical control model that excluded historical climate change data. They found that while fewer than 669 deaths in the 1960s could be attributed to climate change globally, that number ballooned to 12,566 in the 2010s — roughly a 20-fold increase. The proportion of all global PM2.5 deaths attributable to climate change jumped 10-fold over the same period, from 1.2% in the 1960s to 12.8% in the 2010s.
“It’s a timely and meaningful study that informs the public and the government about the dangers of wildfire smoke and how climate change is contributing to that,” Yiqun Ma, who researches the intersection of climate change, air pollution, and human health at the Yale School of Medicine, and who was not involved in the Nature study, told me.
The study found the highest climate change-attributable fire mortality values in South America, Australia, and Europe, where increases in heat and decreases in humidity were also the greatest. In the southern hemisphere of South America, for example, the authors wrote that fire mortalities attributable to climate change increased from a model average of 35% to 71% between the 1960s and 2010s, “coinciding with decreased relative humidity,” which dries out fire fuels. For the same reason, an increase in relative humidity lowered fire mortality in other regions, such as South Asia. North America exhibited a less dramatic leap in climate-related smoke mortalities, with climate change’s contribution around 3.6% in the 1960s, “with a notable rise in the 2010s” to 18.8%, Park told me in an email.
While that’s alarming all on its own, Ma told me there was a possibility that Park’s findings might actually be too conservative. “They assume PM2.5 from wildfire sources and from other sources” — like from cars or power plants — “have the same toxicity,” she explained. “But in fact, in recent studies, people have found PM2.5 from fire sources can be more toxic than those from an urban background.” Another reason Ma suspected the study’s numbers might be an underestimate was because the researchers focused on only six diseases that have known links to PM2.5 exposure: chronic obstructive pulmonary disease, lung cancer, coronary heart disease, type 2 diabetes, stroke, and lower respiratory infection. “According to our previous findings [at the Yale School of Medicine], other diseases can also be influenced by wildfire smoke, such as mental disorders, depression, and anxiety, and they did not consider that part,” she told me.
Minghao Qiu, an assistant professor at Stony Brook University and one of the country’s leading researchers on wildfire smoke exposure and climate change, generally agreed with Park’s findings, but cautioned that there is “a lot of uncertainty in the underlying numbers,” in part because wildfire smoke exposure is intrinsically such a complicated thing to try to put firm numbers to. “It’s so difficult to model how climate influences wildfire because wildfire is such an idiosyncratic process and it’s so random,” he told me, adding, “In general, models are not great in terms of capturing wildfire.”
Despite their few reservations, both Qiu and Ma emphasized the importance of studies like Park’s. “There are no really good solutions” to reduce wildfire PM2.5 exposure. You can’t just “put a filter on a stack” as you (sort of) can with power plant emissions, Qiu pointed out.
Even prescribed fires, often touted as an important wildfire mitigation technique, still produce smoke. Park’s team acknowledged that a whole suite of options would be needed to minimize future wildfire deaths, ranging from fire-resilient forest and urban planning to PM2.5 treatment advances in hospitals. And, of course, there is addressing the root cause of the increased mortality to begin with: our warming climate.
“To respond to these long-term changes,” Park told me, “it is crucial to gradually modify our system.”
On the COP16 biodiversity summit, Big Oil’s big plan, and sea level rise
Current conditions: Record rainfall triggered flooding in Roswell, New Mexico, that killed at least two people • Storm Ashley unleashed 80 mph winds across parts of the U.K. • A wildfire that broke out near Oakland, California, on Friday is now 85% contained.
Forecasters hadn’t expected Hurricane Oscar to develop into a hurricane at all, let alone in just 12 hours. But it did. The Category 1 storm made landfall in Cuba on Sunday, hours after passing over the Bahamas, bringing intense rain and strong winds. Up to a foot of rainfall was expected. Oscar struck while Cuba was struggling to recover from a large blackout that has left millions without power for four days. A second system, Tropical Storm Nadine, made landfall in Belize on Saturday with 60 mph winds and then quickly weakened. Both Oscar and Nadine developed in the Atlantic on the same day.
Hurricane Oscar (AccuWeather)
The COP16 biodiversity summit starts today in Cali, Colombia. Diplomats from 190 countries will try to come up with a plan to halt global biodiversity loss, aiming to protect 30% of land and sea areas and restore 30% of degraded ecosystems by 2030. Discussions will revolve around how to monitor nature degradation, hold countries accountable for their protection pledges, and pay for biodiversity efforts. There will also be a big push to get many more countries to publish national biodiversity strategies. “This COP is a test of how serious countries are about upholding their international commitments to stop the rapid loss of biodiversity,” said Crystal Davis, Global Director of Food, Land, and Water at the World Resources Institute. “The world has no shot at doing so without richer countries providing more financial support to developing countries — which contain most of the world’s biodiversity.”
A prominent group of oil and gas producers has developed a plan to roll back environmental rules put in place by President Biden, The Washington Post reported. The paper got its hands on confidential documents from the American Exploration and Production Council (AXPC), which represents some 30 producers. The documents include draft executive orders promoting fossil fuel production for a newly-elected President Trump to sign if he takes the White House in November, as well as a roadmap for dismantling many policies aimed at getting oil and gas producers to disclose and curb emissions. AXPC’s members, including ExxonMobil, ConocoPhillips, and Hess, account for about half of the oil and gas produced in the U.S., the Post reported.
A new report from the energy think tank Ember looks at how the uptake of electric vehicles and heat pumps in the U.K. is affecting oil and gas consumption. It found that last year the country had 1.5 million EVs on the road and 430,000 residential heat pumps, and that the reduction in fossil fuel use due to the growth of these technologies was equivalent to 14 million barrels of oil, or about what the U.K. imports over a two-week span. This reduction effect will be even stronger as more and more EVs and heat pumps are powered by clean energy. The report also found that even though power demand is expected to rise, efficiency gains from electrification and decarbonization will make up for this, leading to an overall decline in energy use and fossil fuel consumption.
The world’s sea levels are projected to rise by more than 6 inches on average over the next 30 years if current trends continue, according to a new study published in the journal Nature. “Such rates would represent an evolving challenge for adaptation efforts,” the authors wrote. By examining satellite data, the researchers found that sea levels have risen by about 4 inches since 1993, and that they’re rising faster now than they were then. In 1993 the seas were rising by about 0.08 inches per year, and last year they were rising at 0.17 inches per year. These are averages, of course, and some areas are seeing much more extreme changes. For example, areas around Miami, Florida, have already seen sea levels rise by 6 inches over the last 31 years.
“As the climate crisis grows more urgent, restoring faith in government will be more important than ever.” –Paul Waldman writing for Heatmap about the profound implications of America becoming a low-trust society.