Is international cooperation or technological development the answer to an apocalyptic threat?
Christopher Nolan’s film Oppenheimer is about the great military contest of the Second World War, but only in the background. It’s really about a clash of visions for a postwar world defined by the physicist J. Robert Oppenheimer’s work at Los Alamos and beyond. The great power unleashed by the bombs at Hiroshima and Nagasaki could be dwarfed by what knowledge of nuclear physics could produce in the coming years, risking a war more horrifying than the one that had just concluded.
Oppenheimer, and many of his fellow atomic scientists, would spend much of the postwar period arguing for international cooperation, scientific openness, and nuclear restriction. But there was another cadre of scientists, exemplified by a former colleague turned rival, Edward Teller, that sought to answer the threat of nuclear annihilation with new technology — including even bigger bombs.
As the urgency of the nuclear question faded with the end of the Cold War, the scientific community took up a new threat to global civilization: climate change. And while the conflict dramatized in Oppenheimer was over nuclear weapons, the clash of visions it depicts, the one that ended up burying Oppenheimer and elevating Teller, also maps onto the great debate over global warming: Should we reach international agreements to cooperatively reduce carbon emissions, or should we throw our — and specifically America’s — great resources into a headlong rush of technological development? Should we massively overhaul our energy system or make the sun a little less bright?
Oppenheimer’s dream of international cooperation to prevent a nuclear arms race was born even before the Manhattan Project culminated in the Trinity test. Oppenheimer and the Danish physicist Niels Bohr “believed that an agreement between the wartime allies based upon the sharing of information, including the existence of the Manhattan Project, could prevent the surfacing of a nuclear-armed world,” writes Marco Borghi in a Wilson Center working paper.
Oppenheimer even suggested that the Soviets be informed of the Manhattan Project’s efforts and, according to Martin Sherwin and Kai Bird’s American Prometheus, “assumed that such forthright discussions were taking place at that very moment” at the conference in Potsdam, where he “was later appalled to learn” that Harry Truman had only vaguely mentioned the bomb to Joseph Stalin, scotching the first opportunity for international nuclear cooperation.
Oppenheimer continued to champion the cause of international cooperation, working as the lead advisor to Dean Acheson and David Lilienthal on their 1946 nuclear control proposal, which was never accepted by the United Nations, and in particular by the Soviet Union, after it was amended by Truman’s appointed U.N. representative, Bernard Baruch, to be more favorable to the United States.
In view of the next 50 years of nuclear history — further proliferation, the development of thermonuclear weapons that could be mounted on missiles that were likely impossible to shoot down — the proposals Oppenheimer developed seem utopian: The U.N. would "bring under its complete control world supplies of uranium and thorium," including all mining, and would control all nuclear reactors. The scheme would also bar the construction of new weapons, lest other nations be driven to build their own.
By the end of 1946, the Baruch proposal had died, and with it any prospect of international control of nuclear power, even as the Soviets worked intensely to break America’s nuclear monopoly — with the help of information ferried out of Los Alamos — successfully testing a weapon before the end of the decade.
With the failure of international arms control and the beginning of the arms race, Oppenheimer’s vision of a post-Trinity world lay in shambles. For Teller, however, it was a great opportunity.
While Oppenheimer planned to stave off nuclear annihilation through international cooperation, Teller was trying to build a bigger deterrent.
Since the early stages of the Manhattan Project, Teller had been dreaming of a fusion weapon many times more powerful than the first atomic bombs, what was then called the “Super.” When the atomic bomb was completed, he would again push for the creation of a thermonuclear bomb, but the efforts stalled thanks to technical and theoretical issues with Teller’s proposed design.
Nolan captures Teller’s early comprehension of just how powerful nuclear weapons could be. In a scene pulled straight from accounts of the Trinity blast, most of the scientists who view the test are either in bunkers wearing welding goggles or following instructions to lie down, facing away from the blast. Not so for Teller. He slathers sunscreen on his face, straps on a pair of dark goggles, and views the explosion straight on, even pursing his lips as the blast lights up the desert night brighter than the sun.
And it was that power — the sun’s — that Teller wanted to harness in pursuit of his “Super,” where a bomb’s power would be derived from fusing together hydrogen atoms, creating helium — and a great deal of energy. It would even use a fission bomb to help ignite the process.
Oppenheimer and several scientific luminaries, including Manhattan Project scientists Enrico Fermi and Isidor Rabi, opposed the bomb; in their official 1949 report advising the Atomic Energy Commission, they argued that the hydrogen bomb was infeasible, strategically useless, and potentially a weapon of “genocide.”
But by 1950, thanks in part to Teller and the advocacy of Lewis Strauss, a financier turned government official and the approximate villain of Nolan’s film, Harry Truman signed off on a hydrogen bomb project. The result was the 1952 “Ivy Mike” test, in which a bomb built on a design from Teller and the mathematician Stan Ulam vaporized the Pacific island of Elugelab with a blast about 700 times more powerful than the one that destroyed Hiroshima.
The success of the project reignited doubts about Oppenheimer’s well-known left-wing political associations in the years before the war, and, thanks to scheming by Strauss, he was denied a renewed security clearance.
While several Manhattan Project scientists testified on his behalf, Teller did not, saying, “I thoroughly disagreed with him in numerous issues and his actions frankly appeared to me confused and complicated.”
It was the end of Oppenheimer’s public career. The New Deal Democrat had been eclipsed by Teller, who would become the scientific avatar of the Reagan Republicans.
For the next several decades, Teller stayed close to politicians, the military, and the media, exercising a great deal of influence over arms policy from the Lawrence Livermore National Laboratory, which he helped found, and from his academic perch at the University of California.
He pooh-poohed the dangers of radiation, supported the building of more and bigger bombs that could be delivered by longer and longer range missiles, and opposed prohibitions on testing. When Dwight Eisenhower was considering a negotiated nuclear test ban, Teller faced off against future Nobel laureate and Manhattan Project alumnus Hans Bethe over whether nuclear tests could be hidden from detection by conducting them underground in a massive hole; the eventual 1963 test ban treaty would exempt underground testing.
As the Cold War settled into a nuclear standoff with both the United States and the Soviet Union possessing enough missiles and nuclear weapons to wipe out the other, Teller didn’t look to treaties, limitations, and cooperation to solve the problem of nuclear brinksmanship, but instead to space: He wanted to neutralize the threat of a Soviet first strike using x-ray lasers from space powered by nuclear explosions (he was again opposed by Bethe and the x-ray lasers never came to fruition).
He also notoriously dreamed up Project Plowshare, the civilian nuclear program that came close to blasting out a new harbor in northern Alaska and actually did attempt to extract gas in New Mexico and Colorado using nuclear explosions.
Yet, in perhaps the strangest turn of all, Teller also became something of a key figure in the history of climate change research, both in his relatively early awareness of the problem and in the conceptual gigantism he brought to proposals for solving it.
While publicly skeptical of climate change later in his life, Teller was thinking about the problem decades before James Hansen’s seminal 1988 congressional testimony.
The researcher and climate litigator Benjamin Franta made the startling archival discovery that Teller had given a speech at an oil industry event in 1959 in which he warned that “energy resources will run short as we use more and more of the fossil fuels,” and, after explaining the greenhouse effect, said that “it has been calculated that a temperature rise corresponding to a 10 percent increase in carbon dioxide will be sufficient to melt the icecap and submerge New York … I think that this chemical contamination is more serious than most people tend to believe.”
Teller was also engaged with energy and other “peaceful” uses of nuclear power. In response to concerns about the dangers of nuclear reactors, in the 1960s he began advocating that they be built underground, and by the early 1990s he was proposing to run those underground reactors automatically in order to avoid the human error he blamed for the disasters at Chernobyl and Three Mile Island.
While Teller was always happy to round up collaborators and toss off an ingenious-if-extreme solution to a problem, there is a strain of “Tellerism,” both institutional and conceptual, that persists to this day in climate science and energy policy.
Nuclear science and climate science have long been intertwined, Stanford historian Paul Edwards writes: The “earliest global climate models relied on numerical methods very similar to those developed by nuclear weapons designers for solving the fluid dynamics equations needed to analyze shock waves produced in nuclear explosions.”
Where Teller comes in is the role that Lawrence Livermore played in both energy research and climate modeling. “With the Cold War over and research on nuclear weapons in decline, the national laboratories faced a quandary: What would justify their continued existence?” Edwards writes. The answer, in many cases, would be climate change, thanks to the labs’ ample computing power, “expertise in numerical modeling of fluid dynamics, and their skills in managing very large data sets.”
One of those labs was Livermore, which Teller helped found and which has been a leading center of climate and energy modeling and research since the late 1980s. “[Teller] was very enthusiastic about weather control,” early climate modeler Cecil “Chuck” Leith told Edwards in an oral history.
The Department of Energy writ large, which inherited much of the responsibilities of the Atomic Energy Commission, is now one of the lead agencies on climate change policy and energy research.
Which brings us to fusion.
It was Teller’s Lawrence Livermore National Laboratory that earlier this year successfully got more power out of a controlled fusion reaction than it put in — and it was Energy Secretary Jennifer Granholm who announced it, calling it the “holy grail” of clean energy development.
Teller’s journey with fusion mirrors the field’s broader history: early cautious optimism followed by the realization that it would likely not be achieved soon. As early as 1958, he said in a speech that he had been discussing “controlled fusion” at Los Alamos and that “thermonuclear energy generation is possible,” although he admitted that “the problem is not quite easy”; by 1987 he had given up on seeing it realized during his lifetime.
Still, what controlled fusion we do have at Livermore’s National Ignition Facility owes something to Teller and the technology he pioneered in the hydrogen bomb, according to physicist NJ Fisch.
While fusion is one infamous technological fix for the problem of clean and cheap energy production, Teller and the Livermore cadres were also a major influence on the development of solar geoengineering, the idea that global warming could be averted not by reducing the emissions of greenhouse gas into the atmosphere, but by making the sun less intense.
In a mildly trolling column for the Wall Street Journal in January 1998, Teller professed agnosticism on climate change (despite giving that speech to oil executives three decades prior) but proposed an alternative policy that would be “far less burdensome than even a system of market-allocated emissions permits”: solar geoengineering with “fine particles.”
The op-ed, placed in the conservative pages of the Wall Street Journal, was almost certainly an effort to oppose the recently signed Kyoto Protocol, but the ideas have persisted among thinkers and scientists whose engagement with environmental issues goes far beyond their opinions of Al Gore or, by extension, the environmental movement as a whole (Teller’s feelings about both were negative).
But his proposal would be familiar in the climate debates of today: particle emissions that would scatter sunlight and thus lower atmospheric temperatures. If climate change had to be addressed, Teller argued, “let us play to our uniquely American strengths in innovation and technology to offset any global warming by the least costly means possible.”
A paper he wrote with two colleagues, an early call for spraying sulfates into the stratosphere, also proposed “deploying electrically-conducting sheeting, either in the stratosphere or in low Earth orbit.” These were “literally diaphanous scattering screens” that could scatter enough sunlight to reduce global warming; one calculation Teller made concluded that 46 million square miles of these screens, or about 1 percent of the surface area of the Earth, would be necessary.
The climate scientist and Livermore alumnus Ken Caldeira has attributed his own initial interest in solar geoengineering to Lowell Wood, a Livermore researcher and Teller protégé. While often seen as a centrist or even right-wing idea, a way to head off more restrictive policies on carbon emissions, solar geoengineering has sparked some interest on the left, including in socialist science fiction author Kim Stanley Robinson’s The Ministry for the Future, which envisions India unilaterally pumping sulfates into the atmosphere in response to a devastating heat wave.
The White House even quietly released a congressionally mandated report on solar geoengineering earlier this spring, outlining avenues for further research.
While the more than 30 years since the creation of the Intergovernmental Panel on Climate Change and the beginnings of the Kyoto Protocol have emphasized international cooperation on both science and policymaking, through agreed-upon goals for emissions reductions, the technological temptation is always present.
And here we can perhaps see the old split between the moralizing scientists, with their pleas to address the problems of the arms race through scientific openness and international cooperation, and the hawkish technicians, who wanted to press the United States’ technical advantage in order to win the nuclear standoff, and ultimately the Cold War, through deterrence.
With the IPCC and the United Nations Climate Conference, through which emerged the Kyoto Protocol and the Paris Agreement, we see a version of what the postwar scientists wanted applied to the problem of climate change. Nations come together and agree on targets for controlling something that may benefit any one of them but risks global calamity. The process is informed by scientists working with substantial resources across national borders who play a major role in formulating and verifying the policy mechanisms used to achieve these goals.
But for almost as long as climate change has been an issue of international concern, the Tellerian path has been tempting. While Teller’s dreams of massive sun-scattering sheets, nuclear earth engineering, and automated underground reactors are unlikely to be realized soon, if at all, you can be sure there are scientists and engineers looking straight into the light. And they may one day drag us into it, whether we want to or not.
Editor’s note: An earlier version of this article misstated the name of a climate modeler. It’s been corrected. We regret the error.
Any household savings will barely make a dent in the added costs from Trump’s many tariffs.
Donald Trump’s tariffs — the “fentanyl” levies on Canada, China, and Mexico, the “reciprocal” tariffs on nearly every country (and some uninhabited islands), and the global 10% tariff — will almost certainly cause consumer goods on average to get more expensive. The Yale Budget Lab estimates that in combination, the tariffs Trump has announced so far in his second term will cause prices to rise 2.3%, reducing purchasing power by $3,800 per year per household.
But there’s one very important consumer good that seems due to decline in price.
Trump administration officials — including the president himself — have touted cheaper oil to suggest that the economic response to the tariffs hasn’t been all bad. On Sunday, Secretary of the Treasury Scott Bessent told NBC, “Oil prices went down almost 15% in two days, which impacts working Americans much more than the stock market does.”
Trump picked up this line on Truth Social Monday morning. “Oil prices are down, interest rates are down (the slow moving Fed should cut rates!), food prices are down, there is NO INFLATION,” he wrote. He then spent the day posting quotes from Fox Business commentators echoing that idea, first Maria Bartiromo (“Rates are plummeting, oil prices are plummeting, deregulation is happening. President Trump is not going to bend”) then Charles Payne (“What we’re not talking about is, oil was $76, now it’s $65. Gasoline prices are going to plummet”).
But according to Neil Dutta, head of economic research at Renaissance Macro Research, pointing to falling oil prices as a stimulus is just another example of the “4D chess” theory, under which some market participants attribute motives to Trump’s trade policy beyond his stated goal of reducing trade deficits to as near zero (or surplus!) as possible.
Instead, oil markets are primarily “responding to the recession risk that comes from the tariff and the trade war,” Dutta told me. “That is the main story.” In short, oil markets see less global trade and less global production, and therefore falling demand for oil. The effect on household consumption, he said, was a “second order effect.”
It is true that falling oil prices will help “stabilize consumption,” Dutta told me (although they could also devastate America’s own oil industry). “It helps. It’ll provide some lift to real income growth for consumers, because they’re not spending as much on gasoline.” But “to fully offset the trade war effects, you basically need to get oil down to zero.”
That’s confirmed by some simple and extremely back of the envelope math. In 2023, households on average consumed about 700 gallons of gasoline per year, based on Energy Information Administration calculations that the average gasoline price in 2023 was $3.52, while the Bureau of Labor Statistics put average household gasoline expenditures at about $2,450.
Let’s generously assume that due to the tariffs and Trump’s regulatory and diplomatic efforts, gas prices drop from the $3.26 they were at on Monday, according to AAA, to $2.60, the average price in 2019. (GasBuddy petroleum analyst Patrick De Haan wrote Monday that the tariffs combined with OPEC+ production hikes could lead gas prices “to fall below $3 per gallon.”)
Let’s also assume that this drop in gas prices does not cause people to drive more or buy less fuel-efficient vehicles. In that case, those same 700 gallons cost the average American $1,820, which would generate annual savings of $630 on average per household. If we went to the lowest price since the Russian invasion of Ukraine, about $3 per gallon, total consumption of 700 gallons would cost a household about $2,100, saving $350 per household per year.
That being said, $1,820 is a pretty low level of annual gasoline spending. In 2021, as the economy was recovering from the Covid recession and before gas prices popped, annual gasoline expenditures only got as low as $1,948; in 2020 — when oil prices briefly dropped to literally negative dollars per barrel and gas prices got down to $1.85 a gallon — annual expenditures were just over $1,500.
In any case, if you remember the opening paragraphs of this story, even the most generous estimated savings would come nowhere near offsetting the overall rise in prices forecast by the Yale Budget Lab. $630 is less than $3,800! (JPMorgan has forecast a milder increase in prices of 1% to 1.5%, but agrees that prices will likely rise and purchasing power will decline.)
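If you want to check the arithmetic yourself, here is a minimal sketch of the back-of-the-envelope math above in Python, using the figures cited in this story and the same simplifying assumption that households keep buying roughly 700 gallons a year no matter the price:

```python
# Back-of-the-envelope gasoline savings, using the figures cited in this story.
# Simplifying assumption (same as above): household consumption stays fixed
# at roughly 700 gallons per year regardless of price.

GALLONS_PER_YEAR = 700    # ~ $2,450 (BLS household spend) / $3.52 (EIA 2023 price)
BASELINE_SPEND = 2_450    # average household gasoline expenditures, 2023 ($)

def annual_savings(price_per_gallon: float) -> float:
    """Rough yearly savings vs. the 2023 baseline if gas falls to this price."""
    return BASELINE_SPEND - GALLONS_PER_YEAR * price_per_gallon

print(annual_savings(2.60))  # 630.0 -> the ~$630 figure above
print(annual_savings(3.00))  # 350.0 -> the ~$350 figure above
```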
But maybe look at it this way: You might be able to drive a little more than you expected to, even as your costs elsewhere are going up. Just please be careful! You don’t want to get into a bad accident and have to replace your car: New car prices are expected to rise by several thousand dollars due to Trump’s tariffs.
With cars about to get more expensive, it might be time to start tinkering.
More than a decade ago, when I was a young editor at Popular Mechanics, we got a Nissan Leaf. It was a big deal. The magazine had always kept long-term test cars to give readers a full report of how they drove over weeks and months. A long-term test of the first true production electric vehicle from a major car company felt like a watershed moment: The future was finally beginning. They even installed a destination charger in the basement of the Hearst Corporation’s Manhattan skyscraper.
That Leaf was a bit of a lump, aesthetically and mechanically. It looked like a potato, got about 100 miles of range, and delivered only 110 horsepower or so via its electric motors. This made the O.G. Leaf an easy target for Top Gear-style car enthusiasts eager to slander EVs as low-testosterone automobiles of the meek, forced upon an unwilling population of drivers. Once the rise of Tesla in the 2010s had smashed that paradigm and led lots of people to see electric vehicles as sexy and powerful, the original Leaf faded from the public imagination, a relic of the earliest days of the new EV revolution.
Yet lots of those cars are still around. I see a few prowling my workplace parking garage or roaming the streets of Los Angeles. With the faded performance of their old batteries, these long-running EVs aren’t good for much but short-distance city driving. Ignore the outdated battery pack for a second, though, and what surrounds that unit is a perfectly serviceable EV.
That’s exactly what a new breed of EV restorers sees. Last week, the car site The Autopian covered DIYers who are scooping up cheap old Leafs, some costing as little as $3,000, and swapping in affordable Chinese-made 62 kilowatt-hour battery units in place of the original 24 kilowatt-hour units to instantly boost the car’s range to about 250 miles. One restorer bought a new battery on the Chinese site Alibaba for $6,000 ($4,500, plus $1,500 to ship that beast across the sea).
The possibility of the (relatively) simple battery swap is a longtime EV owner’s daydream. In the earlier days of the electrification race, many manufacturers and drivers saw simple and quick battery exchange as the solution for EV road-tripping. Instead of waiting half an hour for a battery to recharge, you’d swap your depleted unit for a fully charged one and be on your way. Even Tesla tested this approach last decade before settling for good on the Supercharger network of fast-charging stations.
There are still companies experimenting with battery swaps, but this technology lost. Other EV startups and legacy car companies that followed Nissan and Tesla into making production EVs embraced the rechargeable lithium-ion battery that is meant to be refilled at a fast-charging station and is not designed to be easily removed from the vehicle. Buy an electric vehicle and you’re buying a big battery with a long warranty but no clear plan for replacement. The companies imagine their EVs as something like a smartphone: It’s far from impossible to replace the battery and give the car a new life, but most people won’t bother and will simply move on to a new car when they can’t take the limitations of their old one anymore.
I think about this impasse a lot. My 2019 Tesla Model 3 began its life with a nominal 240 miles of range. Now that the vehicle has nearly six years and 70,000 miles on it, its maximum range is down to just 200, while its functional range at highway speed is much less than that. I don’t want to sink money into another vehicle, which means living with an EV’s range that diminishes as the years go by.
But what if, one day, I replaced its battery? Even if it costs thousands of dollars to achieve, a big range boost via a new battery would make an older EV feel new again, and at a cost that’s still far less than financing a whole new car. The thought is even more compelling in the age of Trump-imposed tariffs that will raise already-expensive new vehicles to a place that’s simply out of reach for many people (though new battery units will be heavily tariffed, too).
This is no simple weekend task. Car enthusiasts have been swapping parts and modifying gas-burning vehicles since the dawn of the automotive age, but modern EVs aren’t exactly made with the garage mechanic in mind. Because so few EVs are on the road, there is a dearth of qualified mechanics and not a huge population of people with the savvy to conduct major surgery on an electric car without electrocuting themselves. A battery-replacing owner would need to acquire not only the correct pack but also potentially adapters and other equipment necessary to make the new battery play nice with the older car. Some Nissan Leaf modifiers are finding their replacement packs aren’t exactly the same size, shape or weight, The Autopian says, meaning they need things like spacers to make the battery sit in just the right place.
A new battery isn’t a fix-all either. The motors and other electrical components wear down and will need to be replaced eventually, too. A man in Norway who drove his Tesla more than a million miles has replaced at least four battery packs and 14 motors, turning his EV into a sort of car of Theseus.
Crucially, though, EVs are much simpler, mechanically, than combustion-powered cars, what with the latter’s belts and spark plugs and thousands of moving parts. The car that surrounds a depleted battery pack might be in perfectly good shape to keep on running for thousands of miles to come if the owner were to install a new unit, one that could potentially give the EV more driving range than it had when it was new.
The battery swap is still the domain of serious top-tier DIYers, and not for the mildly interested or faint of heart. But it is a sign of things to come. A market for very affordable used Teslas is booming as owners ditch their cars at any cost to distance themselves from Elon Musk. Old Leafs, Chevy Bolts and other EVs from the 2010s can be had for cheap. The generation of early vehicles that came with an unacceptably low 100 to 150 miles of range would look a lot more enticing if you imagine today’s battery packs swapped into them. The possibility of a like-new old EV will look more and more promising, especially as millions of Americans realize they can no longer afford a new car.
On the shifting energy mix, tariff impacts, and carbon capture
Current conditions: Europe just experienced its warmest March since record-keeping began 47 years ago • It’s 105 degrees Fahrenheit in India’s capital Delhi where heat warnings are in effect • The risk of severe flooding remains high across much of the Mississippi and Ohio Valleys.
The severe weather outbreak that has brought tornadoes, extreme rainfall, hail, and flash flooding to states across the central U.S. over the past week has already caused between $80 billion and $90 billion in damages and economic losses, according to a preliminary estimate from AccuWeather. The true toll is likely to be costlier because some areas have yet to report their damages, and the flooding is ongoing. “A rare atmospheric river continually resupplying a firehose of deep tropical moisture into the central U.S., combined with a series of storms traversing the same area in rapid succession, created a ‘perfect storm’ for catastrophic flooding and devastating tornadoes,” said AccuWeather’s chief meteorologist Jonathan Porter. The estimate takes into account damages to buildings and infrastructure, as well as secondary effects like supply chain and shipping disruptions, extended power outages, and travel delays. So far 23 people are known to have died in the storms. “This is the third preliminary estimate for total damage and economic loss that AccuWeather experts have issued so far this year,” the outlet noted in a release, “outpacing the frequency of major, costly weather disasters since AccuWeather began issuing estimates in 2017.”
Low-emission energy sources accounted for 41% of global electricity generation in 2024, up from 39.4% in 2023, according to energy think tank Ember’s annual Global Electricity Review. That includes renewables as well as nuclear. If nuclear is left out of the equation, renewables alone made up 32% of power generation last year. Overall, renewables added a record 858 terawatt hours, nearly 50% more than the previous record set in 2022. Hydro was the largest source of low-carbon power, followed by nuclear. But wind and solar combined overtook hydro last year, while nuclear’s share of the energy mix reached a 45-year low. More solar capacity was installed in 2024 than in any other single year.
The report notes that demand for electricity rose thanks to heat waves and air conditioning use. This resulted in a slight, 1.4% annual increase in fossil-fuel power generation and pushed power-sector emissions to a new all-time high of 14.5 billion metric tons. “Clean electricity generation met 96% of the demand growth not caused by hotter temperatures,” the report said.
President Trump’s new tariffs will have a “limited” effect on the amount of solar components the U.S. imports from Asia because the U.S. already imposes tariffs on these products, according to a report from research firm BMI. That said, the U.S. still relies heavily on imported solar cells, and the new fees are likely to raise costs for domestic manufacturers and developers, which will ultimately be passed on to buyers and could slow solar growth. “Since the U.S.’s manufacturing capacity is insufficient to meet demand for solar, wind, and grid components, we do expect that costs will increase for developers due to the tariffs which will now be imposed upon these components,” BMI wrote.
In other tariff news, the British government is adjusting its 2030 target of ending the sale of new internal combustion engine cars to ease some of the pain from President Trump’s new 25% auto tariffs. Under the U.K.’s new EV mandate, carmakers will be able to sell new hybrids through 2035 (whereas the previous version of the rules banned them by 2030), and gas and diesel vans can also be sold through 2035. The changes also carve out exemptions for luxury supercar brands like McLaren and Aston Martin, which will be allowed to keep selling new ICE vehicles beyond 2030 because, the government says, they produce so few. The goal is to “help ease the transition and give industry more time to prepare.” British Transport Secretary Heidi Alexander insisted the changes have been “carefully calibrated” and that their impact on carbon emissions is “negligible.” As The New York Times noted, the U.S. is the largest single-country export market for British cars.
The Environmental Protection Agency has approved Occidental Petroleum’s application to capture and sequester carbon dioxide at its direct air capture facility in Texas, and issued permits that will allow the company to drill and inject the gas more than one mile underground. The Stratos DAC plant is being developed by Occidental subsidiary 1PointFive. As Heatmap’s Katie Brigham has reported, Stratos is designed to remove up to 500,000 metric tons of CO2 annually and set to come online later this year. Its success (or failure) could shape the future of DAC investment at a time when the Trump administration is hollowing out the Department of Energy’s nascent Carbon Dioxide Removal team and casting doubt over the future of the DOE’s $3.5 billion Regional Direct Air Capture Hubs program. While Stratos is not a part of the hubs program, it will use the same technology as Occidental’s South Texas DAC hub.
The Bezos Earth Fund and the Global Methane Hub are launching a $27 million effort to fund research into selectively breeding cattle that emit less methane.