Is international cooperation or technological development the answer to an apocalyptic threat?
Christopher Nolan’s film Oppenheimer is about the great military contest of the Second World War, but only in the background. It’s really about a clash of visions for a postwar world defined by the physicist J. Robert Oppenheimer’s work at Los Alamos and beyond. The great power unleashed by the bombs at Hiroshima and Nagasaki could be dwarfed by what knowledge of nuclear physics could produce in the coming years, risking a war more horrifying than the one that had just concluded.
Oppenheimer, and many of his fellow atomic scientists, would spend much of the postwar period arguing for international cooperation, scientific openness, and nuclear restriction. But there was another cadre of scientists, exemplified by a former colleague turned rival, Edward Teller, that sought to answer the threat of nuclear annihilation with new technology — including even bigger bombs.
As the urgency of the nuclear question declined with the end of the Cold War, the scientific community took up a new threat to global civilization: climate change. While the conflict mapped out in Oppenheimer was over nuclear weapons, the clash of visions, which ended up burying Oppenheimer and elevating Teller, also maps onto the great debate over global warming: Should we reach international agreements to cooperatively reduce carbon emissions, or should we throw our — and specifically America’s — great resources into a headlong rush of technological development? Should we massively overhaul our energy system or make the sun a little less bright?
Oppenheimer’s dream of international cooperation to prevent a nuclear arms race was born even before the Manhattan Project culminated with the Trinity test. Oppenheimer and Danish physicist Niels Bohr “believed that an agreement between the wartime allies based upon the sharing of information, including the existence of the Manhattan Project, could prevent the surfacing of a nuclear-armed world,” writes Marco Borghi in a Wilson Center working paper.
Oppenheimer even suggested that the Soviets be informed of the Manhattan Project’s efforts and, according to Martin Sherwin and Kai Bird’s American Prometheus, had “assumed that such forthright discussions were taking place at that very moment” at the conference in Potsdam, where, Oppenheimer “was later appalled to learn,” Harry Truman had only vaguely mentioned the bomb to Joseph Stalin, scotching the first opportunity for international nuclear cooperation.
Oppenheimer continued to take up the cause of international cooperation, working as the lead advisor to Dean Acheson and David Lilienthal on their 1946 nuclear control proposal, which was never accepted by the United Nations, chiefly because the Soviet Union rejected it after it was amended by Truman’s appointed U.N. representative Bernard Baruch to be more favorable to the United States.
In view of the next 50 years of nuclear history — further proliferation, the development of thermonuclear weapons that could be mounted on missiles that were likely impossible to shoot down — the proposals Oppenheimer developed seem utopian: The U.N. would "bring under its complete control world supplies of uranium and thorium," including all mining, and would control all nuclear reactors. The scheme would also have to make the construction of new weapons impossible; otherwise, nations would simply build their own.
By the end of 1946, the Baruch proposal had died, and with it any prospect of international control of nuclear power, while the Soviets worked intensely to break America’s nuclear monopoly — with the help of information ferried out of Los Alamos — and would successfully test a weapon before the end of the decade.
With the failure of international arms control and the beginning of the arms race, Oppenheimer’s vision of a post-Trinity world fell apart. For Teller, however, it was a great opportunity.
While Oppenheimer planned to stave off nuclear annihilation through international cooperation, Teller was trying to build a bigger deterrent.
Since the early stages of the Manhattan Project, Teller had been dreaming of a fusion weapon many times more powerful than the first atomic bombs, what was then called the “Super.” When the atomic bomb was completed, he would again push for the creation of a thermonuclear bomb, but the efforts stalled thanks to technical and theoretical issues with Teller’s proposed design.
Nolan captures Teller’s early comprehension of just how powerful nuclear weapons can be. In a scene that’s pulled straight from accounts of the Trinity blast, most of the scientists who view the test are either in bunkers wearing welding goggles or following instructions to lie down, facing away from the blast. Not so for Teller. He lathers sunscreen on his face, straps on a pair of dark goggles, and views the explosion straight on, even pursing his lips as the explosion lights up the desert night brighter than the sun.
And it was that power — the sun’s — that Teller wanted to harness in pursuit of his “Super,” where a bomb’s power would be derived from fusing together hydrogen atoms, creating helium — and a great deal of energy. It would even use a fission bomb to help ignite the process.
Oppenheimer and several scientific luminaries, including Manhattan Project scientists Enrico Fermi and Isidor Rabi, opposed the bomb; in their official 1949 report advising the Atomic Energy Commission, they called the hydrogen bomb infeasible, strategically useless, and potentially a weapon of “genocide.”
But by 1950, thanks in part to Teller and the advocacy of Lewis Strauss, a financier turned government official and the approximate villain of Nolan’s film, Harry Truman would sign off on a hydrogen bomb project, resulting in the 1952 “Ivy Mike” test, in which a bomb using a design from Teller and mathematician Stan Ulam vaporized the Pacific island of Elugelab with a blast about 700 times more powerful than the one that destroyed Hiroshima.
The success of the project reignited doubts about Oppenheimer’s well-known left-wing political associations in the years before the war, and, thanks to scheming by Strauss, he was denied a renewed security clearance.
While several Manhattan Project scientists testified on his behalf, Teller did not, saying, “I thoroughly disagreed with him in numerous issues and his actions frankly appeared to me confused and complicated.”
It was the end of Oppenheimer’s public career. The New Deal Democrat had been eclipsed by Teller, who would become the scientific avatar of the Reagan Republicans.
For the next few decades, Teller would stay close to politicians, the military, and the media, exercising a great deal of influence over arms policy from the Lawrence Livermore National Laboratory, which he helped found, and from his academic perch at the University of California.
He pooh-poohed the dangers of radiation, supported the building of more and bigger bombs that could be delivered by longer and longer range missiles, and opposed prohibitions on testing. When Dwight Eisenhower was considering a negotiated nuclear test ban, Teller faced off against future Nobel laureate and Manhattan Project alumnus Hans Bethe over whether nuclear tests could be hidden from detection by conducting them underground in a massive hole; the eventual 1963 test ban treaty would exempt underground testing.
As the Cold War settled into a nuclear standoff with both the United States and the Soviet Union possessing enough missiles and nuclear weapons to wipe out the other, Teller didn’t look to treaties, limitations, and cooperation to solve the problem of nuclear brinksmanship, but instead to space: He wanted to neutralize the threat of a Soviet first strike using x-ray lasers from space powered by nuclear explosions (he was again opposed by Bethe and the x-ray lasers never came to fruition).
He also notoriously dreamed up Project Plowshare, the civilian nuclear program that came close to blasting out a new harbor in northern Alaska and actually did attempt to extract gas in New Mexico and Colorado using nuclear explosions.
Yet, in perhaps the strangest turn of all, Teller also became something of a key figure in the history of climate change research, both in his relatively early awareness of the problem and the conceptual gigantism he brought to proposing to solve it.
While publicly skeptical of climate change later in his life, Teller was thinking about the problem decades before James Hansen’s seminal 1988 congressional testimony.
The researcher and climate litigator Benjamin Franta made the startling archival discovery that Teller had given a speech at an oil industry event in 1959 in which he warned that “energy resources will run short as we use more and more of the fossil fuels” and, after explaining the greenhouse effect, said that “it has been calculated that a temperature rise corresponding to a 10 percent increase in carbon dioxide will be sufficient to melt the icecap and submerge New York … I think that this chemical contamination is more serious than most people tend to believe.”
Teller was also engaged with issues around energy and other “peaceful” uses of nuclear power. In response to concerns about the dangers of nuclear reactors, he began advocating in the 1960s for putting them underground, and by the early 1990s he proposed running those underground reactors automatically to avoid the human error he blamed for the disasters at Chernobyl and Three Mile Island.
While Teller was always happy to find collaborators and toss off an ingenious-if-extreme solution to a problem, there is a strain of “Tellerism,” both institutional and conceptual, that persists to this day in climate science and energy policy.
Nuclear science and climate science have long been intertwined, Stanford historian Paul Edwards writes: the “earliest global climate models relied on numerical methods very similar to those developed by nuclear weapons designers for solving the fluid dynamics equations needed to analyze shock waves produced in nuclear explosions.”
Where Teller comes in is the role that Lawrence Livermore played in both energy research and climate modeling. “With the Cold War over and research on nuclear weapons in decline, the national laboratories faced a quandary: What would justify their continued existence?” Edwards writes. The answer in many cases would be climate change, thanks to these labs’ ample computing power, “expertise in numerical modeling of fluid dynamics, and their skills in managing very large data sets.”
One of those labs was Livermore, the institution founded by Teller, a leading center of climate and energy modeling and research since the late 1980s. “[Teller] was very enthusiastic about weather control,” early climate modeler Cecil “Chuck” Leith told Edwards in an oral history.
The Department of Energy writ large, which inherited much of the responsibilities of the Atomic Energy Commission, is now one of the lead agencies on climate change policy and energy research.
Which brings us to fusion.
It was Teller’s Lawrence Livermore National Laboratory that earlier this year successfully got more power out of a controlled fusion reaction than it put in — and it was Energy Secretary Jennifer Granholm who announced it, calling it the “holy grail” of clean energy development.
Teller’s own journey with fusion mirrors the field’s history: early cautious optimism followed by the realization that it would likely not be achieved soon. As early as 1958, he said in a speech that he had been discussing “controlled fusion” at Los Alamos and that “thermonuclear energy generation is possible,” although he admitted that “the problem is not quite easy”; by 1987 he had given up on seeing it realized during his lifetime.
Still, what controlled fusion we do have at Livermore’s National Ignition Facility owes something to Teller and the technology he pioneered in the hydrogen bomb, according to physicist NJ Fisch.
While fusion is one infamous technological fix for the problem of clean and cheap energy production, Teller and the Livermore cadres were also a major influence on the development of solar geoengineering, the idea that global warming could be averted not by reducing the emissions of greenhouse gas into the atmosphere, but by making the sun less intense.
In a mildly trolling column for the Wall Street Journal in January 1998, Teller professed agnosticism on climate change (despite giving that speech to oil executives three decades prior) but proposed an alternative policy that would be “far less burdensome than even a system of market-allocated emissions permits”: solar geoengineering with “fine particles.”
The op-ed, placed in the conservative pages of the Wall Street Journal, was almost certainly an effort to oppose the recently signed Kyoto Protocol, but the ideas have persisted among thinkers and scientists whose engagement with environmental issues goes far beyond their opinion of Al Gore or, by extension, the environmental movement as a whole (Teller’s feelings about both were negative).
But his proposal will be familiar to anyone following the climate debates of today: particle emissions that would scatter sunlight and thus lower atmospheric temperatures. If climate change had to be addressed, Teller argued, “let us play to our uniquely American strengths in innovation and technology to offset any global warming by the least costly means possible.”
A paper he wrote with two colleagues, an early call for spraying sulfates in the stratosphere, also proposed “deploying electrically-conducting sheeting, either in the stratosphere or in low Earth orbit.” These were “literally diaphanous scattering screens” that could scatter enough sunlight to reduce global warming — one calculation Teller made concluded that 46 million square miles of these screens, or about 1 percent of the surface area of the Earth, would be necessary.
The climate scientist and Livermore alumnus Ken Caldeira has attributed his own initial interest in solar geoengineering to Lowell Wood, a Livermore researcher and Teller protégé. While often seen as a centrist or even right-wing idea, a way to head off more restrictive policies on carbon emissions, solar geoengineering has sparked some interest on the left, including in socialist science fiction author Kim Stanley Robinson’s The Ministry for the Future, which envisions India unilaterally pumping sulfates into the atmosphere in response to a devastating heat wave.
The White House even quietly released a congressionally-mandated report on solar geoengineering earlier this spring, outlining avenues for further research.
While the more than 30 years since the creation of the Intergovernmental Panel on Climate Change and the beginnings of the Kyoto Protocol have emphasized international cooperation on both science and policymaking, through agreed-upon goals for emissions reductions, the technological temptation is always present.
And here we can perhaps see the echo of that split: between the moralist scientists, with their pleas to address the problems of the arms race through scientific openness and international cooperation, and the hawkish technicians, who wanted to press the United States’ technical advantage to win the nuclear standoff, and ultimately the Cold War, through deterrence.
With the IPCC and the United Nations Climate Conference, through which emerged the Kyoto Protocol and the Paris Agreement, we see a version of what the postwar scientists wanted applied to the problem of climate change. Nations come together and agree on targets for controlling something that may benefit any one of them but risks global calamity. The process is informed by scientists working with substantial resources across national borders who play a major role in formulating and verifying the policy mechanisms used to achieve these goals.
But for almost as long as climate change has been an issue of international concern, the Tellerian path has been tempting. While Teller’s dreams of massive sun-scattering sheets, nuclear earth engineering, and automated underground reactors are unlikely to be realized soon, if at all, you can be sure there are scientists and engineers looking straight into the light. And they may one day drag us into it, whether we want to or not.
Editor’s note: An earlier version of this article misstated the name of a climate modeler. It’s been corrected. We regret the error.
The Bureau of Land Management says it will be heavily scrutinizing transmission lines if they are expressly necessary to bring solar or wind energy to the power grid.
Since the beginning of July, I’ve been reporting out how the Trump administration has all but halted progress for solar and wind projects on federal lands through a series of orders issued by the Interior Department. But last week, I explained it was unclear whether transmission lines that connect to renewable energy projects would be subject to the permitting freeze. I also identified a major transmission line in Nevada – the north branch of NV Energy’s Greenlink project – as a crucial test case for the future of transmission siting in federal rights-of-way under Trump. Greenlink would cross a litany of federal solar leases and has been promoted as “essential to helping Nevada achieve its de-carbonization goals and increased renewable portfolio standard.”
Well, BLM has now told me Greenlink North will still proceed despite a delay made public shortly after permitting was frozen for renewables, and that the agency still expects to publish the record of decision for the line in September.
This is possible because, as BLM told me, transmission projects that bring solar and wind power to the grid will be subject to heightened scrutiny. In an exclusive statement, BLM press secretary Brian Hires told me via e-mail that a secretarial order choking out solar and wind permitting on federal lands will require “enhanced environmental review for transmission lines only when they are a part of, and necessary for, a wind or solar energy project.”
However, if a transmission project is not expressly tied to wind or solar, or is not required for those projects to be constructed, then, apparently, it can still get a federal green light. In the case of Greenlink, for instance, the project itself is not explicitly tied to any single solar or wind development but is more like a transmission highway running alongside many potential future solar projects. So a power line can get approved if it could one day connect to wind or solar, but the line’s purpose cannot be solely to serve a wind or solar project.
This is different from, say, lines tied explicitly to connecting a wind or solar project to an existing transmission network. Known as gen-tie lines, these will definitely face hardship under this federal government. That explains why, for example, BLM has yet to approve the gen-tie line that would connect the Lucky Star wind project in Wyoming to the grid.
At the same time, it appears projects may be given a wider berth if a line has other reasons for existing, like improving resilience on the existing grid, or can be flexibly used by not just renewables but also fossil energy.
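To make the apparent rule concrete, here’s a minimal sketch of the logic as I understand it from BLM’s statement (the function and fields are hypothetical illustrations of my reading, not anything in an actual Interior Department document):

```python
# A hypothetical sketch of the permitting logic described above; the names and
# fields are my own, not drawn from any official BLM or Interior document.
from dataclasses import dataclass

@dataclass
class TransmissionLine:
    part_of_wind_or_solar_project: bool  # is the line a component of a specific wind or solar project?
    required_for_that_project: bool      # is it necessary for that project to be built?
    serves_other_purposes: bool          # grid resilience, fossil generation, etc. (context, not part of the stated test)

def faces_enhanced_review(line: TransmissionLine) -> bool:
    """BLM's stated test: enhanced review applies only when a line is part of,
    and necessary for, a wind or solar energy project."""
    return line.part_of_wind_or_solar_project and line.required_for_that_project

# Greenlink North, as described above: a "transmission highway" not tied to any single project.
greenlink_north = TransmissionLine(False, False, True)
print(faces_enhanced_review(greenlink_north))  # False -> can still get a federal green light

# A gen-tie line built solely to connect one wind project to the grid.
gen_tie = TransmissionLine(True, True, False)
print(faces_enhanced_review(gen_tie))  # True -> heightened scrutiny
```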
So, the lesson to me is that if you’re trying to build transmission infrastructure across federal property under this administration, you might want to be a little more … vague.
Tech companies, developers, and banks are converging behind “flexible loads.”
Electricity prices are up by over 5% so far this year — more than twice the overall rate of inflation — while utilities have proposed $29 billion worth of rate hikes, compared to $12 billion last year, according to the electricity policy research group PowerLines. At the same time, new data centers are sprouting up everywhere as tech giants try to outpace each other — and their Chinese rivals — in the race to develop ever more advanced (and energy-hungry) artificial intelligence systems, with hundreds of billions of dollars of new investment still in the pipeline.
You see the problem here?
In the PJM Interconnection, America’s largest electricity market, which includes Virginia’s “data center alley” in its 13-state territory, some 30 gigawatts of a projected 32 gigawatts of total load growth through 2030 are expected to come from data centers.
“The onrush of demand has created significant upward pricing pressure and has raised future resource adequacy concerns,” David Mills, the chair of PJM’s board of managers, said in a letter last week announcing the beginning of a process to look into the issues raised by large load interconnection — i.e. getting data centers on the grid without exploding costs for other users of the grid or risking blackouts.
Customers in PJM are paying the price already, as increasingly scarce capacity has translated into upward-spiraling payments to generators, which then show up on retail electricity bills. New large loads can raise costs still further by requiring grid upgrades to accommodate the increased demand for power — costs that get passed down to all ratepayers. PJM alone has announced over $10 billion in transmission upgrades, according to research by Johns Hopkins scholar Abraham Silverman. “These new costs are putting significant upward pressure on customer bills,” Silverman wrote in a report with colleagues Suzanne Glatz and Mahala Lahvis, released in June.
“There’s increasing recognition that the path we’re on right now is not long-term sustainable,” Silverman told me when we spoke this week about the report. “Costs are increasing too fast. The amount of infrastructure we need to build is too much. We need to prioritize, and we need to make this data center expansion affordable for consumers. Right now it’s simply not. You can’t have multi-billion-dollar rate increases year over year.”
While it’s not clear precisely what role existing data center construction has played in electricity bill increases on a nationwide scale, rising electricity rates will likely become a political problem wherever and whenever they do hit, with data centers being the most visible manifestation of the pressures on the grid.
Charles Hua, the founder and executive director of PowerLines, called data centers “arguably the most important topic in energy,” but cautioned that outside of specific demonstrable instances (e.g. in PJM), linking them to utility rate increases can be “a very oversimplified narrative.” The business model for vertically integrated utilities can incentivize them to over-invest in local transmission, Hua pointed out. And even without new data center construction, the necessity of replacing and updating an aging grid would remain.
Still, the connection between large new sources of demand and higher prices is pretty easy to draw: Electricity grids are built to accommodate peak demand, and the bills customers receive combine the fixed cost of maintaining the grid for everyone with the cost of the energy itself. Higher peak demand and more grid maintenance therefore mean higher bills.
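To put rough numbers on that, here’s a toy back-of-the-envelope model (every figure in it is invented for illustration, and real rate-making is vastly more complicated) showing how a higher system peak pushes up an average per-kilowatt-hour rate even when total energy sales stay flat:

```python
# Toy model only: every number here is invented, and the tariff structure is
# deliberately oversimplified compared with real rate-making.

def average_retail_rate_cents_per_kwh(
    peak_demand_gw: float,
    annual_energy_twh: float,
    fixed_cost_per_gw_of_peak: float = 150e6,  # assumed $ per GW of peak capacity per year
    energy_cost_per_mwh: float = 40.0,         # assumed average wholesale energy cost, $/MWh
) -> float:
    """Spread (fixed grid costs + energy costs) over all the energy sold."""
    fixed_costs = peak_demand_gw * fixed_cost_per_gw_of_peak       # $/year to build and maintain a peak-sized grid
    energy_costs = annual_energy_twh * 1e6 * energy_cost_per_mwh   # TWh -> MWh, then dollars
    total_kwh_sold = annual_energy_twh * 1e9                       # TWh -> kWh
    return (fixed_costs + energy_costs) / total_kwh_sold * 100     # cents per kWh

# Hold energy sales flat to isolate the effect of a 10% higher system peak:
print(average_retail_rate_cents_per_kwh(peak_demand_gw=150, annual_energy_twh=800))  # ~6.8 cents/kWh
print(average_retail_rate_cents_per_kwh(peak_demand_gw=165, annual_energy_twh=800))  # ~7.1 cents/kWh
```

The specific numbers don’t matter; the structure does: the fixed costs of a peak-sized grid get spread over every kilowatt-hour sold, so anything that raises the peak tends to raise the rate.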
But what if data centers could use the existing transmission and generation system and not add to peak generation? That’s the promise of load flexibility.
If data centers could commit to not requiring power at times of extremely high demand, they could essentially piggyback on existing grid infrastructure. Widely cited research by Tyler Norris, Tim Profeta, Dalia Patino-Echeverri, and Adam Cowie-Haskell of Duke University demonstrated that curtailing large loads for as little as 0.5% of their annual uptime (177 hours of curtailment annually on average, with curtailment typically lasting just over two hours) could allow almost 100 gigawatts of new demand to connect to the grid without requiring extensive, costly upgrades.
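One thing worth spelling out: curtailment usually means trimming a big load for a few hours, not switching it off. Here’s a quick back-of-the-envelope sketch (my own illustrative arithmetic, not the Duke team’s model; the 25% average reduction is an assumed number) of how a couple hundred hours of curtailment can translate into only a fraction of a percent of annual energy:

```python
# My own back-of-the-envelope arithmetic, not the Duke team's model: relating
# hours of curtailment to the share of annual energy a flexible load gives up.

HOURS_PER_YEAR = 8760

def energy_curtailed_share(curtailed_hours: float, avg_depth: float) -> float:
    """Fraction of annual energy curtailed by a load that would otherwise run flat out.

    curtailed_hours: hours per year the load is asked to cut back
    avg_depth: average fraction of the load shed during those hours (1.0 = full shutoff)
    """
    return (curtailed_hours / HOURS_PER_YEAR) * avg_depth

# 177 hours a year, the average cited above, is about 2% of the hours in a year
# if the load shuts off entirely, but only about 0.5% of annual energy if the
# load is trimmed by a quarter (an assumed figure) instead.
print(f"full shutoff: {energy_curtailed_share(177, 1.0):.2%}")   # ~2.02%
print(f"25% trim:     {energy_curtailed_share(177, 0.25):.2%}")  # ~0.51%
```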
The groundswell behind flexibility has rapidly gained institutional credibility. Last week, Google announced that it had reached deals with two utilities, Indiana Michigan Power and the Tennessee Valley Authority, to incorporate flexibility into how their data centers run. The Indiana Michigan Power contract will “allow [Google] to reduce or shift electricity demand to carry out non-urgent tasks during hours when the electric grid is under less stress,” the utility said.
Google has long been an innovator in energy procurement — it famously pioneered the power purchase agreement structure that has helped finance many a renewable energy development — and already has its fingers in many pots when it comes to grid flexibility. The company’s chief scientist, Jeff Dean, is an investor in Emerald AI, a software company that promises to help data centers work flexibly, while its urbanism-focused spinout Sidewalk Infrastructure Partners has backed Verrus, a demand-flexible data center developer.
Hyperscale developers aren’t the only big fish excited about data center flexibility. Financiers are, as well.
Goldman Sachs released a splashy report this week that cited Norris extensively (plus Heatmap). Data center flexibility promises to be a win-win-win, according to Goldman (which, of course, would love to finance an AI boom unhindered by higher retail electricity rates or long interconnection queues for new generation). “What if, thanks to curtailment, instead of overwhelming the grid, AI data centers became the shock absorbers that finally unlocked this stranded capacity?” the report asks.
For developers, the holy grail of flexibility is not just saving money on electricity, which is a small cost compared to procuring the advanced chips needed to train and run AI models. The real win would be to build new data centers faster. “Time to market is critical for AI companies,” the Goldman analysts wrote.
But creating a system where data centers can connect to the grid sooner if they promise to be flexible about power consumption would require immense institutional change for states, utilities, regulators, and power markets.
“We really don’t have existing service tiers in place for most jurisdictions that acknowledges and incentivizes flexible loads and plans around them,” Norris told me.
When I talked to Silverman, he told me that integrating flexibility into local decision-making could mean rewriting state utility regulations to allow a special pathway for data centers. It could also involve making local or state tax incentives contingent on flexibility.
Whatever the new structure looks like, the point is to “enshrine a policy that says, ‘data centers are different,’ and we are going to explicitly recognize those differences and tailor rules to data centers,” Silverman said. He pointed specifically to a piece of legislation in New Jersey that he consulted on, which would have utilities and regulators work together to come up with specific rate structures for data centers.
Norris also pointed to a proposal in the Southwest Power Pool, the grid operator whose footprint runs down the spine of the country from the Dakotas to Louisiana, that would allow large loads like data centers to connect to the grid quickly “with the tradeoff of potential curtailment during periods of system stress to protect regional reliability,” the transmission organization said.
And there’s still more legal and regulatory work to be done before hyperscalers can take full advantage of those incentives, Norris told me. Utilities and their data center customers would have to come up with a rate structure that incorporates flexibility and faster interconnection, where more flexibility can allow for quicker timelines.
Speed is of the essence — not just to be able to link up more data centers, but also to avoid a political firestorm around rising electricity rates. There’s already a data center backlash brewing: The city of Tucson earlier this month rejected an Amazon facility in a unanimous city council vote, taken in front of a raucous, cheering crowd. Communities in Indiana, a popular location for data center construction, have rejected several projects.
The drama around PJM may be a test case for the rest of the country. After its 2024 capacity auction came in at $15 billion, up from just over $2 billion the year before, complaints from Pennsylvania Governor Josh Shapiro led to a price cap on future auctions. PJM’s chief executive said in April that he would resign by the end of this year. A few months later, PJM’s next capacity auction hit the price cap.
“You had every major publication writing that AI data centers are causing electricity prices to spike” after the PJM capacity auction, Norris told me. “They lost that public relations battle.”
With more flexibility, there’s a chance for data center developers to tell a more positive story about how they affect the grid.
“It’s not just about avoiding additional costs,” Norris said. “There’s this opportunity that if you can mitigate additional cost, you can put downward cost on rates.” That’s almost putting things generously — data center developers might not have a choice.
On a billion-dollar mineral push, the north’s grim milestones, and EV charging’s comeback
Current conditions: The Southeastern U.S. is facing flash floods through the end of Thursday • Temperatures in Fez, Morocco, are forecast to hit 108 degrees Fahrenheit • Wildfires continue to rage across southern Europe, sending what Spain’s environment minister called a “clear warning” of the effects of climate change.
President Donald Trump on Wednesday named David Rosner, a centrist Democrat, as the new chairman of the Federal Energy Regulatory Commission. Since joining the commission in June 2024, Rosner has focused the panel on the nation’s growing electricity demand from data centers and has pushed for greater automation of the engineering process that connects power plants to the continent’s various grid systems. “Getting grid interconnection moving faster is essential to ensuring reliability,” Rosner told E&E News in March. “We’re starting to learn about these new tools and platforms that just make this work faster, smarter, saves us time, solves the reliability and affordability problems that are facing the country.”
The Bipartisan Policy Center, where Rosner previously worked as a staffer, hailed his promotion as a positive step. “Chairman Rosner’s strong understanding of the energy challenges facing our country, and demonstrated record of bipartisan work to address those challenges, make him well-suited to carry out the responsibilities of FERC chairman,” David R. Hill, the executive vice president of the group’s energy program, said in a statement.
Lithium production in Chile's Atacama Desert, one of the world's main sources. John Moore/Getty Images
The Energy Department announced at least $925 million in funding for five proposed programs to bolster the country’s domestic supply of minerals. “For too long, the United States has relied on foreign actors to supply and process the critical materials that are essential to modern life and our national security,” Secretary of Energy Chris Wright said in a press release. “The Energy Department will play a leading role in reshoring the processing of critical materials and expanding our domestic supply of these indispensable resources.”
That funding includes:
The Trump administration has made bolstering America’s critical minerals industry one of its signature energy policy priorities. Though, as Heatmap’s Matthew Zeitlin has written, it has also gone out of its way to annihilate sources of domestic demand for these minerals, especially in the wind energy and electric vehicle industries.
In Alaska, an overflowing glacial lake north of Juneau sent the Mendenhall River surging to a record height, flooding the state’s capital city. The problem has been growing for years as climate change in the nation’s most rapidly warming state increases the volume of ice melt. In 2023, floodwaters eroded the Mendenhall’s banks, causing homes to collapse, according to the Alaska Beacon. In 2024, the news outlet reported, “the flood was the worst yet.” This week’s flood peaked Wednesday afternoon at nearly 17 feet, damaging hundreds of homes.
Across the border, meanwhile, the more than 700 active fires blazing in Canada have already made this the country’s second-worst fire season on record. The largest fire, the Shoe fire in Saskatchewan, has been burning across 1.4 million acres — an area larger than the Grand Canyon National Park in Arizona — since May 7, The New York Times reported.
In an executive order on his first day back in office, Trump singled out the $5 billion National Electric Vehicle Infrastructure program, directing his Department of Transportation to pause and review the funding, with an eye toward cutting it entirely. Earlier this week, the Federal Highway Administration completed its review and issued new guidance that, as my colleague Emily Pontecorvo wrote yesterday, “not only preserves it, but also purports to ‘streamline applications,’ ‘slash red tape,’ and ‘ensure charging stations are actually built.’”
“If Congress is requiring the federal government to support charging stations, let’s cut the waste and do it right,” Transportation Secretary Sean Duffy said in a press release. “While I don’t agree with subsidizing green energy, we will respect Congress’ will and make sure this program uses federal resources efficiently.” The statement, Emily noted, is out of sync with the administration’s other actions to throttle the adoption of EVs: “Only time will tell whether the new guidance is truly a win for EV charging, however. It’s a win in the sense that many EV advocates feared the agency would try to kill the program or insert poison pills into the guidance. But it’s unclear whether the changes will speed up NEVI deployment beyond what might have happened had it not been paused.”
A researcher has designed a new centimeter-square device that could help probe the “ignorosphere,” a layer of ultra-thin air that has largely escaped exploration by balloons, aircraft and satellites. The contraption uses technology similar to a weathervane encased in a low-pressure chamber that will spin when exposed to light. “You don’t really believe it until you see it,” Ben Schafer, a physicist at Harvard University in Cambridge, Massachusetts, told Nature.