Is international cooperation or technological development the answer to an apocalyptic threat?

Christopher Nolan’s film Oppenheimer is about the great military contest of the Second World War, but only in the background. It’s really about a clash of visions for a postwar world defined by the physicist J. Robert Oppenheimer’s work at Los Alamos and beyond. The great power unleashed by the bombs at Hiroshima and Nagasaki could be dwarfed by what knowledge of nuclear physics could produce in the coming years, risking a war more horrifying than the one that had just concluded.
Oppenheimer, and many of his fellow atomic scientists, would spend much of the postwar period arguing for international cooperation, scientific openness, and nuclear restriction. But there was another cadre of scientists, exemplified by a former colleague turned rival, Edward Teller, that sought to answer the threat of nuclear annihilation with new technology — including even bigger bombs.
As the urgency of the nuclear question declined with the end of the Cold War, the scientific community took up a new threat to global civilization: climate change. While the conflict mapped out in Oppenheimer was over nuclear weapons, the clash of visions, which ended up burying Oppenheimer and elevating Teller, also maps onto the great debate over global warming: Should we reach international agreements to cooperatively reduce carbon emissions, or should we throw our — and specifically America’s — great resources into a headlong rush of technological development? Should we massively overhaul our energy system or make the sun a little less bright?
Oppenheimer’s dream of international cooperation to prevent a nuclear arms race was born even before the Manhattan Project culminated with the Trinity test. Oppenheimer and Danish physicist Niels Bohr “believed that an agreement between the wartime allies based upon the sharing of information, including the existence of the Manhattan Project, could prevent the surfacing of a nuclear-armed world,” writes Marco Borghi in a Wilson Center working paper.
Oppenheimer even suggested that the Soviets be informed of the Manhattan Project’s efforts and, according to Martin Sherwin and Kai Bird’s American Prometheus, had “assumed that such forthright discussions were taking place at that very moment” at the conference in Potsdam where, Oppenheimer “was later appalled to learn” that Harry Truman had only vaguely mentioned the bomb to Joseph Stalin, scotching the first opportunity for international nuclear cooperation.
Oppenheimer continued to take up the cause of international cooperation, serving as the lead advisor to Dean Acheson and David Lilienthal on their 1946 nuclear control proposal, which was never accepted by the United Nations, and in particular the Soviet Union, after Truman’s appointed U.N. representative Bernard Baruch amended it to be more favorable to the United States.
In view of the next 50 years of nuclear history — further proliferation, the development of thermonuclear weapons that could be mounted on missiles that were likely impossible to shoot down — the proposals Oppenheimer developed seem utopian: The U.N. would "bring under its complete control world supplies of uranium and thorium," including all mining, and would control all nuclear reactors. This scheme would also make the construction of new weapons impossible, lest other nations build their own.
By the end of 1946, the Baruch proposal had died, and with it any prospect of international control of nuclear power, even as the Soviets worked intensely to break America’s nuclear monopoly — with the help of information ferried out of Los Alamos — by successfully testing a weapon before the end of the decade.
With the failure of international arms control and the beginning of the arms race, Oppenheimer’s vision of a post-Trinity world collapsed. For Teller, however, it was a great opportunity.
While Oppenheimer planned to stave off nuclear annihilation through international cooperation, Teller was trying to build a bigger deterrent.
Since the early stages of the Manhattan Project, Teller had been dreaming of a fusion weapon many times more powerful than the first atomic bombs, what was then called the “Super.” When the atomic bomb was completed, he would again push for the creation of a thermonuclear bomb, but the efforts stalled thanks to technical and theoretical issues with Teller’s proposed design.
Nolan captures Teller’s early comprehension of just how powerful nuclear weapons can be. In a scene that’s pulled straight from accounts of the Trinity blast, most of the scientists who view the test are either in bunkers wearing welding goggles or following instructions to lie down, facing away from the blast. Not so for Teller. He lathers sunscreen on his face, straps on a pair of dark goggles, and views the explosion straight on, even pursing his lips as the explosion lights up the desert night brighter than the sun.
And it was that power — the sun’s — that Teller wanted to harness in pursuit of his “Super,” where a bomb’s power would be derived from fusing together hydrogen atoms, creating helium — and a great deal of energy. It would even use a fission bomb to help ignite the process.
Oppenheimer and several scientific luminaries, including Manhattan Project scientists Enrico Fermi and Isidor Rabi, opposed the bomb. In their official 1949 report advising the Atomic Energy Commission, they argued that the hydrogen bomb was infeasible, strategically useless, and potentially a weapon of “genocide.”
But by 1950, thanks in part to Teller and the advocacy of Lewis Strauss, a financier turned government official and the approximate villain of Nolan’s film, Harry Truman would sign off on a hydrogen bomb project, resulting in the 1952 “Ivy Mike” test, in which a bomb using a design from Teller and mathematician Stan Ulam vaporized the Pacific island of Elugelab with a blast about 700 times more powerful than the one that destroyed Hiroshima.
The success of the project re-ignited doubts around Oppenheimer’s well-known left-wing political associations in the years before the war and, thanks to scheming by Strauss, he was denied a renewed security clearance.
While several Manhattan Project scientists testified on his behalf, Teller did not, saying, “I thoroughly disagreed with him in numerous issues and his actions frankly appeared to me confused and complicated.”
It was the end of Oppenheimer’s public career. The New Deal Democrat had been eclipsed by Teller, who would become the scientific avatar of the Reagan Republicans.
For the next few decades, Teller stayed close to politicians, the military, and the media, exercising a great deal of influence over arms policy from the Lawrence Livermore National Laboratory, which he helped found, and his academic perch at the University of California.
He pooh-poohed the dangers of radiation, supported the building of more and bigger bombs that could be delivered by longer and longer range missiles, and opposed prohibitions on testing. When Dwight Eisenhower was considering a negotiated nuclear test ban, Teller faced off against future Nobel laureate and Manhattan Project alumnus Hans Bethe over whether nuclear tests could be hidden from detection by conducting them underground in a massive hole; the eventual 1963 test ban treaty would exempt underground testing.
As the Cold War settled into a nuclear standoff with both the United States and the Soviet Union possessing enough missiles and nuclear weapons to wipe out the other, Teller didn’t look to treaties, limitations, and cooperation to solve the problem of nuclear brinksmanship, but instead to space: He wanted to neutralize the threat of a Soviet first strike using x-ray lasers from space powered by nuclear explosions (he was again opposed by Bethe and the x-ray lasers never came to fruition).
He also notoriously dreamed up Project Plowshare, the civilian nuclear program that came close to nuking out a new harbor in northern Alaska and did attempt to extract gas in New Mexico and Colorado using nuclear explosions.
Yet, in perhaps the strangest turn of all, Teller also became something of a key figure in the history of climate change research, both in his relatively early awareness of the problem and the conceptual gigantism he brought to proposing to solve it.
Though publicly skeptical of climate change later in his life, Teller was thinking about the problem decades before James Hansen’s seminal 1988 congressional testimony.
The researcher and climate litigator Benjamin Franta made the startling archival discovery that Teller had given a speech at an oil industry event in 1959 where he warned “energy resources will run short as we use more and more of the fossil fuels,” and, after explaining the greenhouse effect, he said that “it has been calculated that a temperature rise corresponding to a 10 percent increase in carbon dioxide will be sufficient to melt the icecap and submerge New York … I think that this chemical contamination is more serious than most people tend to believe.”
Teller was also engaged with issues around energy and other “peaceful” uses of nuclear power. In response to concerns about the dangers of nuclear reactors, in the 1960s he began advocating that they be built underground, and by the early 1990s he proposed running those underground reactors automatically to avoid the human error he blamed for the disasters at Chernobyl and Three Mile Island.
While Teller was always happy to find collaborators to toss off an ingenious-if-extreme solution to a problem, there is a strain of “Tellerism,” both institutional and conceptual, that persists to this day in climate science and energy policy.
Nuclear science and climate science have long been intertwined, Stanford historian Paul Edwards writes: The “earliest global climate models relied on numerical methods very similar to those developed by nuclear weapons designers for solving the fluid dynamics equations needed to analyze shock waves produced in nuclear explosions.”
Where Teller comes in is in the role that Lawrence Livermore played in both its energy research and climate modeling. “With the Cold War over and research on nuclear weapons in decline, the national laboratories faced a quandary: What would justify their continued existence?” Edwards writes. The answer in many cases would be climate change, due to these labs’ ample collection of computing power, “expertise in numerical modeling of fluid dynamics, and their skills in managing very large data sets.”
One of those labs was Livermore, the institution founded by Teller, a leading center of climate and energy modeling and research since the late 1980s. “[Teller] was very enthusiastic about weather control,” early climate modeler Cecil “Chuck” Leith told Edwards in an oral history.
The Department of Energy writ large, which inherited much of the responsibilities of the Atomic Energy Commission, is now one of the lead agencies on climate change policy and energy research.
Which brings us to fusion.
It was Teller’s Lawrence Livermore National Laboratory that earlier this year successfully got more power out of a controlled fusion reaction than it put in — and it was Energy Secretary Jennifer Granholm who announced it, calling it the “holy grail” of clean energy development.
Teller’s journey with fusion mirrors the technology’s broader history: early cautious optimism followed by a realization that it would likely not be achieved soon. As early as 1958, he said in a speech that he had been discussing “controlled fusion” at Los Alamos and that “thermonuclear energy generation is possible,” although he admitted that “the problem is not quite easy,” and by 1987 he had given up on seeing it realized during his lifetime.
Still, what controlled fusion we do have at Livermore’s National Ignition Facility owes something to Teller and the technology he pioneered in the hydrogen bomb, according to physicist NJ Fisch.
While fusion is one infamous technological fix for the problem of clean and cheap energy production, Teller and the Livermore cadres were also a major influence on the development of solar geoengineering, the idea that global warming could be averted not by reducing the emissions of greenhouse gas into the atmosphere, but by making the sun less intense.
In a mildly trolling column for the Wall Street Journal in January 1998, Teller professed agnosticism on climate change (despite giving that speech to oil executives three decades prior) but proposed an alternative policy that would be “far less burdensome than even a system of market-allocated emissions permits”: solar geoengineering with “fine particles.”
The op-ed, placed in the conservative pages of the Wall Street Journal, was almost certainly an effort to oppose the recently signed Kyoto Protocol, but the ideas have persisted among thinkers and scientists whose engagement with environmental issues goes far beyond Teller’s own feelings about Al Gore and, by extension, the environmental movement as a whole (both of which were negative).
But his proposal would be familiar to the climate debates of today: particle emissions that would scatter sunlight and thus lower atmospheric temperatures. If climate change had to be addressed, Teller argued, “let us play to our uniquely American strengths in innovation and technology to offset any global warming by the least costly means possible.”
A paper he wrote with two colleagues that was an early call for spraying sulfates in the stratosphere also proposed “deploying electrically-conducting sheeting, either in the stratosphere or in low Earth orbit.” These were “literally diaphanous scattering screens” that could scatter enough sunlight to reduce global warming — one calculation Teller made concluded that 46 million square miles, or about 1 percent of the surface area of the Earth, of these screens would be necessary.
The climate scientist and Livermore alumnus Ken Caldeira has attributed his own initial interest in solar geoengineering to Lowell Wood, a Livermore researcher and Teller protégé. While often seen as a centrist or even right-wing idea meant to avoid more restrictive policies on carbon emissions, solar geoengineering has sparked some interest on the left, including in socialist science fiction author Kim Stanley Robinson’s The Ministry for the Future, which envisions India unilaterally pumping sulfates into the atmosphere in response to a devastating heat wave.
The White House even quietly released a congressionally-mandated report on solar geoengineering earlier this spring, outlining avenues for further research.
While the more than 30 years since the creation of the Intergovernmental Panel on Climate Change and the beginnings of the Kyoto Protocol have emphasized international cooperation on both science and policymaking through agreed-upon goals for emissions reductions, the technological temptation is always present.
And here we can perhaps see the split between the moralizing scientists, with their pleas to address the problems of the arms race through scientific openness and international cooperation, and the hawkish technicians, who wanted to press the United States’ technical advantage to win the nuclear standoff, and ultimately the Cold War, through deterrence.
With the IPCC and the United Nations Climate Conference, through which emerged the Kyoto Protocol and the Paris Agreement, we see a version of what the postwar scientists wanted applied to the problem of climate change. Nations come together and agree on targets for controlling something that may benefit any one of them but risks global calamity. The process is informed by scientists working with substantial resources across national borders who play a major role in formulating and verifying the policy mechanisms used to achieve these goals.
But for almost as long as climate change has been an issue of international concern, the Tellerian path has been tempting. While Teller’s dreams of massive sun-scattering sheets, nuclear earth engineering, and automated underground reactors are unlikely to be realized soon, if at all, you can be sure there are scientists and engineers looking straight into the light. And they may one day drag us into it, whether we want to or not.
Editor’s note: An earlier version of this article misstated the name of a climate modeler. It’s been corrected. We regret the error.
The proportion of voters who strongly oppose development grew by nearly 50%.
During his State of the Union address Tuesday night, President Donald Trump attempted to stanch the public’s bleeding support for building the data centers his administration says are necessary to beat China in the artificial intelligence race. With “many Americans” now “concerned that energy demand from AI data centers could unfairly drive up their electricity bills,” Trump said, he pledged to make major tech companies pay for new power plants to supply electricity to data centers.
New polling from energy intelligence platform Heatmap Pro shows just how dramatically and swiftly American voters are turning against data centers.
Earlier this month, the survey, conducted by Embold Research, reached out to 2,091 registered voters across the country, explaining that “data centers are facilities that house the servers that power the internet, apps, and artificial intelligence” and asking them, “Would you support or oppose a data center being built near where you live?” Just 28% said they would support or strongly support such a facility in their neighborhood, while 52% said they would oppose or strongly oppose it. That’s a net support of -24%.
When Heatmap Pro asked a national sample of voters the same question last fall, net support came out to +2%, with 44% in support and 42% opposed.
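The “net support” figures in these polls are simply the share in support minus the share opposed. A minimal sketch of that arithmetic, using the percentages reported here (the function name is ours, purely illustrative):

```python
def net_support(support_pct: int, oppose_pct: int) -> int:
    """Net support in percentage points: % support minus % oppose."""
    return support_pct - oppose_pct

# This month's Heatmap Pro / Embold Research poll (n = 2,091)
current = net_support(28, 52)    # -24 points
# Last fall's national sample
previous = net_support(44, 42)   # +2 points

print(f"current: {current:+d}, previous: {previous:+d}, swing: {current - previous:+d}")
```

The swing between the two surveys comes out to 26 points in under a year.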
The steep drop highlights a phenomenon Heatmap’s Jael Holzman described last fall — that data centers are “swallowing American politics,” as she put it, uniting conservation-minded factions of the left with anti-renewables activists on the right in opposing a common enemy.
The results of this latest Heatmap Pro poll aren’t an outlier, either. Poll after poll shows surging public antipathy toward data centers as populists at both ends of the political spectrum stoke outrage over rising electricity prices and tech giants struggle to coalesce around a single explanation of their impacts on the grid.
“The hyperscalers have fumbled the comms game here,” Emmet Penney, an energy researcher and senior fellow at the right-leaning Foundation for American Innovation, told me.
A historian of the nuclear power sector, Penney sees parallels between the grassroots pushback to data centers and the 20th century movement to stymie construction of atomic power stations across the Western world. In both cases, opponents fixated on and popularized environmental criticisms that were ultimately deemed minor relative to the benefits of the technology — production of radioactive waste in the case of nuclear plants, and as seems increasingly clear, water usage in the case of data centers.
Likewise, opponents of nuclear power saw urgent efforts to build out the technology in the face of Cold War competition with the Soviet Union as more reason for skepticism about safety. Ditto the current rhetoric on China.
Penney said that both data centers and nuclear power stoke a “fear of bigness.”
“Data centers represent a loss of control over everyday life because artificial intelligence means change,” he said. “The same is true about nuclear,” which reached its peak of expansion right as electric appliances such as dishwashers and washing machines were revolutionizing domestic life in American households.
One of the more fascinating findings of the Heatmap Pro poll is a stark urban-rural divide within the Republican Party. Net support for data centers among GOP voters who live in suburbs or cities came out to -8%. Opposition among rural Republicans was twice as deep, at -20%. While rural Democrats and independents showed more skepticism of data centers than their urbanite fellow partisans, the gap was far smaller.
That could represent a challenge for the Trump administration.
“People in the city are used to a certain level of dynamism baked into their lives just by sheer population density,” Penney said. “If you’re in a rural place, any change stands out.”
Senator Bernie Sanders, the democratic socialist from Vermont, has championed legislation to place a temporary ban on new data centers. Such a move would not be without precedent; Ireland, transformed by tax-haven policies over the past two decades into a hub for Silicon Valley’s giants, only just ended its de facto three-year moratorium on hooking up data centers to the grid.
Senator Josh Hawley, the Missouri Republican firebrand, proposed his own bill that would force data centers off the grid by requiring the complexes to build their own power plants, much as Trump is now promoting.
On the opposite end of the spectrum, you have Republicans such as Mississippi Governor Tate Reeves, who on Tuesday compared halting construction of data centers to “civilizational suicide.”
“I am tempted to sit back and let other states fritter away the generational chance to build. To laugh at their short-sightedness,” he wrote in a post on X. “But the best path for all of us would be to see America dominate, because our foes are not like us. They don’t believe in order, except brutal order under their heels. They don’t believe in prosperity, except for that gained through fraud and plunder. They don’t think or act in a way I can respect as an American.”
Then you have the actual hyperscalers taking opposite tacks. Amazon Web Services, for example, is playing offense, promoting research that shows its data centers are not increasing electricity rates. Claude-maker Anthropic, meanwhile, issued a de facto mea culpa, pledging earlier this month to offset all its electricity use.
Amid that scattershot messaging, the critical rhetoric appears to be striking its targets. Whether Trump’s efforts to curb data centers’ impact on the grid or Reeves’ stirring call to patriotic sacrifice can reverse cratering support for the buildout remains to be seen. The clock is ticking. There are just 36 weeks until the midterm Election Day.
The public-private project aims to help realize the president’s goal of building 10 new reactors by 2030.
The Department of Energy and the Westinghouse Electric Company have begun meeting with utilities and nuclear developers as part of a new project aimed at spurring the country’s largest buildout of new nuclear power plants in more than 30 years, according to two people who have been briefed on the plans.
The discussions suggest that the Trump administration’s ambitious plans to build a fleet of new nuclear reactors are moving forward at least in part through the Energy Department. President Trump set a goal last year of placing 10 new reactors under construction nationwide by 2030.
The project aims to purchase the parts for 8 gigawatts to 10 gigawatts of new nuclear reactors, the people said. The reactors would almost certainly be AP1000s, a third-generation reactor produced by Westinghouse capable of producing up to 1.1 gigawatts of electricity per unit.
The AP1000 is the only third-generation reactor successfully deployed in the United States. Two AP1000 reactors were completed — and powered on — at Plant Vogtle in eastern Georgia earlier this decade. Fifteen other units are operating or under construction worldwide.
Representatives from Westinghouse and the Energy Department did not respond to requests for comment.
The project would use government and private financing to buy advanced reactor equipment that requires particularly long lead times, the people said. It would seek to lower the cost of the reactors by placing what would essentially be a single bulk order for some of their parts, allowing Westinghouse to invest in and scale its production efforts. It could also speed up construction timelines for the plants themselves.
The department is in talks with four to five potential partners, including utilities, independent power producers, and nuclear development companies, about joining the project. Under the plan, these utilities or developers would agree to purchase parts for two new reactors each. The program would be handled in part by the department’s in-house bank, the Loan Programs Office, which the Trump administration has dubbed the Office of Energy Dominance Financing.
This fleet-based approach to nuclear construction has succeeded in the past. After the oil crisis struck France in the 1970s, the national government responded by planning more than three dozen reactors in roughly a decade, allowing the country to build them quickly and at low cost. France still has some of the world’s lowest-carbon electricity.
By comparison, the United States has built three new nuclear reactors, totaling roughly 3.5 gigawatts of capacity, since the year 2000, and it has not significantly expanded its nuclear fleet since 1990. The Trump administration set a goal in May to quadruple total nuclear energy production — which stands at roughly 100 gigawatts today — to more than 400 gigawatts by the middle of the century.
The Trump administration and congressional Republicans have periodically announced plans to expand the nuclear fleet over the past year, although details have been scant.
Senator Dave McCormick, a Republican of Pennsylvania, announced at an energy summit last July that Westinghouse was moving forward with plans to build 10 new reactors nationwide by 2030.
In October, Commerce Secretary Howard Lutnick announced a new deal between the U.S. government, the private equity firm Brookfield Asset Management, and the uranium company Cameco to deploy $80 billion in new Westinghouse reactors across the United States. (A Brookfield subsidiary and Cameco have jointly owned Westinghouse since it went bankrupt in 2017 due to construction cost overruns.) Reuters reported last month that this deal aimed to satisfy the Trump administration’s 2030 goal.
While there have been other Republican attempts to expand the nuclear fleet over the years, rising electricity demand and the boom in artificial intelligence data centers have brought new focus to the issue. This time, Democratic politicians have announced their own plans to boost nuclear power in their states.
In January, New York Governor Kathy Hochul set a goal of building 4 gigawatts of new nuclear power plants in the Empire State.
In his State of the State address, Governor JB Pritzker of Illinois told lawmakers last week that he hopes to see at least 2 gigawatts of new nuclear power capacity operating in his state by 2033.
Meeting Trump’s nuclear ambitions has been a source of contention between federal agencies. Politico reported on Thursday that the Energy Department had spent months negotiating a nuclear strategy with Westinghouse last year when Lutnick inserted himself directly into negotiations with the company. Soon after, the Commerce Department issued an announcement for the $80 billion megadeal, which was big on hype but short on details.
The announcement threw a wrench in the Energy Department’s plans, but the agency now seems to have returned to the table. According to Politico, it is now also “engaging” with GE Hitachi, another provider of advanced nuclear reactors.
On nuclear tax credits, BLM controversy, and a fusion maverick’s fundraise
Current conditions: A third storm could dust New York City and the surrounding area with more snow • Floods and landslides have killed at least 25 people in Brazil’s southeastern state of Minas Gerais • A heat dome in Western Europe is pushing up temperatures in parts of Portugal, Spain, and France as high as 15 degrees Celsius above average.

The Department of Energy’s in-house lender, the Loan Programs Office — dubbed the Office of Energy Dominance Financing by the Trump administration — just gave out the largest loan in its history to Southern Company. The nearly $27 billion loan will “build or upgrade over 16 gigawatts of firm reliable power,” including 5 gigawatts of new gas generation, 6 gigawatts of uprates and license renewals for six different reactors, and more than 1,300 miles of transmission and grid enhancement projects. In total, the package will “deliver $7 billion in electricity cost savings” to millions of ratepayers in Georgia and Alabama by reducing the utility giant’s interest expenses by over $300 million per year. “These loans will not only lower energy costs but also create thousands of jobs and increase grid reliability for the people of Georgia and Alabama,” Secretary of Energy Chris Wright said in a statement.
Over in Utah, meanwhile, the state government is seeking the authority to speed up its own deployment of nuclear reactors as electricity demand surges in the desert state. In a letter to the Nuclear Regulatory Commission dated November 10 — but which E&E News published this week — Tim Davis, the executive director of Utah’s Department of Environmental Quality, requested that the federal agency consider granting the state the power to oversee uranium enrichment, microreactor licensing, fuel storage, and reprocessing on its own. All of those sectors fall under the NRC’s exclusive purview. At least one program at the NRC grants states limited regulatory primacy for some low-level radiological material. While there’s no precedent for a transfer of power as significant as what Utah is requesting, the current administration is upending norms at the NRC more than any other government since the agency’s founding in 1975.
Building a new nuclear plant on a previously undeveloped site is already a steep challenge in electricity markets such as New York, California, or the Midwest, which broke up monopoly utilities in the 1990s and created competitive auctions that make decade-long, multibillion-dollar reactors all but impossible to finance. A growing chorus argues, as Heatmap’s Matthew Zeitlin wrote, that these markets “are no longer working.” Even in markets with vertically integrated power companies, financing a greenfield plant is nearly as difficult, despite federal tax credits meant to spur construction of new reactors. That’s the conclusion of a new analysis by a trio of government finance researchers at the Center for Public Enterprise. The investment tax credit, “large as it is, cannot easily provide them with upfront construction-period support,” the report found. “The ITC is essential to nuclear project economics, but monetizing it during construction poses distinct challenges for nuclear developers that do not arise for renewable energy projects. Absent a public agency’s ability to leverage access to the elective payment of tax credits, it is challenging to see a path forward for attracting sufficient risk capital for a new nuclear project under the current circumstances.”
Steve Pearce, Trump’s pick to lead the Department of the Interior’s Bureau of Land Management, wavered when asked about his record of pushing to sell off federal lands during his nomination hearing Wednesday. A former Republican lawmaker from New Mexico, Pearce has faced what the public lands news site Public Domain called “broad backlash from environmental, conservation, and hunting groups for his record of working to undermine public land protections and push land sales as a way to reduce the federal deficit.” Faced with questions from Democratic senators, Pearce said, “I’m not so sure that I’ve changed,” but insisted he didn’t “believe that we’re going to go out and wholesale land from the federal government.” That has, however, been the plan since the start of the administration. As Heatmap’s Jeva Lange wrote last year, Republicans looked poised to use their trifecta to sell off some of the approximately 640 million acres of land the federal government owns.
At Tuesday’s State of the Union address, as I told you yesterday, Trump vowed to force major data center companies to build, bring, or buy their own power plants to keep the artificial intelligence boom from driving up electricity prices. On Wednesday, Fox News reported that Amazon, Google, Meta, Microsoft, xAI, Oracle, and OpenAI planned to come to the White House to sign onto the deal. The meeting is set to take place sometime next month. Data centers are facing mounting backlash: developers abandoned at least 25 data center projects last year amid pushback from local opponents, Heatmap's Robinson Meyer recently reported.
Shine Technologies is a rare fusion company that’s actually making money today. That’s because the Wisconsin-based firm uses its plasma beam fusion technology to produce isotopes for testing and medical therapies. Next, the company plans to start recycling nuclear waste into fresh reactor fuel. To get there, Shine Technologies has raised $240 million to fund its efforts for the next few years, as I reported this morning in an exclusive for Heatmap. Nearly 63% of the funding came from biotech billionaire Patrick Soon-Shiong, who will join the board. The capital will carry the company through its launch as the world’s largest medical isotope producer and lay the foundations of a nuclear waste recycling business, slated for the early 2030s, that essentially reorders its existing assembly line.
Vineyard Wind is nearly complete. As of Wednesday, 60 of the project’s 62 turbines had been installed off the coast of Massachusetts. Of those, E&E News reported, 52 have been cleared to start producing power. The developer Iberdrola said the final two turbines may be installed in the next few days. “For me, as an engineer, the farm is already completed,” Iberdrola’s executive chair, Ignacio Sánchez Galán, told analysts on an earnings call. “I think these numbers mean the level of availability is similar for other offshore wind farms we have in operation. So for me, that is completed.”