Is international cooperation or technological development the answer to an apocalyptic threat?
Christopher Nolan’s film Oppenheimer is about the great military contest of the Second World War, but only in the background. It’s really about a clash of visions for a postwar world defined by the physicist J. Robert Oppenheimer’s work at Los Alamos and beyond. The great power unleashed by the bombs at Hiroshima and Nagasaki could be dwarfed by what knowledge of nuclear physics could produce in the coming years, risking a war more horrifying than the one that had just concluded.
Oppenheimer, and many of his fellow atomic scientists, would spend much of the postwar period arguing for international cooperation, scientific openness, and nuclear restriction. But there was another cadre of scientists, exemplified by a former colleague turned rival, Edward Teller, that sought to answer the threat of nuclear annihilation with new technology — including even bigger bombs.
As the urgency of the nuclear question declined with the end of the Cold War, the scientific community took up a new threat to global civilization: climate change. While the conflict mapped out in Oppenheimer was over nuclear weapons, the clash of visions, which ended up burying Oppenheimer and elevating Teller, also maps onto the great debate over global warming: Should we reach international agreements to cooperatively reduce carbon emissions, or should we throw our — and specifically America’s — great resources into a headlong rush of technological development? Should we massively overhaul our energy system or make the sun a little less bright?
Oppenheimer’s dream of international cooperation to prevent a nuclear arms race was born even before the Manhattan Project culminated with the Trinity test. Oppenheimer and Danish physicist Niels Bohr “believed that an agreement between the wartime allies based upon the sharing of information, including the existence of the Manhattan Project, could prevent the surfacing of a nuclear-armed world,” writes Marco Borghi in a Wilson Center working paper.
Oppenheimer even suggested that the Soviets be informed of the Manhattan Project’s efforts and, according to Martin Sherwin and Kai Bird’s American Prometheus, had “assumed that such forthright discussions were taking place at that very moment” at the conference in Potsdam where, Oppenheimer “was later appalled to learn” that Harry Truman had only vaguely mentioned the bomb to Joseph Stalin, scotching the first opportunity for international nuclear cooperation.
Oppenheimer continued to take up the cause of international cooperation, working as the lead advisor to Dean Acheson and David Lilienthal on their 1946 nuclear control proposal, which was never accepted by the United Nations (and, crucially, the Soviet Union) after it was amended by Truman’s appointed U.N. representative, Bernard Baruch, to be more favorable to the United States.
In view of the next 50 years of nuclear history — further proliferation, the development of thermonuclear weapons that could be mounted on missiles that were likely impossible to shoot down — the proposals Oppenheimer developed seem utopian: The U.N. would "bring under its complete control world supplies of uranium and thorium," including all mining, and would control all nuclear reactors. This scheme would also make the construction of new weapons impossible, lest other nations build their own.
By the end of 1946, the Baruch proposal had died, and with it any prospect of international control of nuclear power, even as the Soviets worked intensely to break America’s nuclear monopoly — with the help of information ferried out of Los Alamos — successfully testing a weapon before the end of the decade.
With the failure of international arms control and the beginning of the arms race, Oppenheimer’s vision of a post-Trinity world lay in shambles. For Teller, however, it was a great opportunity.
While Oppenheimer planned to stave off nuclear annihilation through international cooperation, Teller was trying to build a bigger deterrent.
Since the early stages of the Manhattan Project, Teller had been dreaming of a fusion weapon many times more powerful than the first atomic bombs, what was then called the “Super.” When the atomic bomb was completed, he would again push for the creation of a thermonuclear bomb, but the efforts stalled thanks to technical and theoretical issues with Teller’s proposed design.
Nolan captures Teller’s early comprehension of just how powerful nuclear weapons can be. In a scene that’s pulled straight from accounts of the Trinity blast, most of the scientists who view the test are either in bunkers wearing welding goggles or following instructions to lie down, facing away from the blast. Not so for Teller. He lathers sunscreen on his face, straps on a pair of dark goggles, and views the explosion straight on, even pursing his lips as the explosion lights up the desert night brighter than the sun.
And it was that power — the sun’s — that Teller wanted to harness in pursuit of his “Super,” where a bomb’s power would be derived from fusing together hydrogen atoms, creating helium — and a great deal of energy. It would even use a fission bomb to help ignite the process.
Oppenheimer and several scientific luminaries, including Manhattan Project scientists Enrico Fermi and Isidor Rabi, opposed the bomb. In their official 1949 report advising the Atomic Energy Commission, they argued that the hydrogen bomb was infeasible, strategically useless, and potentially a weapon of “genocide.”
But by 1950, thanks in part to Teller and the advocacy of Lewis Strauss, a financier turned government official and the approximate villain of Nolan’s film, Harry Truman would sign off on a hydrogen bomb project, resulting in the 1952 “Ivy Mike” test where a bomb using a design from Teller and mathematician Stan Ulam would vaporize the Pacific Island Elugelab with a blast about 700 times more powerful than the one that destroyed Hiroshima.
The success of the project re-ignited doubts around Oppenheimer’s well-known left-wing political associations in the years before the war and, thanks to scheming by Strauss, he was denied a renewed security clearance.
While several Manhattan Project scientists testified on his behalf, Teller did not, saying, “I thoroughly disagreed with him in numerous issues and his actions frankly appeared to me confused and complicated.”
It was the end of Oppenheimer’s public career. The New Deal Democrat had been eclipsed by Teller, who would become the scientific avatar of the Reagan Republicans.
For the next few decades, Teller stayed close to politicians, the military, and the media, exercising a great deal of influence over arms policy from the Lawrence Livermore National Laboratory, which he helped found, and his academic perch at the University of California.
He pooh-poohed the dangers of radiation, supported the building of more and bigger bombs that could be delivered by longer and longer range missiles, and opposed prohibitions on testing. When Dwight Eisenhower was considering a negotiated nuclear test ban, Teller faced off against future Nobel laureate and Manhattan Project alumnus Hans Bethe over whether nuclear tests could be hidden from detection by conducting them underground in a massive hole; the eventual 1963 test ban treaty would exempt underground testing.
As the Cold War settled into a nuclear standoff with both the United States and the Soviet Union possessing enough missiles and nuclear weapons to wipe out the other, Teller didn’t look to treaties, limitations, and cooperation to solve the problem of nuclear brinksmanship, but instead to space: He wanted to neutralize the threat of a Soviet first strike using x-ray lasers from space powered by nuclear explosions (he was again opposed by Bethe and the x-ray lasers never came to fruition).
He also notoriously dreamed up Project Plowshare, the civilian nuclear program that came close to blasting out a new harbor in northern Alaska and actually did attempt to extract gas in New Mexico and Colorado using nuclear explosions.
Yet, in perhaps the strangest turn of all, Teller also became something of a key figure in the history of climate change research, both in his relatively early awareness of the problem and the conceptual gigantism he brought to proposing to solve it.
Though publicly skeptical of climate change later in his life, Teller was thinking about the problem decades before James Hansen’s seminal 1988 congressional testimony.
The researcher and climate litigator Benjamin Franta made the startling archival discovery that Teller had given a speech at an oil industry event in 1959 where he warned that “energy resources will run short as we use more and more of the fossil fuels,” and, after explaining the greenhouse effect, said that “it has been calculated that a temperature rise corresponding to a 10 percent increase in carbon dioxide will be sufficient to melt the icecap and submerge New York … I think that this chemical contamination is more serious than most people tend to believe.”
Teller was also engaged with issues around energy and other “peaceful” uses of nuclear power. In response to concerns about the dangers of nuclear reactors, he began advocating in the 1960s that they be built underground, and by the early 1990s he proposed running those underground reactors automatically to avoid the human error he blamed for the disasters at Chernobyl and Three Mile Island.
While Teller was always happy to find collaborators and toss off an ingenious-if-extreme solution to a problem, there is a strain of “Tellerism,” both institutional and conceptual, that persists to this day in climate science and energy policy.
Nuclear science and climate science had long been intertwined, Stanford historian Paul Edwards writes, including that the “earliest global climate models relied on numerical methods very similar to those developed by nuclear weapons designers for solving the fluid dynamics equations needed to analyze shock waves produced in nuclear explosions.”
Where Teller comes in is in the role that Lawrence Livermore played in both its energy research and climate modeling. “With the Cold War over and research on nuclear weapons in decline, the national laboratories faced a quandary: What would justify their continued existence?” Edwards writes. The answer in many cases would be climate change, due to these labs’ ample collection of computing power, “expertise in numerical modeling of fluid dynamics, and their skills in managing very large data sets.”
One of those labs was Livermore, the institution founded by Teller, a leading center of climate and energy modeling and research since the late 1980s. “[Teller] was very enthusiastic about weather control,” early climate modeler Cecil “Chuck” Leith told Edwards in an oral history.
The Department of Energy writ large, which inherited much of the responsibilities of the Atomic Energy Commission, is now one of the lead agencies on climate change policy and energy research.
Which brings us to fusion.
It was Teller’s Lawrence Livermore National Laboratory that earlier this year successfully got more power out of a controlled fusion reaction than it put in — and it was Energy Secretary Jennifer Granholm who announced it, calling it the “holy grail” of clean energy development.
Teller’s journey with fusion mirrors the technology’s broader history: early cautious optimism followed by the realization that it would likely not be achieved soon. As early as 1958, he said in a speech that he had been discussing “controlled fusion” at Los Alamos and that “thermonuclear energy generation is possible,” although he admitted that “the problem is not quite easy”; by 1987 he had given up on seeing it realized during his lifetime.
Still, what controlled fusion we do have at Livermore’s National Ignition Facility owes something to Teller and the technology he pioneered in the hydrogen bomb, according to physicist NJ Fisch.
While fusion is one infamous technological fix for the problem of clean and cheap energy production, Teller and the Livermore cadres were also a major influence on the development of solar geoengineering, the idea that global warming could be averted not by reducing the emissions of greenhouse gas into the atmosphere, but by making the sun less intense.
In a mildly trolling column for the Wall Street Journal in January 1998, Teller professed agnosticism on climate change (despite giving that speech to oil executives three decades prior) but proposed an alternative policy that would be “far less burdensome than even a system of market-allocated emissions permits”: solar geoengineering with “fine particles.”
The op-ed, placed in the conservative pages of the Wall Street Journal, was almost certainly an effort to oppose the recently signed Kyoto Protocol, but the ideas have persisted among thinkers and scientists whose engagement with environmental issues goes far beyond Teller’s own hostility toward Al Gore and, by extension, the environmental movement as a whole.
But his proposal would be familiar to the climate debates of today: particle emissions that would scatter sunlight and thus lower atmospheric temperatures. If climate change had to be addressed, Teller argued, “let us play to our uniquely American strengths in innovation and technology to offset any global warming by the least costly means possible.”
A paper he wrote with two colleagues that was an early call for spraying sulfates in the stratosphere also proposed “deploying electrically-conducting sheeting, either in the stratosphere or in low Earth orbit.” These were “literally diaphanous scattering screens” that could scatter enough sunlight to reduce global warming; one calculation Teller made concluded that 46 million square miles of these screens, or about 1 percent of the surface area of the Earth, would be necessary.
The climate scientist and Livermore alumnus Ken Caldeira has attributed his own initial interest in solar geoengineering to Lowell Wood, a Livermore researcher and Teller protégé. While often seen as a centrist or even right-wing idea meant to avoid more restrictive policies on carbon emissions, solar geoengineering has sparked some interest on the left, including in socialist science fiction author Kim Stanley Robinson’s The Ministry for the Future, which envisions India unilaterally pumping sulfates into the atmosphere in response to a devastating heat wave.
The White House even quietly released a congressionally-mandated report on solar geoengineering earlier this spring, outlining avenues for further research.
While the more than 30 years since the creation of the Intergovernmental Panel on Climate Change and the beginnings of the Kyoto Protocol have emphasized international cooperation on both science and policymaking through agreed-upon goals for emissions reductions, the technological temptation is always present.
And here we can perhaps see the split between the moralizing scientists, with their pleas to address the problems of the arms race through scientific openness and international cooperation, and the hawkish technicians, who wanted to press the United States’ technical advantage to win the nuclear standoff, and ultimately the Cold War, through deterrence.
With the IPCC and the United Nations Climate Conference, through which emerged the Kyoto Protocol and the Paris Agreement, we see a version of what the postwar scientists wanted applied to the problem of climate change. Nations come together and agree on targets for controlling something that may benefit any one of them but risks global calamity. The process is informed by scientists working with substantial resources across national borders who play a major role in formulating and verifying the policy mechanisms used to achieve these goals.
But for almost as long as climate change has been an issue of international concern, the Tellerian path has been tempting. While Teller’s dreams of massive sun-scattering sheets, nuclear earth engineering, and automated underground reactors are unlikely to be realized soon, if at all, you can be sure there are scientists and engineers looking straight into the light. And they may one day drag us into it, whether we want to or not.
Editor’s note: An earlier version of this article misstated the name of a climate modeler. It’s been corrected. We regret the error.
Republicans have blamed Democrats for unleashing Russ Vought on federal spending. But it doesn’t take much to see a bigger plan at work.
Russ Vought, the director of the Office of Management and Budget, has been waiting for this moment his whole adult life — or that’s what President Trump and the Republican Congressional leadership would like you to believe. As they put it, Vought is a fanatical budget cutter who, once unleashed, cannot be controlled. Who knows what he’ll cut if the Democrats continue to keep the government shut down?
Substantial staffing cuts that go beyond the typical shutdown furloughs are “the risk of shutting down the government and handing the keys to Russ Vought,” Senate Majority Leader John Thune told Politico on Thursday. “We don’t control what he’s going to do.”
House Speaker Mike Johnson told reporters Thursday morning that Democrats “have now, effectively, turned off the legislative branch,” and have “turned it over to the executive.”
“I have a meeting today with Russ Vought, he of PROJECT 2025 Fame, to determine which of the many Democrat Agencies, most of which are a political SCAM, he recommends to be cut, and whether or not those cuts will be temporary or permanent,” Trump wrote Thursday on Truth Social. “I can’t believe the Radical Left Democrats gave me this unprecedented opportunity.”
In short, any cuts — even ones some Republicans might find distasteful — are the Democrats’ fault, according to Republican leadership.
This is not the first time we’ve seen an eager budget cutter ascend to power in this administration. Let’s take a moment to flash back to the very first days and months of Trump’s second presidency, when young staffers from Elon Musk’s Department of Government Efficiency were marching into government offices, demanding data and deleting programs.
Though he operated at the time with the full support of the president and spurred on by the enthusiasm of his supporters, Musk quickly ran into conflict with the people actually running the departments he had essentially appointed himself to oversee.
Musk and Treasury Secretary Scott Bessent got into “a heated shouting match in earshot of President Trump and other officials in the White House,” according to Axios, over leadership of the IRS. Musk and Secretary of State Marco Rubio got into an argument in front of Trump, The New York Times reported, when Musk accused Rubio of not firing enough people. Transportation Secretary Sean Duffy has gone public with his own account of a dispute with Musk over who had the authority to make staffing decisions in the Transportation Department, during which Duffy insisted that “we are not going to fire air traffic controllers,” he told the New York Post in August.
Musk also stirred up conflict with Vought himself. The Times reported that the OMB director “could barely contain his frustration” when Musk’s team exceeded his own plans for federal staffing cuts.
Bessent, Rubio, Duffy, and Vought are all still around. Musk is not. The cabinet secretaries and congressional leadership wrested back their prerogatives over federal spending and staffing, and some staffers that were let go have been hired back.
But the shutdown threatens to introduce a volatile new dynamic, in which another aggressive budget cutter in the highest echelons of the government — in this case, Vought — gets the upper hand without the intra-party blowback.
That’s because unlike Musk, the space entrepreneur and car manufacturer who had only recently become a Republican, Vought is a career conservative, whose command of the levers of power has been honed over years of experience in government. This may be Vought’s moment to make permanent changes in the size and structure of the federal government — or at least credibly threaten to do so — with particular attention to programs he views as a “cartel” between Congress and the federal bureaucracy, as well as spending programs that tend to advance progressive ends, including mitigating or preventing climate change.
Vought has been teeing up dramatic budget cuts and aggressive defunding maneuvers since the first Trump administration — it was his move to delay aid to Ukraine that resulted in Trump’s first impeachment. He then spent his four years in exile from power at a think tank he founded, expanding on his vision of a budgetary process more controlled by the executive branch.
But as my colleague Robinson Meyer wrote back in January, during the first Trump administration Vought regularly drew up budgets featuring dramatic cuts, only for Republicans in Congress to undo them and let spending continue in a bipartisan manner.
This time, Trump has gotten Voughtier, and Republicans in Congress have gotten more compliant. Vought has already said he wants to take the normally bipartisan appropriations process and turn it into a partisan one, in part by letting the president control spending that’s authorized by Congress. Though the president and Republican leadership in Congress might want the public to see a budget director run amok, it’s clear that all of the above relish the prospect of Vought as a kind of wildcard, unleashed with a red pen on the federal budget.
Echoes of Vought’s ideology have made their way into policymaking across branches of government. The White House has already struck some foreign aid programs authorized by Congress, and the Supreme Court recently allowed those cuts to stand. Republicans in Congress passed a rescissions package that cut previously appropriated funding for public broadcasting and other foreign aid. Vought also effectively shuttered the Consumer Financial Protection Bureau, a formerly independent agency, while cuts to the Department of Education have left it a shell of itself.
The cuts Vought has announced so far during the shutdown, including funding for a bunch of clean energy and sustainability projects largely in blue states and transit projects in New York, New Jersey, and Illinois, aren’t entirely shutdown-related. It doesn’t take a tremendous leap to arrive at the idea that they might have been planned all along and timed to punish Democrats.
At least some of the cuts seem to be intended to be permanent and would not revert when the shutdown inevitably ends. Secretary of Energy Chris Wright told CNN on Thursday that the grant cancellation decisions were made by the Department of Energy, and that “projects will not be restored” once the government is funded again.
The full extent of the cuts Vought will attempt to make remains unclear, as does how the judicial process will ultimately handle them. But the prospect of further major cuts — especially in contrast to the Republican offer of a continuing resolution to resolve the spending standoff — has raised eyebrows among at least a few congressional Republicans.
Kevin Cramer, a Republican senator from North Dakota, told Semafor that Vought is “less politically in tune than the president,” and that by using the shutdown to pursue large cuts, Republicans risk ceding the “moral high ground” in the shutdown fight. Susan Collins, the Maine moderate who chairs the Appropriations Committee, has also criticized some legally aggressive cuts.
But most in the majority, especially in leadership, have expressed no problem with Vought’s prospective cuts, or see them purely as the Democrats’ responsibility for failing to vote yes on the continuing resolution. That could mean the cuts, if they come, will prove more enduring than Musk’s more slapdash efforts.
The shutdown could cement a shift in the balance of power toward Vought and away from figures in the administration or Congress who are more cautious about the slash-and-burn approach. That shift may overwhelm any sense of caution from Cabinet secretaries or congressional leaders defending their turf; they’re all still Republicans at the end of the day.
A review of Heatmap Pro data reveals a troubling new trend in data center development.
Data centers are being built in places that restrict renewable energy. There are significant implications for our future energy grid – but it’s unclear if this behavior will lead to tech companies eschewing renewables or finding novel ways to still meet their clean energy commitments.
In the previous edition of The Fight, I began chronicling the data center boom and the nascent backlash to it by talking about Google and what would’ve been its second data center in southern Indianapolis, had the city not rejected it last Monday. As I learned about Google’s practices in Indiana, I focused on the company’s first project, a $2 billion facility in Fort Wayne, because it is being built in a county where officials have instituted a cumbersome, restrictive ordinance on large-scale solar energy. The county commission recently voted to make the ordinance more restrictive, unanimously agreeing to institute a 1,000-foot setback that takes effect in early November, pending final approval from the county’s planning commission.
As it turns out, the Fort Wayne data center is not an exception: Approximately 44% of all data centers proposed in Indiana are in counties that have restricted or banned new renewable energy projects. This is according to a review of Heatmap Pro data in which we cross-referenced the county bans and ordinances we track against a list of proposed data centers prepared by an Indiana energy advocacy group, Citizens Action Coalition of Indiana.
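The cross-referencing described above amounts to a simple set intersection between two lists. Here is a minimal sketch of the idea, with entirely made-up county names and project entries standing in for the Heatmap Pro and Citizens Action Coalition data:

```python
# Hypothetical illustration of cross-referencing proposed data centers against
# county-level renewables restrictions; all names and counts here are invented.
restricted_counties = {"Allen", "Boone"}  # counties with bans or restrictive ordinances

proposed_data_centers = [
    {"name": "Fort Wayne campus", "county": "Allen"},
    {"name": "Lebanon project", "county": "Boone"},
    {"name": "Indy South site", "county": "Marion"},
    {"name": "Lafayette site", "county": "Tippecanoe"},
]

# Keep only the proposals sited in counties that restrict renewables
in_restricted = [dc for dc in proposed_data_centers
                 if dc["county"] in restricted_counties]
share = len(in_restricted) / len(proposed_data_centers)
print(f"{share:.0%} of proposed data centers are in restricted counties")
```

Run on the real county and project lists, this kind of intersection is what produces the roughly 44% figure cited above.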
This doesn’t necessarily mean the power going to these data centers is consistently fossil-fueled. Data centers can take years to construct and often rely on power fed to them from a distributed regional energy grid. But it does mean it would be exceptionally costly for any of these projects to build renewable generation on site, as a rising number of projects choose to do — not to mention that on a macro level, data centers may increasingly run up against the same cultural dynamics that are leading to solar and wind project denials. (See: this local news article about the Fort Wayne data center campus.)
Chrissy Moy, a Google spokesperson, told me the Fort Wayne facility will get its power off of the PJM grid, and sent me links to solar projects and hydroelectric facilities in other states on the PJM it has power purchase agreements with. I’d note the company claims it “already matches” all of its global annual electricity demand with “renewable energy purchases.” What this means is that if Google can’t generate renewable energy for a data center directly, it will try to procure renewable energy at the same time from the same grid, even if it can’t literally use that clean power at that data center. And if that's not possible, it will search farther afield or at different times. (Google is one of the more aggressive big tech companies in this regard, as my colleague Emily Pontecorvo details.) Google has also boasted that it will provide an undisclosed amount of excess clean electricity through rights transfers to Indiana Michigan Power when the tech company’s load is low and demand on the broader grid is peaking, as part of Google’s broader commitment to grid flexibility.
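The gap between matching electricity use on an annual basis and matching it hour by hour can be made concrete with a toy calculation. This sketch uses invented hourly load and procurement figures, not Google’s actual data or methodology:

```python
# Toy comparison of annual vs. hourly clean-energy matching (all numbers invented).
load = [100, 120, 150, 90]    # data center demand over four sample hours, MWh
clean = [180, 40, 60, 200]    # contracted renewable generation in the same hours, MWh

# Annual matching: total procurement vs. total demand, capped at 100%
annual_match = min(sum(clean) / sum(load), 1.0)

# Hourly matching: clean energy only counts in the hour it is generated
hourly_match = sum(min(l, c) for l, c in zip(load, clean)) / sum(load)

print(f"annual: {annual_match:.0%}, hourly: {hourly_match:.0%}")
```

In this example the annual totals match 100%, yet only about 63% of consumption is covered hour by hour, which is why a stricter around-the-clock standard, like the 24/7 carbon-free energy goal Google has publicized, is a meaningfully higher bar than annual matching.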
I reached out to Tom Wilson, an energy systems technical executive at the Electric Power Research Institute, an industry-focused organization that studies modern power and works with tech companies on flexible data center energy use, including Google. Wilson told me that in Indiana, many of the siting decisions for data centers were made before counties enacted moratoria against renewable energy and that tech companies may not always be knowingly siting projects in places where significant solar or wind generation would be impractical or even impossible. (We would just note that Fort Wayne, Indiana, has an opposition risk score of 84 in Heatmap Pro, meaning it would have been a very risky place to build a renewable energy project even without that restrictive ordinance.) It also indicates some areas may be laying down renewables restrictions after seeing data center development, which is in line with a potential land use techlash.
Wilson told me that two thirds of data centers rely on power from the existing energy grid whereas surveys indicate about a third choose to have at least some electricity generation on site. In at least the latter case, land use constraints and permitting problems really can be a hurdle for building renewable energy close to where data is processed. This is a problem exacerbated when centers are developed near population centers, which Wilson said is frequently the case because companies want to reduce “latency” for customers. In other words, they want to “reduce the time it takes to get answers to people” via artificial intelligence or other data products.
“The primary challenges are the size of the data center and the amount of space it takes to build renewables,” he said. “They are moving from 20 megawatt or 40 megawatt data centers to 100, 200, 300 megawatt data centers. It’s really hard to locate that much renewable [energy] right near a population center. So that requires transmission, and unfortunately right now in the U.S. and in many other countries, transmission takes a significant amount of time to build.”
The majority of data centers are served by regional power grids, Wilson told me. Companies like Google, Meta, and others continue to invest in renewable energy procurement while building facilities in areas that have restricted new solar or wind power infrastructure. In some cases, companies may feel they’re forced to seek these places out because the land is just plain cheap and has existing fiber optic cable networks.
At the same time, some large data centers do get energy generated on site, and their approaches to sourcing that energy vary, sometimes inconsistently even within a single company.
For instance, Meta’s new Prometheus supercluster complex in New Albany, Ohio — potentially the world’s first 1 gigawatt data center — will reportedly have a significant amount of new gas power generation constructed at the facility, even though the company also struck a deal with Invenergy over the summer to procure at least 400 megawatts of solar from two projects in Ohio that already have their permits. One, in Clinton County, was fully permitted only after a years-long fight before the Ohio Power Siting Board that drew conservative media backlash. The other, in Franklin County, got its permits in 2021, before the recent wave of opposition to solar projects. Prometheus itself will be sited on the Licking County side of New Albany, where solar has been extremely difficult to build, even though most of this Columbus suburb sits in solar-friendly Franklin County.
Meanwhile, Elon Musk’s xAI data center notoriously relies on a polluting gas plant in Memphis, Tennessee. Surrounding Shelby County until mere months ago had a solar moratorium that some residents want to bring back. An affiliate company xAI uses for the project’s real estate is subleasing land near the data center for a solar farm, but it remains unclear whether it will power the data center.
In the end, it really does seem like data centers are being sited in places with renewable energy restrictions. What the data center developers plan to do about it — if anything — is still an open question.
Current conditions: After walloping Bermuda with winds of up to 100 miles per hour, Hurricane Imelda is veering northeast away from the United States • While downgraded from a hurricane, Humberto is set to soak Ireland and the United Kingdom as Storm Amy in the coming days and bring winds of up to 90 miles per hour • Typhoon Matmo is strengthening as it hits the Philippines and barrels toward China.
The Department of Energy is canceling two regional hydrogen hubs in California and the Pacific Northwest as part of a broader rescinding of 321 grants worth $7.5 billion for projects nationwide. Going after the hydrogen hubs, which the oil and gas industry lobbied to keep open after President Donald Trump came back to office, “leaves the agency’s intentions for the remaining five hubs scattered throughout the Midwest, Midatlantic, Appalachia, the Great Plains, and Texas unclear,” Heatmap’s Emily Pontecorvo wrote yesterday.
The list of canceled projects that Emily got her hands on “does seem to confirm that blue state grants were the hardest hit,” she wrote. But, she found, “many would actually have benefitted Republican strongholds,” including a $20 million grant for a manufacturing plant in Texas that was slated to create 200 jobs.
Tesla’s global deliveries rose 7% in the third quarter, hitting a new record as Americans rushed to buy electric vehicles before the federal tax credit expired on September 30. The automaker delivered 497,099 vehicles in the three months leading up to that date, up from 462,890 in the same period last year, according to the Financial Times. That was well above analyst forecasts of 444,000.
That may do little to turn around the headwinds blasting the EV giant. While the company benefited from buyers scrambling to tap the federal EV tax credit, Tesla sank to its lowest-ever share of the electric vehicle market in August as drivers flocked to offerings from other automakers. It’s not just a problem in the U.S. As Heatmap’s Matthew Zeitlin wrote last month, “Thanks to CEO Elon Musk’s association with right wing politics in the U.S. and abroad, and to fierce competition from Chinese EV leader BYD, Tesla’s sales have fallen dramatically in Europe. Globally, BYD overtook Tesla in sales last year.”
Get Heatmap AM directly in your inbox every morning:
Conservative leader Kemi Badenoch. Dan Kitwood/Getty Images
Kemi Badenoch, the leader of the British Conservatives, has vowed to repeal the United Kingdom’s landmark climate law if her party, colloquially known as the Tories, wins the next election. Eliminating the Climate Change Act, passed almost unanimously under a Labour government in 2008, would dismantle controls on greenhouse gas emissions and remove what The Guardian described as “the cornerstone of green and energy policy for successive governments” for the past 17 years.
The move rankled past Tory leaders. Former Prime Minister Theresa May condemned the campaign pledge as a “catastrophic mistake.” Calling it a “retrograde” step, she said that “while consensus is being tested, the science remains the same.” Alok Sharma, the former Tory minister who led the COP26 climate summit in Glasgow in 2021, told The Guardian in a separate article that a repeal risked “many tens of billions of pounds of private sector investment and accompanying jobs.”
Sea ice in Antarctica reached its third-smallest winter peak extent since satellite records began 47 years ago, according to a new analysis by Carbon Brief. Provisional data from the U.S. National Snow and Ice Data Center showed Antarctic sea ice reaching a winter maximum of just under 6.9 million square miles as of September 17. That’s nearly 350,000 square miles below the average between 1981 and 2010, the historical baseline against which recent changes in sea ice extent are compared. The “lengthening trend of lower Antarctic sea ice poses real concerns regarding stability and melting of the ice sheet,” one expert told the publication.
The finding comes as a “groundbreaking” study published Thursday in the European Geosciences Union journal Earth System Dynamics found that Antarctic sea ice has emerged as a key predictor of accelerated ocean warming. Using Earth system models and satellite images from 1980 to 2020, the researchers found that higher sea ice extent enhances cloud cover, which has an overall cooling effect by reducing incoming solar radiation. As a result, ongoing sea ice loss is linked to larger reductions in clouds, stronger surface warming, and even more ocean heat uptake, accelerating the cycle.
Duke Energy plans to meet surging demand for electricity by increasing its natural gas and battery capacity, keeping coal plants open for up to four years longer than previously estimated, and evaluating new sites for nuclear reactors. The 100-page biennial proposal published this week dials back plans for more renewables such as wind and solar. It also pushes back the earliest start date for a new reactor to 2037, declines to commit to either small modular reactors or large traditional units, and says the utility still needs extra protections against cost overruns before embarking on construction.
In the meantime, the added years of coal burning “will result in millions of tons in additional greenhouse gases over the next decade when combined with other proposed changes to the utility’s fuel mix,” Inside Climate News reported. In a statement to Axios, North Carolina Governor Josh Stein, a Democrat, called on the state’s utilities commission to “require significant changes” and condemned Duke for “retreating from the state’s clean energy future.”
New research by a team of scientists from the U.K. and New Zealand has found that new analytical methods could bolster conservation breeding programs by offering a better understanding of why eggs fail to hatch. Using fluorescent dyes, the researchers discovered that nearly 66% of the 174 unhatched eggs examined in the study had been fertilized, whereas previous methods suggested that only 5.2% had been. “There are many different factors that contribute to breeding success,” Gary Ward, a co-author from the London-based ZSL Institute of Zoology, said in a statement, “and the more understanding we can have into why an egg might not hatch, the more we can refine our care for these birds and the better chance of recovery we can give them.”