Is international cooperation or technological development the answer to an apocalyptic threat?
Christopher Nolan’s film Oppenheimer is about the great military contest of the Second World War, but only in the background. It’s really about a clash of visions for a postwar world defined by the physicist J. Robert Oppenheimer’s work at Los Alamos and beyond. The great power unleashed by the bombs at Hiroshima and Nagasaki could be dwarfed by what knowledge of nuclear physics could produce in the coming years, risking a war more horrifying than the one that had just concluded.
Oppenheimer, and many of his fellow atomic scientists, would spend much of the postwar period arguing for international cooperation, scientific openness, and nuclear restriction. But there was another cadre of scientists, exemplified by a former colleague turned rival, Edward Teller, who sought to answer the threat of nuclear annihilation with new technology — including even bigger bombs.
As the urgency of the nuclear question faded with the end of the Cold War, the scientific community took up a new threat to global civilization: climate change. The conflict dramatized in Oppenheimer was over nuclear weapons, but the clash of visions that ended up burying Oppenheimer and elevating Teller also maps onto the great debate over global warming: Should we reach international agreements to cooperatively reduce carbon emissions, or should we throw our — and specifically America’s — great resources into a headlong rush of technological development? Should we massively overhaul our energy system or make the sun a little less bright?
Oppenheimer’s dream of international cooperation to prevent a nuclear arms race was born even before the Manhattan Project culminated in the Trinity test. Oppenheimer and the Danish physicist Niels Bohr “believed that an agreement between the wartime allies based upon the sharing of information, including the existence of the Manhattan Project, could prevent the surfacing of a nuclear-armed world,” writes Marco Borghi in a Wilson Center working paper.
Oppenheimer even suggested that the Soviets be informed of the Manhattan Project’s efforts and, according to Martin Sherwin and Kai Bird’s American Prometheus, had “assumed that such forthright discussions were taking place at that very moment” at the conference in Potsdam — where, Oppenheimer “was later appalled to learn,” Harry Truman had only vaguely mentioned the bomb to Joseph Stalin, scotching the first opportunity for international nuclear cooperation.
Oppenheimer continued to press the cause of international cooperation, working as the lead advisor to Dean Acheson and David Lilienthal on their 1946 nuclear control proposal — a plan that was never accepted by the United Nations, and above all the Soviet Union, after Truman’s appointed U.N. representative, Bernard Baruch, amended it to be more favorable to the United States.
In view of the next 50 years of nuclear history — further proliferation, the development of thermonuclear weapons that could be mounted on missiles that were likely impossible to shoot down — the proposals Oppenheimer developed seem utopian: The U.N. would “bring under its complete control world supplies of uranium and thorium,” including all mining, and would control all nuclear reactors. The scheme would also have made the construction of new weapons impossible, lest other nations build their own.
By the end of 1946, the Baruch proposal had died, and with it any prospect of international control of nuclear power. The Soviets, meanwhile, were working intensely to break America’s nuclear monopoly — with the help of information ferried out of Los Alamos — and successfully tested a weapon before the end of the decade.
With the failure of international arms control and the beginning of the arms race, Oppenheimer’s vision of a post-Trinity world lay in shambles. For Teller, however, it was a great opportunity.
While Oppenheimer planned to stave off nuclear annihilation through international cooperation, Teller was trying to build a bigger deterrent.
Since the early stages of the Manhattan Project, Teller had been dreaming of a fusion weapon many times more powerful than the first atomic bombs, what was then called the “Super.” When the atomic bomb was completed, he would again push for the creation of a thermonuclear bomb, but the efforts stalled thanks to technical and theoretical issues with Teller’s proposed design.
Nolan captures Teller’s early comprehension of just how powerful nuclear weapons can be. In a scene that’s pulled straight from accounts of the Trinity blast, most of the scientists who view the test are either in bunkers wearing welding goggles or following instructions to lie down, facing away from the blast. Not so for Teller. He lathers sunscreen on his face, straps on a pair of dark goggles, and views the explosion straight on, even pursing his lips as the explosion lights up the desert night brighter than the sun.
And it was that power — the sun’s — that Teller wanted to harness in pursuit of his “Super,” where a bomb’s power would be derived from fusing together hydrogen atoms, creating helium — and a great deal of energy. It would even use a fission bomb to help ignite the process.
Oppenheimer and several scientific luminaries, including Manhattan Project scientists Enrico Fermi and Isidor Rabi, opposed the bomb. In their official 1949 report advising the Atomic Energy Commission, they argued that the hydrogen bomb was infeasible, strategically useless, and potentially a weapon of “genocide.”
But by 1950, thanks in part to Teller and the advocacy of Lewis Strauss — a financier turned government official and the approximate villain of Nolan’s film — Harry Truman signed off on a hydrogen bomb project. The result was the 1952 “Ivy Mike” test, in which a bomb based on a design by Teller and the mathematician Stan Ulam vaporized the Pacific island of Elugelab with a blast roughly 700 times more powerful than the one that destroyed Hiroshima.
The success of the project reignited doubts about Oppenheimer’s well-known left-wing political associations in the years before the war, and, thanks to scheming by Strauss, he was denied a renewed security clearance.
While several Manhattan Project scientists testified on his behalf, Teller did not, saying, “I thoroughly disagreed with him in numerous issues and his actions frankly appeared to me confused and complicated.”
It was the end of Oppenheimer’s public career. The New Deal Democrat had been eclipsed by Teller, who would become the scientific avatar of the Reagan Republicans.
For the next several decades, Teller stayed close to politicians, the military, and the media, exercising a great deal of influence over arms policy from the Lawrence Livermore National Laboratory, which he helped found, and his academic perch at the University of California.
He pooh-poohed the dangers of radiation, supported the building of more and bigger bombs that could be delivered by longer and longer range missiles, and opposed prohibitions on testing. When Dwight Eisenhower was considering a negotiated nuclear test ban, Teller faced off against future Nobel laureate and Manhattan Project alumnus Hans Bethe over whether nuclear tests could be hidden from detection by conducting them underground in a massive hole; the eventual 1963 test ban treaty would exempt underground testing.
As the Cold War settled into a standoff, with both the United States and the Soviet Union possessing enough missiles and nuclear weapons to wipe out the other, Teller didn’t look to treaties, limitations, and cooperation to solve the problem of nuclear brinkmanship, but instead to space: He wanted to neutralize the threat of a Soviet first strike with space-based X-ray lasers powered by nuclear explosions. (He was again opposed by Bethe, and the X-ray lasers never came to fruition.)
He also notoriously dreamed up Project Plowshare, the civilian nuclear program that came close to blasting out a new harbor in northern Alaska and actually did attempt to extract gas in New Mexico and Colorado using nuclear explosions.
Yet, in perhaps the strangest turn of all, Teller also became something of a key figure in the history of climate change research, both in his relatively early awareness of the problem and the conceptual gigantism he brought to proposing to solve it.
Though publicly skeptical of climate change later in his life, Teller was thinking about the problem decades before James Hansen’s seminal 1988 congressional testimony.
The researcher and climate litigator Benjamin Franta made the startling archival discovery that Teller gave a speech at an oil industry event in 1959 in which he warned that “energy resources will run short as we use more and more of the fossil fuels,” and, after explaining the greenhouse effect, said that “it has been calculated that a temperature rise corresponding to a 10 percent increase in carbon dioxide will be sufficient to melt the icecap and submerge New York … I think that this chemical contamination is more serious than most people tend to believe.”
Teller was also engaged with energy and other “peaceful” uses of nuclear power. In response to concerns about the dangers of nuclear reactors, he began advocating in the 1960s that they be built underground, and by the early 1990s he was proposing that such underground reactors run automatically, to avoid the human error he blamed for the disasters at Chernobyl and Three Mile Island.
While Teller was always happy to find collaborators and toss off an ingenious-if-extreme solution to a problem, there is a strain of “Tellerism,” both institutional and conceptual, that persists to this day in climate science and energy policy.
Nuclear science and climate science have long been intertwined, Stanford historian Paul Edwards writes: The “earliest global climate models relied on numerical methods very similar to those developed by nuclear weapons designers for solving the fluid dynamics equations needed to analyze shock waves produced in nuclear explosions.”
Where Teller comes in is in the role that Lawrence Livermore played in both its energy research and climate modeling. “With the Cold War over and research on nuclear weapons in decline, the national laboratories faced a quandary: What would justify their continued existence?” Edwards writes. The answer in many cases would be climate change, due to these labs’ ample collection of computing power, “expertise in numerical modeling of fluid dynamics, and their skills in managing very large data sets.”
One of those labs was Livermore, the institution Teller helped found, which has been a leading center of climate and energy modeling and research since the late 1980s. “[Teller] was very enthusiastic about weather control,” early climate modeler Cecil “Chuck” Leith told Edwards in an oral history.
The Department of Energy writ large, which inherited much of the responsibilities of the Atomic Energy Commission, is now one of the lead agencies on climate change policy and energy research.
Which brings us to fusion.
It was Teller’s Lawrence Livermore National Laboratory that earlier this year successfully got more power out of a controlled fusion reaction than it put in — and it was Energy Secretary Jennifer Granholm who announced it, calling it the “holy grail” of clean energy development.
Teller’s own journey with fusion tracks the technology’s history: early, cautious optimism followed by the realization that it would likely not be achieved soon. As early as 1958, he said in a speech that he had been discussing “controlled fusion” at Los Alamos and that “thermonuclear energy generation is possible,” though he admitted that “the problem is not quite easy”; by 1987 he had given up on seeing it realized during his lifetime.
Still, what controlled fusion we do have at Livermore’s National Ignition Facility owes something to Teller and the technology he pioneered in the hydrogen bomb, according to the physicist N.J. Fisch.
While fusion is one infamous technological fix for the problem of producing clean, cheap energy, Teller and the Livermore cadres were also a major influence on the development of solar geoengineering, the idea that global warming could be averted not by reducing emissions of greenhouse gases into the atmosphere, but by making sunlight less intense.
In a mildly trolling column for the Wall Street Journal in January 1998, Teller professed agnosticism on climate change (despite giving that speech to oil executives three decades prior) but proposed an alternative policy that would be “far less burdensome than even a system of market-allocated emissions permits”: solar geoengineering with “fine particles.”
The op-ed, placed in the conservative pages of the Wall Street Journal, was almost certainly an effort to undermine the recently signed Kyoto Protocol. But the ideas have persisted, including among thinkers and scientists whose engagement with environmental issues goes far beyond Teller’s negative feelings about Al Gore and, by extension, the environmental movement as a whole.
But his proposal would be familiar from the climate debates of today: particles emitted to scatter sunlight and thus lower atmospheric temperatures. If climate change had to be addressed, Teller argued, “let us play to our uniquely American strengths in innovation and technology to offset any global warming by the least costly means possible.”
A paper he wrote with two colleagues, an early call for spraying sulfates in the stratosphere, also proposed “deploying electrically-conducting sheeting, either in the stratosphere or in low Earth orbit.” These were “literally diaphanous scattering screens” that could deflect enough sunlight to reduce global warming — one calculation in the paper put the necessary area at about 2 million square miles, or roughly 1 percent of the surface area of the Earth.
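As a gut check on that last figure, here’s a back-of-envelope sketch in code (round numbers only; nothing here comes from Teller’s paper beyond the 1 percent):

```python
import math

# Rough check: how big is 1 percent of the Earth's surface?
EARTH_RADIUS_MILES = 3_959  # mean radius of the Earth

surface_area = 4 * math.pi * EARTH_RADIUS_MILES**2  # ~197 million sq mi
one_percent = 0.01 * surface_area

print(f"Earth's surface: {surface_area / 1e6:.0f} million square miles")
print(f"1 percent:       {one_percent / 1e6:.1f} million square miles")  # ~2.0
```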
The climate scientist and Livermore alumnus Ken Caldeira has attributed his own initial interest in solar geoengineering to Lowell Wood, a Livermore researcher and Teller protégé. And while solar geoengineering is often seen as a centrist or even right-wing idea, a way to avoid more restrictive policies on carbon emissions, it has sparked some interest on the left, including in socialist science fiction author Kim Stanley Robinson’s The Ministry for the Future, which envisions India unilaterally pumping sulfates into the atmosphere in response to a devastating heat wave.
The White House even quietly released a congressionally mandated report on solar geoengineering earlier this spring, outlining avenues for further research.
While the more than 30 years since the creation of the Intergovernmental Panel on Climate Change and the beginnings of the Kyoto Protocol have emphasized international cooperation on both science and policymaking, through agreed-upon goals for emissions reductions, the technological temptation is always present.
And here we can perhaps see the same split: between the moralizing scientists, with their pleas to address the problems of the arms race through scientific openness and international cooperation, and the hawkish technicians, who wanted to press the United States’ technical advantage to win the nuclear standoff — and ultimately the Cold War — through deterrence.
With the IPCC and the United Nations Climate Conference, through which emerged the Kyoto Protocol and the Paris Agreement, we see a version of what the postwar scientists wanted applied to the problem of climate change. Nations come together and agree on targets for controlling something that may benefit any one of them but risks global calamity. The process is informed by scientists working with substantial resources across national borders who play a major role in formulating and verifying the policy mechanisms used to achieve these goals.
But for almost as long as climate change has been an issue of international concern, the Tellerian path has been tempting. While Teller’s dreams of massive sun-scattering sheets, nuclear earth engineering, and automated underground reactors are unlikely to be realized soon, if at all, you can be sure there are scientists and engineers looking straight into the light. And they may one day drag us into it, whether we want to or not.
Editor’s note: An earlier version of this article misstated the name of a climate modeler. It’s been corrected. We regret the error.
New research out today shows a 10-fold increase in smoke mortality related to climate change from the 1960s to the 2010s.
If you are one of the more than 2 billion people on Earth who have inhaled wildfire smoke, then you know firsthand that it is nasty stuff. It makes your eyes sting and your throat sore and raw; breathe in smoke for long enough, and you might get a headache or start to wheeze. Maybe you’ll have an asthma attack and end up in the emergency room. Or maybe, in the days or weeks afterward, you’ll suffer from a stroke or heart attack that you wouldn’t have had otherwise.
Researchers are increasingly convinced that the tiny, inhalable particulate matter in wildfire smoke, known as PM2.5, contributes to thousands of excess deaths annually in the United States alone. But is it fair to link those deaths directly to climate change?
A new study published Monday in Nature Climate Change suggests that for a growing number of cases, the answer should be yes. Chae Yeon Park, a climate risk modeling researcher at Japan’s National Institute for Environmental Studies, and her colleagues used three fire-vegetation models to understand how hazardous emissions changed from 1960 to 2019, compared with a hypothetical control simulation that excluded historical climate change. They found that while fewer than 669 deaths in the 1960s could be attributed to climate change globally, that number ballooned to 12,566 in the 2010s — roughly a 20-fold increase. The proportion of all global PM2.5 deaths attributable to climate change jumped 10-fold over the same period, from 1.2% in the 1960s to 12.8% in the 2010s.
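The attribution logic itself is simple to state, even if the modeling behind it is not. Here’s a toy sketch of the arithmetic, using only the figures quoted above (the study’s actual fire-vegetation models are vastly more involved):

```python
# Toy version of the study's attribution logic: run the models with and
# without historical climate change, and credit the gap to warming.
def attributable_deaths(deaths_with_warming: float,
                        deaths_without_warming: float) -> float:
    return deaths_with_warming - deaths_without_warming

# Sanity checks against the decade figures quoted above:
print(12_566 / 669)   # ~18.8: attributable deaths rose roughly 20-fold
print(0.128 / 0.012)  # ~10.7: the attributable share rose roughly 10-fold
```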
“It’s a timely and meaningful study that informs the public and the government about the dangers of wildfire smoke and how climate change is contributing to that,” Yiqun Ma, who researches the intersection of climate change, air pollution, and human health at the Yale School of Medicine, and who was not involved in the Nature study, told me.
The study found the highest climate change-attributable fire mortality in South America, Australia, and Europe, where increases in heat and decreases in humidity were also the greatest. In southern South America, for example, the authors wrote that fire mortality attributable to climate change increased from a model average of 35% to 71% between the 1960s and 2010s, “coinciding with decreased relative humidity,” which dries out fire fuels. For the same reason, an increase in relative humidity lowered fire mortality in other regions, such as South Asia. North America exhibited a less dramatic leap, with climate change’s contribution around 3.6% in the 1960s, “with a notable rise in the 2010s” to 18.8%, Park told me in an email.
While that’s alarming all on its own, Ma told me there was a possibility that Park’s findings might actually be too conservative. “They assume PM2.5 from wildfire sources and from other sources” — like cars or power plants — “have the same toxicity,” she explained. “But in fact, in recent studies, people have found PM2.5 from fire sources can be more toxic than those from an urban background.” Another reason Ma suspected the study’s numbers might be an underestimate was that the researchers focused on only six diseases with known links to PM2.5 exposure: chronic obstructive pulmonary disease, lung cancer, coronary heart disease, type 2 diabetes, stroke, and lower respiratory infection. “According to our previous findings [at the Yale School of Medicine], other diseases can also be influenced by wildfire smoke, such as mental disorders, depression, and anxiety, and they did not consider that part,” she told me.
Minghao Qiu, an assistant professor at Stony Brook University and one of the country’s leading researchers on wildfire smoke exposure and climate change, generally agreed with Park’s findings, but cautioned that there is “a lot of uncertainty in the underlying numbers,” in part because wildfire smoke exposure is intrinsically such a complicated thing to put firm numbers to. “It’s so difficult to model how climate influences wildfire because wildfire is such an idiosyncratic process and it’s so random,” he told me, adding, “In general, models are not great in terms of capturing wildfire.”
Despite these reservations, both Qiu and Ma emphasized the importance of studies like Park’s. “There are no really good solutions” for reducing wildfire PM2.5 exposure, Qiu pointed out; you can’t just “put a filter on a stack” as you (sort of) can with power plant emissions.
Even prescribed fires, often touted as an important wildfire mitigation technique, still produce smoke. Park’s team acknowledged that a whole suite of options would be needed to minimize future wildfire deaths, ranging from fire-resilient forest and urban planning to PM2.5 treatment advances in hospitals. And, of course, there is addressing the root cause of the increased mortality to begin with: our warming climate.
“To respond to these long-term changes,” Park told me, “it is crucial to gradually modify our system.”
On the COP16 biodiversity summit, Big Oil’s big plan, and sea level rise
Current conditions: Record rainfall triggered flooding in Roswell, New Mexico, that killed at least two people • Storm Ashley unleashed 80 mph winds across parts of the U.K. • A wildfire that broke out near Oakland, California, on Friday is now 85% contained.
Forecasters hadn’t expected Oscar to develop into a hurricane at all, let alone in just 12 hours. But it did. The Category 1 storm made landfall in Cuba on Sunday, hours after passing over the Bahamas, bringing intense rain and strong winds. Up to a foot of rainfall was expected. Oscar struck while Cuba was struggling to recover from a large blackout that had left millions without power for four days. A second system, Tropical Storm Nadine, made landfall in Belize on Saturday with 60 mph winds and then quickly weakened. Both Oscar and Nadine developed in the Atlantic on the same day.
[Image: Hurricane Oscar. Credit: AccuWeather]
The COP16 biodiversity summit starts today in Cali, Colombia. Diplomats from 190 countries will try to come up with a plan to halt global biodiversity loss, aiming to protect 30% of land and sea areas and restore 30% of degraded ecosystems by 2030. Discussions will revolve around how to monitor nature degradation, hold countries accountable for their protection pledges, and pay for biodiversity efforts. There will also be a big push to get many more countries to publish national biodiversity strategies. “This COP is a test of how serious countries are about upholding their international commitments to stop the rapid loss of biodiversity,” said Crystal Davis, Global Director of Food, Land, and Water at the World Resources Institute. “The world has no shot at doing so without richer countries providing more financial support to developing countries — which contain most of the world’s biodiversity.”
A prominent group of oil and gas producers has developed a plan to roll back environmental rules put in place by President Biden, The Washington Post reported. The paper got its hands on confidential documents from the American Exploration and Production Council (AXPC), which represents some 30 producers. The documents include draft executive orders promoting fossil fuel production for a newly elected President Trump to sign if he wins the White House in November, as well as a roadmap for dismantling many policies aimed at getting oil and gas producers to disclose and curb emissions. AXPC’s members, including ExxonMobil, ConocoPhillips, and Hess, account for about half of the oil and gas produced in the U.S., the Post reported.
A new report from the energy think tank Ember looks at how the uptake of electric vehicles and heat pumps in the U.K. is affecting oil and gas consumption. It found that last year the country had 1.5 million EVs on the road and 430,000 residential heat pumps in homes, and that the resulting reduction in fossil fuel use was equivalent to 14 million barrels of oil — about what the U.K. imports over a two-week span. This effect will only strengthen as more EVs and heat pumps are powered by clean energy. The report also found that even though power demand is expected to rise, efficiency gains from electrification and decarbonization will more than make up for it, leading to an overall decline in energy use and fossil fuel consumption.
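That two-week comparison implies a simple bit of arithmetic (a rough sketch using only the figures above, not Ember’s underlying data):

```python
# Rough arithmetic behind Ember's comparison, using the article's figures.
barrels_avoided = 14_000_000  # oil-equivalent fossil fuel avoided last year
two_weeks_days = 14           # "about what the U.K. imports over a two-week span"

implied_daily_imports = barrels_avoided / two_weeks_days
print(f"Implied U.K. oil imports: ~{implied_daily_imports:,.0f} barrels per day")
# ~1,000,000 barrels per day
```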
The world’s sea levels are projected to rise by more than 6 inches on average over the next 30 years if current trends continue, according to a new study published in the journal Nature. “Such rates would represent an evolving challenge for adaptation efforts,” the authors wrote. By examining satellite data, the researchers found that sea levels have risen by about 4 inches since 1993, and that they’re rising faster now than they were then. In 1993 the seas were rising by about 0.08 inches per year; last year they were rising at 0.17 inches per year. These are averages, of course, and some areas are seeing much more extreme changes. Areas around Miami, Florida, for example, have already seen sea levels rise by 6 inches over the last 31 years.
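Those numbers hang together: if the rate of rise keeps accelerating at its 1993-2023 pace, a simple extrapolation (a back-of-envelope sketch, not the study’s method) lands just above the 6-inch projection:

```python
# Back-of-envelope extrapolation from the rates quoted above.
r_1993, r_2023 = 0.08, 0.17      # sea level rise, inches per year
accel = (r_2023 - r_1993) / 30   # ~0.003 inches per year, per year

rise_since_1993 = (r_1993 + r_2023) / 2 * 31               # ~3.9 inches
rise_next_30 = sum(r_2023 + accel * t for t in range(30))  # ~6.4 inches

print(f"Rise since 1993: {rise_since_1993:.1f} inches")
print(f"Next 30 years:   {rise_next_30:.1f} inches")
```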
“As the climate crisis grows more urgent, restoring faith in government will be more important than ever.” –Paul Waldman writing for Heatmap about the profound implications of America becoming a low-trust society.
That means big, bad things for disaster relief — and for climate policy in general.
When Hurricanes Helene and Milton swept through the Southeast, small-government conservatives demanded fast and effective government service, in the form of relief operations organized by the Federal Emergency Management Agency. Yet even as the agency was scrambling to meet the need, it found itself targeted by far-right militias, who prevented it from doing its job because they had been led by cynical politicians to believe it wasn't doing its job.
It’s almost a law of nature, or at least of politics, that when government does its job, few people notice — only when it screws up does everyone pay attention. While this is nothing new in itself, it has increasingly profound implications for the future of government-driven climate action. While that action comes in many forms and can be sold to the public in many ways, it depends on people having faith that when government steps in — whether to create new regulations, invest in new technologies, or provide benefits for climate-friendly choices — it knows what it’s doing and can accomplish its goals.
As the climate crisis grows more urgent, restoring faith in government will be more important than ever. Unfortunately, simply doing the right things — like responding competently to disasters — won’t be enough to convince people that the next climate initiative will do what it’s supposed to.
The number of people expressing faith in government today is nearly as low as it has been at any point in the half-century pollsters have been asking the question. That trust has bounced up and down a bit — it rose after September 11, then fell again during the disastrous Iraq War — but for the last decade and a half, only around 20% of Americans have said they trust the government most of the time.
It’s partisan, of course: People express more trust when their party controls the White House. And the decline of trust reaches beyond the government. Faith in most of the key institutions of American life — business, education, religion, news media — has fallen in recent decades, sometimes for good reason. The net result is a public skeptical that those in authority have the ability to solve complex problems.
Changing that perspective is extraordinarily difficult, often because of the nature of good and bad news: The former usually happens slowly and invisibly, while the latter often happens dramatically and all at once.
Take the program created in the Energy Department under George W. Bush to provide loans to companies developing innovative energy technologies. If most Americans had heard of it, it was because of one company: Solyndra, a manufacturer of innovative but overly expensive solar panels. Undercut by a decline in the price of traditional panels, the company went under, and its $535 million loan was never repaid. Republicans made Solyndra’s failure into a major controversy, claiming the program showed that government investment in green technology was corrupt, ineffective, and wasteful.
What few people heard was that the loan program overall not only turned a profit at the time (and for what it’s worth, it still does), but also provided help to many successful companies, even if a few failed — as any venture capital investor could tell you is inevitable. The successes included Tesla, which used its federal loan to ramp up production of the sedans that would turn it from a niche manufacturer of electric roadsters into what it is today. Needless to say, Elon Musk does not advertise the fact that his success was built on government help.
More recently, the hurricane response has shown how partisan polarization can be used to undermine trust in government — especially when Donald Trump is involved. Trump took the opportunity of the hurricanes to accuse the federal government of being both political and partisan, delivering help only to those areas that vote for Democrats. Soon after, he promised to do precisely what he falsely accused the Biden administration of doing, saying that if he is president again, he will withhold disaster aid from California unless Gov. Gavin Newsom changes the state’s water policies to be more to Trump’s liking. “And we’ll say, Gavin, if you don’t do it, we’re not giving any of that fire money that we send you all the time for all the fire, forest fires that you have,” Trump said. And in fact, in his first term Trump did try to withhold disaster aid from blue states.
What sounds like hypocrisy is actually something much more pernicious. As he often does, Trump is arguing not that he is clean and his opponents are dirty, but that everyone is dirty, and it’s just a question of whether government is in the hands of our team or their team. When he says he’ll “drain the swamp,” he’s telling people both that government is corrupt, and the answer is merely to change who gets the spoils. If you believe him, you’ll have no trust in government whatsoever, even if you might think he’ll use it in a way you’ll approve of.
We’ve seen again and again that people want government to perform well and get angry when it doesn’t, but they don’t reward competence when it happens. Which is why making sure systems operate properly and problems are solved is necessary but not sufficient to win back trust. Government’s advocates — especially those who are counting on it to undertake ambitious climate action both now and in the future — need not only to deliver, they have to get better at, for lack of a better word, propaganda. Policy success is not its own advertisement. And despite his ample policy achievements, Joe Biden has not been a charismatic and effective messenger — on the role of government, or much else.
Ronald Reagan used to say that the most frightening words in the English language were “I’m from the government and I’m here to help”; the oft-repeated quip was at the center of his incredibly successful effort to delegitimize government in the eyes of voters. To reverse the decline of trust so people will believe that government has the knowledge and ability to tackle climate change, the public needs to be reminded — often and repeatedly — of what government does well.
Touting past and present successes on climate — and disaster relief, and the many other ways the government solves problems every day — is essential to building support for future climate initiatives. Those successes are all around us; it’s just that most people either never hear about them or take them for granted. But promoting government as an engine of positive change should be as high a priority for climate advocates, including those who hold public office, as discrediting government was for Reagan and is for Trump.