Is international cooperation or technological development the answer to an apocalyptic threat?
Christopher Nolan’s film Oppenheimer is about the great military contest of the Second World War, but only in the background. It’s really about a clash of visions for a postwar world defined by the physicist J. Robert Oppenheimer’s work at Los Alamos and beyond. The great power unleashed by the bombs at Hiroshima and Nagasaki could be dwarfed by what knowledge of nuclear physics could produce in the coming years, risking a war more horrifying than the one that had just concluded.
Oppenheimer, and many of his fellow atomic scientists, would spend much of the postwar period arguing for international cooperation, scientific openness, and nuclear restriction. But there was another cadre of scientists, exemplified by a former colleague turned rival, Edward Teller, who sought to answer the threat of nuclear annihilation with new technology — including even bigger bombs.
As the urgency of the nuclear question declined with the end of the Cold War, the scientific community took up a new threat to global civilization: climate change. While the conflict dramatized in Oppenheimer was over nuclear weapons, the clash of visions, which ended up burying Oppenheimer and elevating Teller, also maps onto the great debate over global warming: Should we reach international agreements to cooperatively reduce carbon emissions, or should we throw our — and specifically America’s — great resources into a headlong rush of technological development? Should we massively overhaul our energy system or make the sun a little less bright?
Oppenheimer’s dream of international cooperation to prevent a nuclear arms race was born even before the Manhattan Project culminated with the Trinity test. Oppenheimer and Danish physicist Niels Bohr “believed that an agreement between the wartime allies based upon the sharing of information, including the existence of the Manhattan Project, could prevent the surfacing of a nuclear-armed world,” writes Marco Borghi in a Wilson Institute working paper.
Oppenheimer even suggested that the Soviets be informed of the Manhattan Project’s efforts and, according to Martin Sherwin and Kai Bird’s American Prometheus, had “assumed that such forthright discussions were taking place at that very moment” at the conference in Potsdam, where, Oppenheimer “was later appalled to learn,” Harry Truman had only vaguely mentioned the bomb to Joseph Stalin, scotching the first opportunity for international nuclear cooperation.
Oppenheimer continued to take up the cause of international cooperation, working as the lead advisor to Dean Acheson and David Lilienthal on their 1946 nuclear control proposal, which was never accepted by the United Nations, and in particular by the Soviet Union, after it was amended by Truman’s appointed U.N. representative Bernard Baruch to be more favorable to the United States.
In view of the next 50 years of nuclear history — further proliferation, the development of thermonuclear weapons that could be mounted on missiles that were likely impossible to shoot down — the proposals Oppenheimer developed seem utopian: The U.N. would "bring under its complete control world supplies of uranium and thorium," including all mining, and would control all nuclear reactors. The scheme would also have made the construction of new weapons impossible, forestalling an arms race in which other nations built bombs of their own.
By the end of 1946, the Baruch proposal had died, and with it any prospect of international control of nuclear power, even as the Soviets worked intensely to break America’s nuclear monopoly — with the help of information ferried out of Los Alamos — successfully testing a weapon before the end of the decade.
With the failure of international arms control and the beginning of the arms race, Oppenheimer’s vision of a post-Trinity world lay in shambles. For Teller, however, it was a great opportunity.
While Oppenheimer planned to stave off nuclear annihilation through international cooperation, Teller was trying to build a bigger deterrent.
Since the early stages of the Manhattan Project, Teller had been dreaming of a fusion weapon many times more powerful than the first atomic bombs, what was then called the “Super.” When the atomic bomb was completed, he would again push for the creation of a thermonuclear bomb, but the efforts stalled thanks to technical and theoretical issues with Teller’s proposed design.
Nolan captures Teller’s early comprehension of just how powerful nuclear weapons could be. In a scene that’s pulled straight from accounts of the Trinity blast, most of the scientists who view the test are either in bunkers wearing welding goggles or following instructions to lie down, facing away from the blast. Not so for Teller. He lathers sunscreen on his face, straps on a pair of dark goggles, and views the explosion straight on, even pursing his lips as the explosion lights up the desert night brighter than the sun.
And it was that power — the sun’s — that Teller wanted to harness in pursuit of his “Super,” where a bomb’s power would be derived from fusing together hydrogen atoms, creating helium — and a great deal of energy. It would even use a fission bomb to help ignite the process.
Oppenheimer and several scientific luminaries, including Manhattan Project scientists Enrico Fermi and Isidor Rabi, opposed the bomb; in their official 1949 report advising the Atomic Energy Commission, they argued that the hydrogen bomb was infeasible, strategically useless, and potentially a weapon of “genocide.”
But by 1950, thanks in part to Teller and the advocacy of Lewis Strauss, a financier turned government official and the approximate villain of Nolan’s film, Harry Truman would sign off on a hydrogen bomb project, resulting in the 1952 “Ivy Mike” test, in which a bomb using a design from Teller and mathematician Stan Ulam vaporized the Pacific island of Elugelab with a blast about 700 times more powerful than the one that destroyed Hiroshima.
The success of the project re-ignited doubts around Oppenheimer’s well-known left-wing political associations in the years before the war and, thanks to scheming by Strauss, he was denied a renewed security clearance.
While several Manhattan Project scientists testified on his behalf, Teller did not, saying, “I thoroughly disagreed with him in numerous issues and his actions frankly appeared to me confused and complicated.”
It was the end of Oppenheimer’s public career. The New Deal Democrat had been eclipsed by Teller, who would become the scientific avatar of the Reagan Republicans.
For the next several decades, Teller stayed close to politicians, the military, and the media, exercising a great deal of influence over arms policy from the Lawrence Livermore National Laboratory, which he helped found, and his academic perch at the University of California.
He pooh-poohed the dangers of radiation, supported the building of more and bigger bombs that could be delivered by longer and longer range missiles, and opposed prohibitions on testing. When Dwight Eisenhower was considering a negotiated nuclear test ban, Teller faced off against future Nobel laureate and Manhattan Project alumnus Hans Bethe over whether nuclear tests could be hidden from detection by conducting them underground in a massive hole; the eventual 1963 test ban treaty would exempt underground testing.
As the Cold War settled into a nuclear standoff with both the United States and the Soviet Union possessing enough missiles and nuclear weapons to wipe out the other, Teller didn’t look to treaties, limitations, and cooperation to solve the problem of nuclear brinksmanship, but instead to space: He wanted to neutralize the threat of a Soviet first strike using x-ray lasers from space powered by nuclear explosions (he was again opposed by Bethe and the x-ray lasers never came to fruition).
He also notoriously dreamed up Project Plowshare, the civilian nuclear program that came close to nuking out a new harbor in northern Alaska and did attempt to extract gas in New Mexico and Colorado using nuclear explosions.
Yet, in perhaps the strangest turn of all, Teller also became something of a key figure in the history of climate change research, both in his relatively early awareness of the problem and the conceptual gigantism he brought to proposing to solve it.
While publicly skeptical of climate change later in his life, Teller was thinking about the problem decades before James Hansen’s seminal 1988 congressional testimony.
The researcher and climate litigator Benjamin Franta made the startling archival discovery that Teller had given a speech at an oil industry event in 1959 where he warned that “energy resources will run short as we use more and more of the fossil fuels” and, after explaining the greenhouse effect, said that “it has been calculated that a temperature rise corresponding to a 10 percent increase in carbon dioxide will be sufficient to melt the icecap and submerge New York … I think that this chemical contamination is more serious than most people tend to believe.”
Teller was also engaged with issues around energy and other “peaceful” uses of nuclear power. In response to concerns about the dangers of nuclear reactors, he began advocating in the 1960s for putting them underground, and by the early 1990s he proposed running those underground reactors automatically to avoid the human error he blamed for the disasters at Chernobyl and Three Mile Island.
While Teller was always happy to find a few collaborators and toss off an ingenious-if-extreme solution to a problem, there is a strain of “Tellerism,” both institutional and conceptual, that persists to this day in climate science and energy policy.
Nuclear science and climate science have long been intertwined, Stanford historian Paul Edwards writes: the “earliest global climate models relied on numerical methods very similar to those developed by nuclear weapons designers for solving the fluid dynamics equations needed to analyze shock waves produced in nuclear explosions.”
Where Teller comes in is the role Lawrence Livermore played in both energy research and climate modeling. “With the Cold War over and research on nuclear weapons in decline, the national laboratories faced a quandary: What would justify their continued existence?” Edwards writes. The answer in many cases would be climate change, thanks to the labs’ ample computing power, “expertise in numerical modeling of fluid dynamics, and their skills in managing very large data sets.”
One of those labs was Livermore, the institution Teller helped found, which has been a leading center of climate and energy modeling and research since the late 1980s. “[Teller] was very enthusiastic about weather control,” early climate modeler Cecil “Chuck” Leith told Edwards in an oral history.
The Department of Energy writ large, which inherited much of the responsibilities of the Atomic Energy Commission, is now one of the lead agencies on climate change policy and energy research.
Which brings us to fusion.
It was Teller’s Lawrence Livermore National Laboratory that earlier this year got more energy out of a controlled fusion reaction than it put in — and it was Energy Secretary Jennifer Granholm who announced the result, calling fusion the “holy grail” of clean energy development.
Teller’s own journey with fusion mirrors the technology’s history: early cautious optimism followed by the realization that it would likely not be achieved soon. As early as 1958, he said in a speech that he had been discussing “controlled fusion” at Los Alamos and that “thermonuclear energy generation is possible,” although he admitted that “the problem is not quite easy”; by 1987 he had given up on seeing it realized in his lifetime.
Still, what controlled fusion we do have at Livermore’s National Ignition Facility owes something to Teller and the technology he pioneered in the hydrogen bomb, according to physicist NJ Fisch.
While fusion is one infamous technological fix for the problem of clean and cheap energy production, Teller and the Livermore cadres were also a major influence on the development of solar geoengineering, the idea that global warming could be averted not by reducing the emissions of greenhouse gas into the atmosphere, but by making the sun less intense.
In a mildly trolling column for the Wall Street Journal in January 1998, Teller professed agnosticism on climate change (despite giving that speech to oil executives three decades prior) but proposed an alternative policy that would be “far less burdensome than even a system of market-allocated emissions permits”: solar geoengineering with “fine particles.”
The op-ed, placed in the conservative pages of the Wall Street Journal, was almost certainly an effort to oppose the recently signed Kyoto Protocol, but the ideas have persisted among thinkers and scientists whose engagement with environmental issues goes far beyond their opinion of Al Gore or, by extension, the environmental movement as a whole (Teller’s feelings about both were negative).
But his proposal would be familiar to anyone following the climate debates of today: particle emissions that would scatter sunlight and thus lower atmospheric temperatures. If climate change had to be addressed, Teller argued, “let us play to our uniquely American strengths in innovation and technology to offset any global warming by the least costly means possible.”
A paper he wrote with two colleagues, an early call for spraying sulfates into the stratosphere, also proposed “deploying electrically-conducting sheeting, either in the stratosphere or in low Earth orbit.” These were “literally diaphanous scattering screens” that could scatter enough sunlight to reduce global warming; one calculation Teller made concluded that 46 million square miles of these screens, or about 1 percent of the surface area of the Earth, would be necessary.
The climate scientist and Livermore alumnus Ken Caldeira has attributed his own initial interest in solar geoengineering to Lowell Wood, a Livermore researcher and Teller protégé. While often seen as a centrist or even right-wing idea, a way to avoid more restrictive policies on carbon emissions, solar geoengineering has sparked some interest on the left, including in socialist science fiction author Kim Stanley Robinson’s The Ministry for the Future, which envisions India unilaterally pumping sulfates into the atmosphere in response to a devastating heat wave.
The White House even quietly released a congressionally-mandated report on solar geoengineering earlier this spring, outlining avenues for further research.
While the more than 30 years since the creation of the Intergovernmental Panel on Climate Change and the beginnings of the Kyoto Protocol have emphasized international cooperation on both science and policymaking through agreed-upon emissions reduction targets, the technological temptation is always present.
And here we can perhaps see the split between the morally engaged scientists, with their pleas for addressing the problems of the arms race through scientific openness and international cooperation, and the hawkish technicians, who wanted to press the United States’ technical advantage to win the nuclear standoff, and ultimately the Cold War, through deterrence.
With the IPCC and the United Nations Climate Conference, through which emerged the Kyoto Protocol and the Paris Agreement, we see a version of what the postwar scientists wanted applied to the problem of climate change. Nations come together and agree on targets for controlling something that may benefit any one of them but risks global calamity. The process is informed by scientists working with substantial resources across national borders who play a major role in formulating and verifying the policy mechanisms used to achieve these goals.
But for almost as long as climate change has been an issue of international concern, the Tellerian path has been tempting. While Teller’s dreams of massive sun-scattering sheets, nuclear earth engineering, and automated underground reactors are unlikely to be realized soon, if at all, you can be sure there are scientists and engineers looking straight into the light. And they may one day drag us into it, whether we want to or not.
Editor’s note: An earlier version of this article misstated the name of a climate modeler. It’s been corrected. We regret the error.
Paradise, California, is snatching up high-risk properties to create a defensive perimeter and prevent the town from burning again.
The 2018 Camp Fire was the deadliest wildfire in California’s history, wiping out 90% of the structures in the mountain town of Paradise and killing at least 85 people in a matter of hours. Investigations afterward found that Paradise’s town planners had ignored warnings of the fire risk to its residents and forgone common-sense preparations that would have saved lives. In the years since, the Camp Fire has consequently become a cautionary tale for similar communities in high-risk wildfire areas — places like Chinese Camp, a small historic landmark in the Sierra Nevada foothills that dramatically burned to the ground last week as part of the nearly 14,000-acre TCU September Lightning Complex.
More recently, Paradise has also become a model for how a town can rebuild wisely after a wildfire. At least some of that is due to the work of Dan Efseaff, the director of the Paradise Recreation and Park District, who has launched a program to identify and acquire some of the highest-risk, hardest-to-access properties in the Camp Fire burn scar. Though he has a limited total operating budget of around $5.5 million and relies heavily on the charity of local property owners (he’s currently in the process of applying for a $15 million grant with a $5 million match for the program), Efseaff has nevertheless managed to build the beginning of a defensible buffer of managed parkland around Paradise that could potentially buy the town time in the case of a future wildfire.
In order to better understand how communities can build back smarter after — or, ideally, before — a catastrophic fire, I spoke with Efseaff about his work in Paradise and how other communities might be able to replicate it. Our conversation has been lightly edited and condensed for clarity.
Do you live in Paradise? Were you there during the Camp Fire?
I actually live in Chico. We’ve lived here since the mid-‘90s, but I have a long connection to Paradise; I’ve worked for the district since 2017. I’m also a sea kayak instructor and during the Camp Fire, I was in South Carolina for a training. I was away from the phone until I got back at the end of the day and saw it blowing up with everything.
I have triplet daughters who were attending Butte College at the time, and they needed to be evacuated. There was a lot of uncertainty that day. But it gave me some perspective, because I couldn’t get back for two days. It gave me a chance to think, “Okay, what’s our response going to be?” Looking two days out, it was like: That would have been payroll, let’s get people together, and then let’s figure out what we’re going to do two weeks and two months from now.
It also got my mind thinking about what we would have done going backwards. If you’d had two weeks to prepare, you would have gotten your go-bag together, you’d have come up with your evacuation route — that type of thing. But when you run the movie backwards on what you would have done differently if you had two years or two decades, it would include prepping the landscape, making some safer community defensible space. That’s what got me started.
Was it your idea to buy up the high-risk properties in the burn scar?
I would say I adapted it. Everyone wants to say it was their idea, but I’ll tell you where it came from: Pre-fire, the thinking was that it would make sense for the town to have a perimeter trail from a recreation standpoint. But I was also trying to pitch it as a good idea from a fuel standpoint, so that if there was a wildfire, you could respond to it. Certainly, the idea took on a whole other dimension after the Camp Fire.
I’m a restoration ecologist, so I’ve done a lot of river floodplain work. There are a lot of analogies there. The trend has been to give nature a little bit more room: You’re not going to stop a flood, but you can minimize damage to human infrastructure. Putting levees too close to the river makes them more prone to failing and puts people at risk — but if you can set the levee back a little bit, it gives the flood waters room to go through. That’s why I thought we need a little bit of a buffer in Paradise and some protection around the community. We need a transition between an area that is going to burn, and that we can let burn, but not in a way that is catastrophic.
How hard has it been to find willing sellers? Do most people in the area want to rebuild — or need to because of their mortgages?
Ironically, the biggest challenge for us is finding adequate funding. A lot of the property we have so far has been donated to us. It’s probably upwards of — oh, let’s see, at least half a dozen properties have been donated, probably close to 200 acres at this point.
We are applying for some federal grants right now, and we’ll see how that goes. What’s evolved quite a bit on this in recent years, though, is that — because we’ve done some modeling — instead of thinking of the buffer as areas that are managed uniformly around the community, we’re much more strategic. These fire events are wind-driven, and there are only a couple of directions where the wind blows long enough and powerfully enough for the other conditions to fall into play. That’s not to say other events couldn’t happen, but we’re going after the most likely events that would cause catastrophic fires, and that would be from the Diablo winds, or north winds, that come through our area. That was what happened in the Camp Fire scenario, and another one our models caught sure looked a lot like the [2024] Park Fire.
One thing that I want to make clear is that some people think, “Oh, this is a fire break. It’s devoid of vegetation.” No, what we’re talking about is a well-managed habitat. These are shaded fuel breaks. You maintain the big trees, you get rid of the ladder fuels, and you get rid of the dead wood that’s on the ground. We have good examples with our partners, like the Butte Fire Safe Council, on how this works, and it looks like it helped protect the community of Cohasset during the Park Fire. They did some work on some strips there, and the fire essentially dropped to the ground before it came to Paradise Lake. You didn’t have an aerial tanker dropping retardant, you didn’t have a $2-million-per-day fire crew out there doing work. It was modest work done early and in the right place that actually changed the behavior of the fire.
Tell me a little more about the modeling you’ve been doing.
We looked at fire pathways with a group called XyloPlan out of the Bay Area. The concept is that you simulate a series of ignitions with certain wind conditions, terrain, and vegetation. The model looked very much like a Camp Fire scenario; it followed the same pathway, going towards the community in a little gulch that channeled high winds. You need to interrupt that pathway — and that doesn’t necessarily mean creating an area devoid of vegetation, but if you have these areas where the fire behavior changes and drops down to the ground, then it slows the travel. I found this hard to believe, but in the modeling results, in a scenario like the Camp Fire, it could buy you up to eight hours. With modern California firefighting, you could empty out the community in a systematic way in that time. You could have a vigorous fire response. You could have aircraft potentially ready. It’s a game-changing situation, rather than the 30 minutes Paradise had when the Camp Fire started.
How does this work when you’re dealing with private property owners, though? How do you convince them to move or donate their land?
We’re a Park and Recreation District so we don’t have regulatory authority. We are just trying to run with a good idea with the properties that we have so far — those from willing donors mostly, but there have been a couple of sales. If we’re unable to get federal funding or state support, though, I ultimately think this idea will still have to be here — whether it’s five, 10, 15, or 50 years from now. We have to manage this area in a comprehensive way.
Private property rights are very important, and we don’t want to impinge on that. And yet, what a person does on their property has a huge impact on the 30,000 people who may be downwind of them. It’s an unusual situation: In a hurricane, if you have a hurricane-rated roof and your neighbor doesn’t, and theirs blows off, you feel sorry for your neighbor but it’s probably not going to harm your property much. In a wildfire, what your neighbor has done with the wood, or how they treat vegetation, has a significant impact on your home and whether your family is going to survive. It’s a fundamentally different kind of event than some of the other disasters we look at.
Do you have any advice for community leaders who might want to consider creating buffer zones or something similar to what you’re doing in Paradise?
Start today. You have to think about these things with some urgency, but they’re not something people think about until it happens. Paradise, for many decades, did not have a single escaped wildfire make it into the community. Then, overnight, the community is essentially wiped out. But in so many places, these events are foreseeable; we’re just not wired to think about them or prepare for them.
Buffers around communities make a lot of sense, even from a road network standpoint. Even from a trash pickup standpoint. You don’t think about this, but if your community is really strung out, making it a little more thoughtfully laid out also makes it more economically viable to provide services to people. Some things we look for now are long roads that don’t have any connections — that were one-way in and no way out. I don’t think [the traffic jams and deaths in] Paradise would have happened with what we know now, but I kind of think [authorities] did know better beforehand. It just wasn’t economically viable at the time; they didn’t think it was a big deal, but they built the roads anyway. We can be doing a lot of things smarter.
A war of attrition is now turning in opponents’ favor.
A solar developer’s defeat in Massachusetts last week reveals just how much stronger project opponents are on the battlefield after the de facto repeal of the Inflation Reduction Act.
Last week, solar developer PureSky pulled five projects under development around the western Massachusetts town of Shutesbury. PureSky’s facilities had been in the works for years and together would have represented what the developer claimed would be one of the state’s largest solar projects to date. In a statement, the company laid blame on “broader policy and regulatory headwinds,” including the state’s existing renewables incentives not keeping pace with rising costs and “federal policy updates,” which PureSky said were “making it harder to finance projects like those proposed near Shutesbury.”
But tucked into its press release was an admission from the company’s vice president of development, Derek Moretz: This was also about the town, which had enacted a bylaw significantly restricting solar development that the company had, until recently, been fighting vigorously in court.
“There are very few areas in the Commonwealth that are feasible to reach its clean energy goals,” Moretz stated. “We respect the Town’s conservation goals, but it is clear that systemic reforms are needed for Massachusetts to source its own energy.”
This stems from a story that probably sounds familiar: After proposing the projects, PureSky began reckoning with a burgeoning opposition campaign centered on nature conservation. Led by a new opposition group, Smart Solar Shutesbury, activists successfully pushed the town to drastically curtail development in 2023, pointing to the amount of forest acreage that would potentially be cleared to construct the projects. The town had previously not permitted facilities larger than 15 acres, but the new bylaw went further, essentially banning battery storage and solar projects in most areas.
When this first happened, the state Attorney General’s office actually had PureSky’s back, challenging the legality of the bylaw that would block construction. And PureSky filed a lawsuit that was, until recently, ongoing with no signs of stopping. But last week, shortly after the Treasury Department unveiled its rules for implementing Trump’s new tax and spending law, which basically repealed the Inflation Reduction Act, PureSky settled with the town and dropped the lawsuit – and the projects went away along with the court fight.
What does this tell us? Well, things out in the country must be getting quite bleak for solar developers in areas with strident and locked-in opposition that could be costly to fight. Where before project developers might have been able to stomach the struggle, money talks – and the dollars are starting to tell executives to lay down their arms.
The picture gets worse on the macro level: On Monday, the Solar Energy Industries Association released a report declaring that federal policy changes brought about by phasing out federal tax incentives would put the U.S. at risk of losing upwards of 55 gigawatts of solar project development by 2030, representing a loss of more than 20 percent of the project pipeline.
But the trade group said most of that total – 44 gigawatts – was linked specifically to the Trump administration’s decision to halt federal permitting for renewable energy facilities, a decision that may impact generation out west but has little to no bearing on most large solar projects, because those are almost always on private land.
Heatmap Pro can tell us how much is at stake here. To give you a sense of perspective, across the U.S., over 81 gigawatts worth of renewable energy projects are being contested right now, with non-Western states – the Northeast, South and Midwest – making up almost 60% of that potential capacity.
If historical trends hold, you’d expect a staggering 49% of those projects to be canceled. That would be on top of the totals SEIA suggests could be at risk from new Trump permitting policies.
I suspect the rate of cancellations in the face of project opposition will increase. And if this policy landscape is helping activists kill projects in blue states in desperate need of power, like Massachusetts, then the future may be more difficult to swallow than we can imagine at the moment.
And more on the week’s most important conflicts around renewables.
1. Wells County, Indiana – One of the nation’s most at-risk solar projects may now be prompting a full-on moratorium.
2. Clark County, Ohio – Another Ohio county has significantly restricted renewable energy development, this time with big political implications.
3. Daviess County, Kentucky – NextEra’s having some problems getting past this county’s setbacks.
4. Columbia County, Georgia – Sometimes the wealthy will just say no to a solar farm.
5. Ottawa County, Michigan – A proposed battery storage facility in the Mitten State looks like it is about to test the state’s new permitting primacy law.