Most nonprofit boards can do whatever they want.

Surely you’ve heard by now. On Friday, the board of directors of OpenAI, the world-bestriding startup at the center of the new artificial intelligence boom, fired its chief executive, Sam Altman. He had not been “consistently candid” with the board, the company said, setting in motion a coup — and potential counter-coup — that has transfixed the tech, business, and media industries for the past 72 hours.
OpenAI is — was? — a strange organization. Until last week, it was both the country’s hottest new tech company and an independent nonprofit devoted to ensuring that a hypothetical, hyper-intelligent AI “benefits all of humanity.” The nonprofit board owned and controlled the for-profit startup, but it did not fund it entirely; the startup could and did accept outside investment, such as a $13 billion infusion from Microsoft.
This kind of dual nonprofit/for-profit structure isn’t uncommon in the tech industry. The encrypted messaging app Signal, for instance, is owned by a foundation, as is the company that makes the cheap, programmable microchip Raspberry Pi. The open-source browser Firefox is overseen by the Mozilla Foundation.
But OpenAI’s structure is unusually convoluted, with two nested holding companies and a growing split between who was providing the money (Microsoft) and who ostensibly controlled operations (the nonprofit board). That tension between the nonprofit board and the for-profit company is what ultimately ripped apart OpenAI, because when the people with control (the board) tried to fire Altman, the people with the money (Microsoft) said no. As I write this, Microsoft seems likely to win.
This may all seem remote from what we cover here at Heatmap. Other than the fact that ChatGPT devours electricity, OpenAI doesn’t obviously have anything to do with climate change, electric vehicles, or the energy transition. Sometimes I even have the sense that many climate advocates take a certain delight in high-profile AI setbacks, because they resent competing with it for existential-risk airtime.
Yet OpenAI’s schism is a warning for the climate world. Strip back the money, the apocalypticism, the big ideas and Terminator references, and OpenAI is fundamentally a story about nonprofit governance. When a majority of the board decided to knock Altman from his perch, nobody could stop them. They alone decided to torch $80 billion in market value overnight and set their institution on fire. Whether that was the right or wrong choice, it illustrates how nonprofit organizations — especially those that, like OpenAI, are controlled solely by a board of directors — act with an unusual amount of arbitrary authority.
Why does that matter for the climate or environmental movement? Because the climate and energy world is absolutely teeming with nonprofit organizations — and many of them are just as unconstrained, just as willfully wacky, as OpenAI.
Let’s step back. Nonprofits can generally be governed in two ways. (Apologies to nonprofit lawyers in the audience: I’m about to vastly simplify your specialty.) The first is a chapter- or membership-driven structure, in which a mass membership elects leaders to serve on a board of directors. Many unions, social clubs, and business groups take this form: Every few years, the members elect a new president or board of directors, who lead the organization for the next few years.
The other way is a so-called “board-only” organization. In this structure, the nonprofit’s board of directors leads the organization and does not answer to a membership or chapter. (There is often no membership to answer to.) When a vacancy opens up on the board, its remaining members appoint a replacement, perpetuating itself over time.
OpenAI was just such a board-only organization. Even though Altman was CEO, OpenAI was led officially by its board of directors.
This is a stranger way of running an organization than it may seem. For a small, private foundation, it may work just fine: Such an organization has no staff and probably meets rarely. (Most U.S. nonprofits are just this sort of organization.) But when a board-only nonprofit gets big — when it fulfills a crucial public purpose or employs hundreds or thousands of people — it faces an unusual lack of institutional constraints.
Consider, for instance, what life is like for a decently sized business, a small government agency, and a medium-sized nonprofit. The decently sized business is constantly buffeted by external forces. Its creditors need to be repaid; it is battling for market share and product position. It faces market discipline, or at least some kind of profit motive. It has to remain focused, competitive, and at least theoretically efficient.
The government agency, meanwhile, is constrained by public scrutiny and political oversight. Its bureaucrats and public servants are managed by elected officials, who are themselves accountable to the public. When a particularly important agency is not doing its job, voters can demand a change or elect new leadership.
Nonprofits can have some of the same built-in checks and balances — but only when they are controlled by members, and not by a board. If a membership association embarrasses itself, for instance, or if it doesn’t carry out its mission, then its members can vote out the board and elect new directors to replace them. But stakeholders have no such recourse for a board-only nonprofit. Insulated from market pressure and public oversight, board-only nonprofits are free to wander off into wackadoodle land.
The problem is that board-only nonprofits are only becoming more powerful — in fact, many of the nonprofits you know best are probably controlled solely by their board. In 2002, the Harvard political scientist Theda Skocpol observed that American civic life had undergone a rapid transformation: where it had once been full of membership-driven federations, such as the Lions Club or the League of Women Voters, it was now dominated by issues-focused advocacy groups.
From the late 19th to the mid-20th century, she wrote, America “had a uniquely balanced civic life, in which markets expanded but could not subsume civil society, in which governments at multiple levels deliberately and indirectly encouraged federated voluntary associations.” But from the 1960s to the 1990s, that old network fell apart. It was “bypassed and shoved to the side by a gaggle of professionally dominated advocacy groups and nonprofit institutions rarely attached to memberships worthy of the name,” Skocpol wrote.
The sheer number of groups exploded. In 1958, the Encyclopedia of Associations listed approximately 6,500 associations, Skocpol writes. By 1990, that number had more than tripled to 23,000. Today, the American Society of Association Executives — which is, just so we’re clear here, literally an association for associations — counts almost 1.9 million associations, including 1.2 million nonprofits.
This new network includes some nonprofits that claim to have members but are not in fact governed by them, such as the AARP. It includes “public citizen” or legal-advocacy groups, which watchdog legislation or fight for important precedents in the courts, such as Earthjustice, the Center for Biological Diversity, or Public Citizen itself. And it includes independent, mission-driven, and board-controlled nonprofits — such as OpenAI.
There is nothing wrong with these new groups per se. Many of them are inspired by the advocacy and legal organizations that won some of the Civil Rights Movement’s biggest victories. But unlike the member federations and civic associations that they largely replaced, these new groups don’t force Americans to engage with what their neighbors are thinking and feeling. So they “compartmentalize” America, in Skocpol’s words. Instead of articulating the views of a deep, national membership network, these groups essentially speak for a centralized and professionalized leadership corps — invariably located in a major city — who are armed with modern marketing techniques. And instead of fundraising through dues, fees, or tithes, these new groups depend on direct-mail operations, massive ad campaigns, and foundation grants.
This is the organizational superstructure on which much of the modern climate movement rests. When you read a climate news story, someone quoted in it will probably work for such a nonprofit. Many climate and energy policy experts spend at least part of their careers at some kind of nonprofit. Most climate or environmental news outlets — although not this one — are funded in whole or part through donations and foundation grants. And most climate initiatives that earn mainstream attention receive grants from a handful of foundations.
There is nothing necessarily wrong with this setup — and, of course, an equivalent network devoted to stopping and delaying climate policy exists to rival it on the right. But the entire design places an enormous amount of faith in the leaders of these nonprofits and foundations, and in the social strata that they occupy. If a nonprofit messes up, then only public attention or press coverage can right the ship. And there is simply not enough of either resource to keep these things on track.
That leads to odd resource allocation decisions, business units that seem to have no purpose (alongside teams that seem perpetually overworked), and decisions that frame otherwise decent policies in politically unpalatable ways. It regularly burns out people involved in climate organizations. And it means that much of the climate movement’s strategy is controlled by foundation officials and nonprofit directors. Like any other group of executives, these people are capable of deluding themselves about what is happening in the world; unlike other types of leaders, however, they face neither an angry electorate nor a ruthless market that will force them to update their worldview. The risk exists, then, that they could blunder into disaster — and take the climate movement with them.
“Additionality” is back.
You may remember “additionality” from such debates as, “How should we structure the hydrogen tax credit?”
Well, it’s back, this time around Meta’s massive investment in nuclear power.
On January 9, the hyperscaler announced that it would continue investing in the nuclear business. The announcement went far beyond its deal last year to buy power from a single existing plant in Illinois, embracing a smorgasbord of financial and operational approaches to nukes. Meta said it will buy the output of two nuclear plants in Ohio for 20 years, including additional power from increased capacity that will be installed at those plants (as well as additional power from a nuclear plant in Pennsylvania), and will also help develop new, so-far commercially unproven designs from the nuclear startups Oklo and TerraPower. All told, this could add up to 6.6 gigawatts of clean, firm power.
Sounds good, right?
Well, the question is how exactly to count that power. Over 2 gigawatts of that capacity is already on the grid from the two existing power plants, operated by Vistra. There will also be an “additional 433 megawatts of combined power output increases” from the existing power plants, known as “uprates,” Vistra said, plus at least another 3 gigawatts from the TerraPower and Oklo projects, which are aiming to come online in the 2030s.
Princeton professor and Heatmap contributor Jesse Jenkins cried foul in a series of posts on X and LinkedIn responding to the deal, describing it as “DEEPLY PROBLEMATIC.”
“Additionality” means that new demand should be met with new supply from renewable or clean power. Assuming that Meta wants to use that power to serve additional new demand from data centers, Jenkins argued that “the purchase of 2.1 gigawatts of power … from two EXISTING nuclear power plants … will do nothing but increase emissions AND electricity rates” for customers in the area who are “already grappling with huge bill increases, all while establishing a very dangerous precedent for the whole industry.”
Data center demand is already driving up electricity prices — especially in the area where Meta is signing these deals. Customers in the PJM Interconnection electricity grid, which includes Ohio, have paid $47 billion to ensure they have reliable power over the grid operator’s last three capacity auctions. At least $23 billion of that is attributable to data center usage, according to the market’s independent monitor.
“When a huge gigawatt-scale data center connects to the grid,” Jenkins wrote, “it's like connecting a whole new city, akin to plopping down a Pittsburgh or even Chicago. If you add massive new demand WITHOUT paying for enough new supply to meet that growth, power prices spike! It's the simple law of supply & demand.”
And Meta is investing heavily in data centers within the PJM service area, including its Prometheus “supercluster” in New Albany, Ohio. The company called out this facility in its latest announcement, saying that the suite of projects “will deliver power to the grids that support our operations, including our Prometheus supercluster in New Albany, Ohio.”
The Ohio project has been in the news before and plans to use 400 megawatts of behind-the-meter gas power. The Ohio Power Siting Board approved 200 megawatts of new gas-fired generation in June.
This is the crux of the issue for Jenkins: “Data centers must pay directly for enough NEW electricity capacity and energy to meet their round-the-clock needs,” he wrote. This power should be clean, both to mitigate the emissions impact of new demand and to meet the goals of hyperscalers, including Meta, to run on 100% clean power (although how to account for that is a whole other debate).
While hyperscalers like Meta still have clean power goals, they have been more sotto voce recently as the Trump administration wages war on solar and wind. (Nuclear, on the other hand, is very much administration approved — Secretary of Energy Chris Wright was at Meta’s event announcing the new nuclear deal.)
Microsoft, for example, mentioned the word “clean” just once in its Trump-approved “Building Community-First AI Infrastructure” manifesto, released Tuesday, which largely concerned how it sought to avoid electricity price hikes for retail customers and conserve water.
It’s not entirely clear that Meta views the entirety of these deals — the power purchase agreements, the uprates, financially supporting the development of new plants — as extra headroom to expand data center development right now. For one, Meta at least publicly claims to care about additionality. Meta’s own public-facing materials describing its clean energy commitments say that a “fundamental tenet of our approach to clean and renewable energy is the concept of additionality: partnering with utilities and developers to add new projects to the grid.”
And it’s already made substantial deals for new clean energy in Ohio. Last summer, Meta announced a deal with renewable developer Invenergy to procure some 440 megawatts of solar power in the state by 2027, for a total of 740 megawatts of renewables in Ohio. So Meta and Jenkins may be less far apart than they seem.
There may well be value in these deals from a sustainability and decarbonization standpoint — not to mention a financial standpoint. Some energy experts questioned Jenkins’ contention that Meta was harming the grid by contracting with existing nuclear plants.
“Based on what I know about these arrangements, they don’t see harm to the market,” Jeff Dennis, a former Department of Energy official who’s now executive director of the Electricity Customer Alliance, an energy buyers’ group that includes Meta, told me.
In power purchase agreements, he said, “the parties are contracting for price and revenue certainty, but then the generator continues to offer its supply into the energy and capacity markets. So the contracting party isn’t siphoning off the output for itself and creating or exacerbating a scarcity situation.”
The Meta deal stands in contrast to the proposed (and later scotched) deal between Amazon and Talen Energy, which would have co-located a data center at the existing Susquehanna nuclear plant and sucked capacity out of PJM.
Dennis said he didn’t think Meta’s new deals would have “any negative impact on prices in PJM” because the plants would be staying in the market and on the grid.
Jenkins praised the parts of the Meta announcement that were both clean and additional — that is, the deals with TerraPower and Oklo, plus the uprates from existing nuclear plants.
“That is a huge purchase of NEW clean supply, and is EXACTLY what hyperscalars [sic] and other large new electricity users should be doing,” Jenkins wrote. “Pay to bring new clean energy online to match their growing demand. That avoids raising rates for other electricity users and ensures new demand is met by new clean supply. Bravo!”
But Dennis argued that you can’t neatly separate the power purchase agreement for the plants’ existing output from the uprates. It is “reasonable to assume that without an agreement that shores up revenues for their existing output and for maintenance and operation of that existing infrastructure, you simply wouldn't get those upgrades and 500 megawatts of upgrades,” he told me.
There’s also an argument that there’s real value — to the grid, to Meta, to the climate — in giving these plants 20 years of financial certainty. While investment is flooding into expanding and even reviving existing nuclear plants, they don’t always fare well in wholesale power markets like PJM; the sector saw a rash of plant retirements in the 2010s due to persistently low capacity and energy prices. Market conditions are now quite different, but who knows what the next 20 years might bring.
“From a pure first order principle, I agree with the additionality criticism,” Ethan Paterno, a partner at PA Consulting, an innovation advisory firm, told me. “But from a second or third derivative in the Six Degrees of Kevin Bacon, you can make the argument that the hyperscalers are keeping around nukes that perhaps might otherwise be retired due to economic pressure.”
Ashley Settle, a Meta spokesperson, told me that the deals “enable the extension of the operational lifespan and increase of the energy production at three facilities.” Settle did not respond, however, when asked how Meta would factor the deals into its own emissions accounting.
“The only way I see this deal as acceptable,” Jenkins wrote, “is if @Meta signed a PPA with the existing reactors only as a financial hedge & to help unlock the incremental capacity & clean energy from uprates at those plants, and they are NOT counting the capacity or energy attributes from the existing capacity to cover new data center demand.”
There’s some hint that Meta may preserve the additionality concept of matching new demand only with new supply, as the announcement refers to “new additional uprate capacity,” and says that “consumers will benefit from a larger supply of reliable, always-ready power through Meta-supported uprates to the Vistra facilities.” The text also refers to “additional 20-year nuclear energy agreements,” however, which would likely not meet strict definitions of additionality, since those agreements cover extending the lifetime and maintaining the output of already existing plants.
A third judge rejected a stop work order, allowing the Coastal Virginia offshore wind project to proceed.
Offshore wind developers are three for three in legal battles against Trump’s stop work orders now that Dominion Energy has defeated the administration in federal court.
District Judge Jamar Walker issued a preliminary injunction Friday blocking the stop work order on Dominion’s Coastal Virginia offshore wind project after the energy company argued it was issued arbitrarily and without proper basis. Dominion received amicus briefs supporting its case from unlikely allies, including from representatives of PJM Interconnection and David Belote, a former top Pentagon official who oversaw a military clearinghouse for offshore wind approval. This comes after Trump’s Department of Justice lost similar cases challenging the stop work orders against Orsted’s Revolution Wind off the coast of New England and Equinor’s Empire Wind off New York’s shoreline.
As for what comes next in the offshore wind legal saga, I see three potential flashpoints:
It’s important to remember the stakes of these cases. Orsted and Equinor have both said that even another week or two of delays could jeopardize their projects and lead to cancellation due to narrow timelines for specialized ships, and Dominion stated in its challenge to the stop work order that halting construction may cost the company billions.
The industry is aware of the problem. That doesn’t make it easier to solve.
The data center backlash has metastasized into a full-blown PR crisis, one the tech sector is trying to get out in front of. But it is unclear whether companies are responding effectively enough to avoid a cascading series of local bans and restrictions nationwide.
Our numbers don’t lie: At least 25 data center projects were canceled last year, and nearly 100 projects faced at least some form of opposition, according to Heatmap Pro data. We’ve also recorded more than 60 towns, cities and counties that have enacted some form of moratorium or restrictive ordinance against data center development. We expect these numbers to rise throughout the year, and it won’t be long before the data on data center opposition is rivaling the figures on total wind or solar projects fought in the United States.
I spent this week reviewing the primary motivations for conflict in these numerous data center fights and speaking with representatives of the data center sector and relevant connected enterprises, like electrical manufacturing. I am now convinced that the industry knows it has a profound challenge on its hands. Folks are doing a lot to address it, from good-neighbor promises to lobbying efforts at the state and federal level. But much more work will need to be done to avoid repeating mistakes that have bedeviled other industries that face similar land use backlash cycles, such as fossil fuel extraction, mining, and renewable energy infrastructure development.
Two primary issues undergird the data center mega-backlash we’re seeing today: energy use fears and water consumption confusion.
Starting with energy, it’s important to say that data center development currently correlates with higher electricity rates in areas where projects are being built, but the industry challenges the presumption that it is solely responsible for that phenomenon. In the eyes of opponents, utilities are scrambling to construct new power supplies to meet projected increases in energy demand, and this in turn is sending bills higher.
That’s because, as I’ve previously explained, data centers are getting power in two ways: off the existing regional electric grid or from on-site generation, either from larger new facilities (like new gas plants or solar farms) or from diesel generators for baseload or backup purposes. But building new power infrastructure on site takes time, and speed is the name of the game right now in the AI race, so many simply attach to the existing grid.
Areas with rising electricity bills are more likely to ban or restrict data center development. Let’s just take one example: Aurora, Illinois, a suburb of Chicago and the second most-populous city in the state. Aurora instituted a 180-day moratorium on data center development last fall after receiving numerous complaints about data centers from residents, including a litany related to electricity bills. More than 1.5 gigawatts of data center capacity already operate in the surrounding Kane County, where residential electricity rates are at a three-year high and expected to increase over the near term – contributing to a high risk of opposition against new projects.
The second trouble spot is water, which data centers need to cool down their servers. Project developers face a huge hurdle in the form of viral stories of households near data centers that suddenly lack a drop to drink. Prominent examples activists bring up include this tale of a family living next to a Meta facility in Newton County, Georgia, and this narrative of people living around an Amazon Web Services center in St. Joseph County, Indiana. Unsurprisingly, the St. Joseph County Council rejected a new data center in response to, among other things, very vocal water concerns. (It’s worth noting that the actual harm caused to water systems by data centers is at times both over- and under-stated, depending on the facility and location.)
“I think it’s very important for the industry as a whole to be honest that living next to [a data center] is not an ideal situation,” said Caleb Max, CEO of the National Artificial Intelligence Association, a new D.C.-based trade group launched last year that represents Oracle and myriad AI companies.
Polling shows that data centers are less popular than the use of artificial intelligence overall, Max told me, so more needs to be done to communicate the benefits that come from their development – including empowering AI. “The best thing the industry could start to do is, for the people in these zip codes with the data centers, those people need to more tangibly feel the benefits of it.”
Many in the data center development space are responding quickly to these concerns. Companies are clearly trying to get out ahead on energy, with the biggest example arriving this week from Microsoft, which pledged to pay more for the electricity it uses to power its data centers. “It’s about balancing that demand and market with these concerns. That’s why you're seeing the industry lean in on these issues and more proactively communicating with communities,” said Dan Diorio, state policy director for the Data Center Coalition.
There’s also an effort underway to develop national guidance for data centers led by the National Electrical Manufacturers Association, the American Society of Heating, Refrigerating, and Air-Conditioning Engineers, and the Pacific Northwest National Laboratory, expected to surface publicly by this summer. Some of the guidance has already been published, such as this document on energy storage best practices, which is intended to help data center operators properly use storage solutions that can avoid diesel generators, an environmental concern in communities. But the guidance will ultimately include discussions of cooling, too, which can be a water-intensive practice.
“It’s a great example of an instance where industry is coming together and realizing there’s a need for guidance. There’s a very rapidly developing sector here that uses electricity in a fundamentally different way, that’s almost unprecedented,” Patrick Hughes, senior vice president of strategy, technical, and industry affairs for NEMA, told me in an interview Monday.
Personally, I’m unsure whether these voluntary efforts will be enough to assuage the concerns of local officials. They certainly aren’t convincing folks like Jon Green, a member of the Board of Supervisors in Johnson County, Iowa. Johnson County is a populous area, home to the University of Iowa campus, and Green told me that to date it hasn’t really gotten any interest from data center developers. But that didn’t stop the county from instituting a one-year moratorium in 2025 to block projects and give officials time to develop regulations.
I asked Green if there’s a form of responsible data center development. “I don’t know if there is, at least where they’re going to be economically feasible,” he told me. “If we say they’ve got to erect 40 wind turbines and 160 acres of solar in order to power a data center, I don’t know if when they do their cost analysis that it’ll pencil out.”