Why regional transmission organizations as we know them might not survive the data center boom.

As the United States faces its first significant increase in electricity demand in decades, the grid itself is not only aging, but also straining against the financial, logistical, and legal barriers to adding new supply. It’s enough to make you wonder: What’s the point of an electricity market, anyway?
That’s the question some stakeholders in the PJM Interconnection, America’s largest electricity market, started asking loudly and in public in response to the grid operator’s proposal that new large energy users could become “non-capacity backed load,” i.e., be forced to turn off whenever PJM deems it necessary.
PJM, which covers 13 states from the Mid-Atlantic to the Midwest, has been America’s poster child for the struggle to get new generation online as data center development surges. PJM has warned that it will have “just enough generation to meet its reliability requirement” in 2026 and 2027, and its independent market monitor has said that the costs associated with serving that new and forecast demand have already reached the billions, translating to higher retail electricity rates in several PJM states.
As Heatmap has covered, however, basically no one in the PJM system — transmission owners, power producers, and data center developers — was happy with the details of PJM’s plan to deal with the situation. In public comments on the proposed rule, many brought up a central conflict between utilities’ historic duty to serve and the realities of the modern power market. More specifically, electricity markets like PJM are supposed to deal with wholesale electricity sales, not the kind of core questions of who gets served and when, which are left to the states.
On the power producer side, major East Coast supplier Talen Energy wrote, “The NCBL proposal exceeds PJM’s authority by establishing a regime where PJM holds the power to withhold electric service unlawfully from certain categories of large load.” The utility Exelon added that owners of transmission “have a responsibility to serve all customers—large, small, and in between. We are obligated to provide both retail and wholesale electric service safely and reliably.” And last but far from least, Microsoft, which has made itself into a leader in artificial intelligence, argued, “A PJM rule curtailing non-capacity-backed load would not only unlawfully intrude on state authority, but it would also fundamentally undercut the very purpose of PJM’s capacity market.”
This is just one small piece of a debate that’s been heating up for years, however, as more market participants, activists, and scholars question whether the markets that govern much of the U.S. electric grid are delivering power as cheaply and abundantly as they were promised to. Some have even suggested letting PJM utilities build their own power plants again, effectively reversing the market structure of the past few decades.
But questioning whether all load must be served would be an even bigger change.
The “obligation to serve all load has been a core tenet of electricity policy,” Rob Gramlich, the president of Grid Strategies LLC, told me. “I don’t recall ever seeing that be questioned or challenged in any fundamental way” — an illustration of how dire things have become.
The U.S. electricity system was designed for abundance. Utilities would serve any user, and the per-user costs of developing the fixed infrastructure necessary to serve them would drop as more users signed up.
But the planned rush of data center investments threatens to stick all ratepayers with the cost of new transmission and generation driven overwhelmingly by one class of customer. There is already a brewing local backlash to new data centers, and electricity prices have been rising faster than inflation. New data center load could also have climate consequences if utilities decide to leave aging coal online and build out new natural gas-fired power plants over and above their pre-data center boom (and pre-Trump) plans.
“AI has dramatically raised the stakes, along with enhancing worries that heightened demand will mean more burning of fossil fuels,” law professors Alexandra Klass of the University of Michigan and Dave Owen at the University of California write in a preprint paper to be published next year.
In an interview, Klass told me, “There are huge economic and climate implications if we build a whole lot of gas and keep coal on, and then demand is lower because the chips are better,” referring to the possibility that data centers and large language models could become dramatically more energy efficient, rendering the additional fossil fuel-powered supply unnecessary. Even if the projects are not fully built out or utilized, the country could face a situation where “ratepayers have already paid for [grid infrastructure], whether it’s through those wholesale markets or through their utilities in traditionally regulated states,” she said.
The core tension between AI development and the power grid, Klass and Owen argue, is the “duty to serve,” or “universal service” principle that has underlain modern electricity markets for over a century.
“The duty to serve — to meet need at pretty much all times — worked for utilities because they got to pass through their costs, and it largely worked for consumers because they didn’t have to deal very often with unpredictable blackouts,” Owen told me.
“Once you knew how to build transmission lines and build power plants,” Klass added, “there was no sense that you couldn’t continue to build to serve all customers. We could build power plants, and the regulatory regime came up in a context where we could always build enough to meet demand.”
How and why goes back to the earliest days of electrification.
As the power industry developed in the late 19th and early 20th centuries, a regulated utility model emerged in which monopoly utilities built both power plants and the transmission and distribution infrastructure necessary to deliver that power to customers. To achieve the economies of scale required to serve those customers efficiently and affordably, regulators allowed utilities to establish monopolies over defined service territories, with the requirement that they serve anyone and everyone in them.
With a secure base of ratepayers, utilities could raise money from investors to build infrastructure, which could then be put into a “rate base” and recouped from ratepayers over time at a fixed return. In exchange, the utilities accepted regulation from state governments over their pricing and future development trajectories.
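The economics of that bargain are simple enough to sketch. Below is a minimal illustration of the standard revenue-requirement arithmetic behind rate-base regulation; the dollar figures are hypothetical, chosen only to show how adding customers spreads fixed costs.

```python
# Hypothetical illustration of traditional rate-base cost recovery.
# The figures are invented for the example; the formula itself is the
# standard one used in utility ratemaking.

rate_base = 1_000_000_000         # undepreciated capital in plants and wires ($)
allowed_return = 0.10             # regulator-approved rate of return
operating_expenses = 300_000_000  # fuel, labor, maintenance ($/year)
depreciation = 40_000_000         # annual recovery of invested capital ($/year)

# Annual revenue requirement: the total that rates must collect.
revenue_requirement = rate_base * allowed_return + operating_expenses + depreciation

# Spreading fixed costs over more customers lowers the per-customer share,
# the "abundance" logic described above.
for customers in (1_000_000, 2_000_000):
    print(f"{customers:>9,} customers -> ${revenue_requirement / customers:,.0f} each per year")
```

Doubling the customer base in this toy example halves each customer’s share of the same fixed investment, which is why serving everyone historically made everyone better off.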
That vertically integrated system began to crack, however, as ratepayers revolted over the high costs of utilities’ capital investments, especially in nuclear power plants. Following the deregulation of industries such as trucking and air travel, federal regulators began trying to separate the generation business from the transmission and distribution business. In 1999, after some states and regions had already begun to restructure their electricity markets, the Federal Energy Regulatory Commission encouraged the creation of regional transmission organizations like PJM.
Today, the electricity markets of some 35 states are partially or entirely restructured, with Texas operating its own, isolated electricity market beyond the reach of federal regulation. In PJM and other RTOs, electricity is (more or less) sold competitively on a wholesale basis by independent power producers to utilities, who then serve customers.
But the system as it’s constructed now may, critics argue, expose retail customers to unacceptable cost increases — and greenhouse gas emissions — as it attempts to grapple with serving new data center load.
Klass and Owen, for their part, point to other markets as models for how electricity could work that don’t involve the same assumptions of plentiful supply that electricity markets historically have, such as those governing natural gas or even Western water rights.
Interruptions of natural gas service became more common starting in the 1970s, when some natural gas services were underpriced thanks to price caps, leading to an imbalance between supply and demand. In response, regulators “established a national policy of curtailment based on end use,” Klass and Owen write, with residential users getting priority “because of their essential heating needs, followed by firm industrial and commercial customers, and finally, interruptible customers.” Natural gas was deregulated in the late 1970s and 1980s, with curtailment becoming more market-based, which also allowed natural gas customers to trade capacity with each other.
Western water rights, meanwhile, are notoriously opaque and contested — but, importantly, they are based on scarcity, and thus may provide lessons in an era of limited electricity supply. The “prior appropriation” system water markets use is, “at its core, a set of mechanisms for allocating shortage,” the authors write. Water users have “senior” and “junior” rights, with senior users “entitled to have their rights fulfilled before the holders of newer, or more ’junior,’ water rights.” These rights can be transferred, and junior users have found ways to work with what water they can get, with the authors citing extensive conservation efforts in Southern California compared to the San Francisco Bay area, which tends to have more senior rights.
With these models in mind, Klass and Owen propose a system called “demand side connect-and-manage,” whereby new loads would not necessarily get transmission and generation service at all times, and where utilities could curtail users and electricity customers would have the ability “to use trading to hedge against the risk of curtailments.”
“We can connect you now before we build a whole lot of new generation, but when we need to, we’re going to curtail you,” Klass said, describing her and Owen’s proposal.
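For a rough sense of the allocation logic Klass and Owen borrow from prior appropriation, here is a minimal sketch of seniority-based curtailment. The load names and numbers are invented, and this is one plausible reading of the mechanism rather than anything specified in the paper.

```python
# Hypothetical sketch: allocate a supply shortfall by seniority, the way
# prior appropriation allocates scarce water. Senior (capacity-backed) loads
# are served first; junior (connect-and-manage) loads absorb the curtailment.

loads = [
    # (name, demand in MW, seniority rank: lower = more senior)
    ("residential_feeder", 600, 1),
    ("legacy_factory",     250, 2),
    ("new_data_center_a",  400, 3),
    ("new_data_center_b",  300, 4),
]

def allocate(available_mw, loads):
    """Serve loads in seniority order until supply runs out."""
    served = {}
    remaining = available_mw
    for name, demand, _ in sorted(loads, key=lambda l: l[2]):
        grant = min(demand, remaining)
        served[name] = grant
        remaining -= grant
    return served

# A tight hour: 1,200 MW available against 1,550 MW of total demand.
for name, mw in allocate(1200, loads).items():
    print(f"{name:>20}: {mw} MW served")
# The 350 MW shortfall falls entirely on the two junior data centers;
# under Klass and Owen's proposal, they could trade among themselves to
# hedge who actually gets curtailed.
```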
Tyler Norris, a Duke University researcher who has published concept-defining work on data center flexibility, called the paper “one of the most important contributions yet toward the re-examination of basic assumptions of U.S. electricity law that’s urgently needed as hyperscale load growth pushes our existing regulatory system beyond its limits.”
While electricity may not be literally drying up, he told me, “when you are supply side constrained while demand is growing, you have this challenge of, how do you allocate scarcity?”
Unlike the PJM proposals, “Our paper was very focused on state law,” Klass told me. “And that was intentional, because I think this is trickier at the federal level.”
Some states are already embracing similar ideas. Ohio regulators, for instance, established a data center tariff that tries to protect customers from higher costs by forcing data centers to make minimum payments regardless of their actual electricity use. Texas also passed a law that would allow for some curtailment of large loads and reforms of the interconnection process to avoid filling up the interconnection queue with speculative projects that could result in infrastructure costs but not real electricity demand.
Klass and Owen write that their idea may be more of “a temporary bridging strategy, primarily for periods when peak demand outstrips supply or at least threatens to do so.”
Even those who don’t think the principles underlying electricity markets need to be rethought see the need — at least in the short term — for new options for large new power users who may not get all the power they want all of the time.
“Some non-firm options are necessary in the short term,” Gramlich told me, referring to ideas like Klass and Owen’s, Norris’s, and PJM’s. “Some of them are going to have some legal infirmities and jurisdictional problems. But I think no matter what, we’re going to see some non-firm options. A lot of customers, a lot of these large loads, are very interested, even if it’s a temporary way to get connected while they try to get the firm service later.”
If electricity markets have worked for over one hundred years on the principle that more customers could bring down costs for everyone, going forward, we may have to get more choosy — or pay the price.
Microsoft dominated this year.
It’s been a quiet year for carbon dioxide removal, the nascent industry trying to lower the concentration of carbon dioxide already in the atmosphere.
After a stretch as the hottest thing in climate tech, the CDR hype cycle has died down. 2025 saw fewer investments, fewer big project announcements, and fewer new companies.
This story isn’t immediately apparent if you look at the sales data for carbon removal credits, which paints 2025 as a year of breakout growth. CDR companies sold nearly 30 million tons of carbon removal, according to the leading industry database, CDR.fyi — more than three times the amount sold in 2024. But that topline number hides a more troubling reality — about 90% of those credits were bought by a single company: Microsoft.
If you exclude Microsoft, the total volume of carbon removal purchased this year actually declined by about 100,000 tons. This buyer concentration is the continuation of a trend CDR.fyi observed in its 2024 Year In Review report, although non-Microsoft sales had grown a bit that year compared to 2023.
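To see why the topline flatters the market, it helps to run the rough arithmetic implied by the figures above; the numbers below are approximations back-calculated from the percentages reported, not CDR.fyi’s exact data.

```python
# Rough arithmetic behind the concentration story. Figures are approximate,
# back-calculated from the percentages in the text, not exact CDR.fyi data.

total_2025 = 30_000_000          # ~30 million tons sold in 2025
microsoft_share = 0.90           # ~90% bought by Microsoft

non_microsoft_2025 = total_2025 * (1 - microsoft_share)   # ~3.0M tons
decline = 100_000                # non-Microsoft volume fell ~100k tons
non_microsoft_2024 = non_microsoft_2025 + decline         # ~3.1M tons

print(f"Non-Microsoft 2025: {non_microsoft_2025:,.0f} tons")
print(f"Implied non-Microsoft 2024: {non_microsoft_2024:,.0f} tons")
# Headline growth of 3x-plus year over year, but the market excluding
# one buyer actually shrank.
```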
Trump’s crusade against climate action has likely played a role in the market stasis of this year. Under the Biden administration, federal investment in carbon removal research, development, and deployment grew to new heights. Biden’s Securities and Exchange Commission was also getting ready to require large companies to disclose their greenhouse gas emissions and climate targets, a move that many expected to increase demand for carbon credits. But Trump’s SEC scrapped the rule, and his agency heads have canceled most of the planned investments. (At the time of publication, the two direct air capture projects that Biden’s Department of Energy selected to receive up to $1.2 billion have not yet had their contracts officially terminated, despite both showing up on a leaked list of DOE grant cancellations in October.)
Trump’s overall posture on climate change reduced pressure on companies to act, which probably contributed to there being fewer new buyers entering the carbon removal market, Robert Hoglund, a carbon removal advisor who co-founded CDR.fyi, told me. “I heard several companies say that, yeah, we wouldn't have been able to do this commitment this year. We're glad that we made it several years ago,” he told me.
Kyle Harrison, a carbon markets analyst at BloombergNEF, told me he didn’t view Microsoft’s dominance in the market as a bad sign. In the early days of corporate wind and solar energy contracts, he said, Microsoft, Google, and Amazon were the only ones signing deals, which raised similar questions about the sustainability of the market. “But what it did is it created a blueprint for how you sign these deals and make these nascent technologies more financeable, and then it brings down the cost, and then all of a sudden, you start to get a second generation of companies that start to sign these deals.”
Harrison expects the market to see slower growth in the coming years until either carbon removal companies are able to bring down costs or a more reliable regulatory signal puts pressure on buyers.
Governments in Europe and the United Kingdom introduced a few weak-ish signals this year. The European Union continued to advance a government certification program for carbon removal and expects to finalize methodologies for several CDR methods in 2026. That government stamp of approval may give potential buyers more confidence in the market.
The EU also announced plans to set up a carbon removal “buyers’ club” next year to spur more demand for CDR by pooling and coordinating procurement, although the proposal is light on detail. There were similar developments in the United Kingdom, which announced a new “contracts for difference” policy through which the government would finance early-stage direct air capture and bioenergy with carbon capture projects.
A stronger signal, though, could eventually come from places with mandatory emissions cap-and-trade policies, such as California, Japan, China, the European Union, or the United Kingdom. California already allows companies to use carbon removal credits for compliance with its cap-and-invest program. The U.K. plans to begin integrating CDR into its scheme in 2029, and the EU and Japan are considering when and how to do the same.
Giana Amador, the executive director of the U.S.-based Carbon Removal Alliance, told me these demand pulls were extremely important. “It tells investors, if you invest in this today, in 10 years, companies will be able to access those markets,” she said.
At the same time, carbon removal companies are not going to be competitive in any of these markets until carbon trades at a substantially higher price, or until companies can make carbon removal less expensive. “We need to both figure out how we can drive down the cost of carbon removal and how to make these carbon removal solutions more effective, and really kind of hone the technology. Those are what is going to unlock demand in the future,” she said.
There’s certainly some progress being made on that front. This year saw more real-world deployments and field tests. Whereas a few years ago, the state of knowledge about various carbon removal methods was based on academic modeling exercises or lab experiments, now there’s starting to be a lot more real-world data. “For me, that is the most important thing that we have seen — continued learning,” Hoglund said.
There’s also been a lot more international interest in the sector. “It feels like there’s this global competition building about what country will be the leader in the industry,” Ben Rubin, the executive director of the Carbon Business Council, told me.
There’s another somewhat deceptive trend in the year’s carbon removal data: The market also appeared to be highly concentrated within one carbon removal method — 75% of Microsoft’s purchases, and 70% of the total sales tracked by CDR.fyi, were credits for bioenergy with carbon capture, where biomass is burned for energy and the resulting emissions are captured and stored. Despite making up the largest volume of credits, however, these came from just a handful of deals. “It’s the least common method,” Hoglund said.
Companies reported delivering about 450,000 tons of carbon removal this year, according to CDR.fyi’s data, bringing the cumulative total to over 1 million tons to date. Some 80% of the total came from biochar projects, but the remaining deliveries run the gamut of carbon removal methods, including ocean-based techniques and enhanced rock weathering.
Amador predicted that in the near term, we may see increased buying from the tech sector, as the growth of artificial intelligence and power-hungry data centers sets those companies further back on their climate commitments. She’s also optimistic about a growing trend of exploring “industrial integrations” — basically incorporating carbon removal into existing industrial processes such as municipal waste management, agricultural operations, wastewater treatment, mining, and pulp and paper factories. “I think that's something that we'll see a spotlight on next year,” she said.
Another place that may help unlock demand is the Science Based Targets initiative, a nonprofit that develops voluntary standards for corporate climate action. The group has been in the process of revising its Net-Zero Standard, which will give companies more direction about what role carbon removal should play in their sustainability strategies.
The question is whether any of these policy developments will come soon enough or be significant enough to sustain this capital-intensive, immature industry long enough for it to prove its utility. Investment in the industry has been predicated on the idea that demand for carbon removal will grow, Hoglund told me. If growth continues at the pace we saw this year, it’s going to get a lot harder for startups to raise their series B or C.
“When you can't raise that, and you haven't sold enough to keep yourself afloat, then you go out of business,” he said. “I would expect quite a few companies to go out of business in 2026.”
Hoglund was quick to qualify his dire prediction, however, adding that these were normal growing pains for any industry and shouldn’t be viewed as a sign of failure. “It could be interpreted that way, and the vibe may shift, especially if you see a lot of the prolific companies come down,” he said. “But it’s natural. I think that’s something we should be prepared for and not panic about.”
America runs on natural gas.
That’s not an exaggeration. Almost half of home heating is done with natural gas, and around 40% — the plurality — of our electricity is generated with natural gas. Data center developers are pouring billions into natural gas power plants built on-site to feed their need for computational power. In its liquid form, chilled to minus 260 degrees Fahrenheit, the gas has attracted tens of billions of dollars in investments to export it abroad.
The energy and climate landscape in the United States going into 2026 — and for a long time afterward — will be largely determined by the forces pushing and pulling on natural gas. Those could lead to higher or more volatile prices for electricity and home heating, and even possibly to structural changes in the electricity market.
But first, the weather.
“Heating demand is still the main way gas is used in the U.S.,” longtime natural gas analyst Amber McCullagh explained to me. That makes cold weather — experienced and expected — the main driver of natural gas prices, even with new price pressures from electricity demand.
New sources of demand don’t help, however. While estimates for data center construction are highly speculative, East Daley Analytics figures cited by the trade publication Natural Gas Intel put new data center gas demand at a ballpark 2.5 billion cubic feet per day by the end of next year, compared to 0.8 billion cubic feet per day at the end of this year. By 2030, new demand from data centers could add over 6 billion cubic feet per day of natural gas demand, East Daley Analytics projects. That’s roughly equal to the entire gas output of the Eagle Ford Shale in southwest Texas.
Then there are exports. The U.S. Energy Information Administration expects outbound liquefied natural gas shipments to rise to 14.9 billion cubic feet per day this year, and to 16.3 billion cubic feet per day in 2026. In 2024, by contrast, exports were just under 12 billion cubic feet per day.
“Even as we’ve added demand for data centers, we’re getting close to 20 billion per day of LNG exports,” McCullagh said, putting more pressure on natural gas prices.
That’s had a predictable effect on domestic gas prices. The Henry Hub natural gas benchmark rose above $5 per million British thermal units earlier this month before falling back to $3.90, compared to under $3.50 at the end of last year. By contrast, LNG export prices, according to the most recent EIA data, are at around $7 per million BTUs.
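A quick back-of-the-envelope comparison using the approximate prices quoted above shows the arbitrage pulling gas toward export terminals. Liquefaction and shipping costs, which narrow the realized margin, are omitted here for simplicity.

```python
# Back-of-the-envelope spread between domestic and export gas prices,
# using the approximate figures quoted above ($/MMBtu). Liquefaction and
# shipping costs, which eat into the realized margin, are ignored.

henry_hub = 3.90    # recent domestic benchmark ($/MMBtu)
lng_export = 7.00   # approximate export price ($/MMBtu)

spread = lng_export - henry_hub
print(f"Gross spread: ${spread:.2f}/MMBtu ({spread / henry_hub:.0%} over Henry Hub)")

# At ~16 Bcf/d of exports (1 Bcf is roughly 1,000,000 MMBtu), each dollar
# of spread is worth roughly $16 million per day in gross margin.
exports_bcfd = 16
print(f"~${exports_bcfd * 1_000_000 * spread / 1e6:,.0f} million/day gross")
```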
This yawning gap between benchmark domestic prices and export prices is precisely why so many billions of dollars are being poured into LNG export capacity — and why some have long been wary of it, including Democratic politicians in the Northeast, which is chronically short of natural gas due to insufficient pipeline infrastructure. A group of progressive Democrats in Congress wrote a letter to Secretary of Energy Chris Wright earlier this year opposing additional licenses for LNG exports, arguing that “LNG exports lead to higher energy prices for both American families and businesses.”
Industry observers agree — or at least agree that LNG exports are likely to pull up domestic prices. “Henry Hub is clearly bullish right now until U.S. gas production catches up,” Ira Joseph, a senior research associate at the Center for Global Energy Policy at Columbia University, told me. “We’re definitely heading towards convergence” between domestic and global natural gas prices.
But while higher natural gas prices may seem like an obvious boon to renewables, the actual effect may be more ambiguous. The EIA expects the Henry Hub benchmark to average $4 per million BTUs for 2026. That’s nothing like the $9 the benchmark hit in August 2022, the result of post-COVID economic restart, supply tightness, and the Russian invasion of Ukraine.
Still, a tighter natural gas market could mean a more volatile electricity and energy sector in 2026. The United States is basically unique globally in having both large-scale domestic production of coal and natural gas that allows its electricity generation to switch between them. When natural gas prices go up, coal burning becomes more economically attractive.
Add to that, the EIA forecasts that electricity generation will have grown 2.4% by the end of 2025 and will grow another 1.7% in 2026, “in contrast to relatively flat generation from 2010 to 2020.” That growth is “primarily driven by increasing demand from large customers, including data centers,” the agency says.
This is the load growth story. With the help of the Trump administration, it’s turning into a coal growth story, too.
Already, several coal plants have pushed back their retirement dates, either to maintain reliability on local grids or because the Trump administration ordered them to. In America’s largest electricity market, PJM Interconnection, where about a fifth of the installed capacity is coal, diversified energy company Alliance Resource Partners expects 4% to 6% demand growth, meaning it might even be able to increase coal production. Coal consumption in PJM jumped 16% in the first nine months of 2025, the company’s chairman, Joseph Craft, told analysts.
“The domestic thermal coal market is continuing to experience strong fundamentals, supported by an unprecedented combination of federal energy and environmental policy support plus rapid demand growth,” Craft said in a statement accompanying the company’s third-quarter earnings report in October. He pointed specifically to “natural gas pricing dynamics” and “the dramatic load growth required by artificial intelligence.”
Observers are also taking notice. “The key driver for coal prices remains strong natural gas prices,” industry newsletter The Coal Trader wrote.
In its December Short-Term Energy Outlook, the EIA said that it expects “coal consumption to increase by 9% in 2025, driven by an 11% increase in coal consumption in the electric power sector this year as both natural gas costs and electricity demand increased,” with consumption falling slightly in 2026 but remaining above 2024 levels.
“2025 coal generation will have increased for the first time since the last time gas prices spiked,” McCullagh told me.
Assuming all this comes to pass, the U.S.’s total carbon dioxide emissions will have essentially flattened out at around 4.8 billion metric tons. The ultimate cost of higher natural gas prices will likely be felt far beyond the borders of the United States and far past 2026.
Lawmakers today should study the Energy Security Act of 1980.
The past few years have seen wild, rapid swings in energy policy in the United States, from President Biden’s enthusiastic embrace of clean energy to President Trump’s equally enthusiastic re-embrace of fossil fuels.
Where energy industrial policy goes next is less certain now than at any other moment in recent memory. Regardless of the direction, however, we will need creative and effective policy tools to secure our energy future — especially for those of us who wish to see a cleaner, greener energy system. To meet the moment, we can draw inspiration from a largely forgotten piece of energy industrial policy history: the Energy Security Act of 1980.
After a decade of oil shocks and energy crises spanning three presidencies, President Carter called for — and Congress passed — a new law that would “mobilize American determination and ability to win the energy war.” To meet that challenge, lawmakers declared their intent “to utilize to the fullest extent the constitutional powers of the Congress” to reduce the nation’s dependence on imported oil and shield the economy from future supply shocks. Forty-five years later, that brief moment of determined national mobilization may hold valuable lessons for the next stage of our energy industrial policy.
The 1970s were a decade of energy volatility for Americans, with spiking prices and gasoline shortages, as Middle Eastern fossil fuel-producing countries wielded the “oil weapon” to throttle supply. In his 1979 “Crisis of Confidence” address to the nation, Carter warned that America faced a “clear and present danger” from its reliance on foreign oil and urged domestic producers to mobilize new energy sources, akin to the way industry responded to World War II by building up a domestic synthetic rubber industry.
To develop energy alternatives, Congress passed the Energy Security Act, which created a new government-run corporation dedicated to investing in alternative fuels projects, along with a solar bank and programs to promote geothermal, biomass, and renewable energy sources. The law also authorized the president to create a system of five-year national energy targets and ordered one of the federal government’s first studies on the impacts of greenhouse gases from fossil fuels.
Carter saw the ESA as the beginning of an historic national mission. “[T]he Energy Security Act will launch this decade with the greatest outpouring of capital investment, technology, manpower, and resources since the space program,” he said at the signing. “Its scope, in fact, is so great that it will dwarf the combined efforts expended to put Americans on the Moon and to build the entire Interstate Highway System of our country.” The ESA was a recognition that, in a moment of crisis, the federal government could revive the tools it once used in wartime to meet an urgent civilian challenge.
In its pursuit of energy security, the Act deployed several remarkable industrial policy tools, with the Synthetic Fuels Corporation as the centerpiece. The corporation was a government-run investment bank chartered to finance — and in some cases, directly undertake — alternative fuels projects, including those derived from coal, shale, and oil. Regardless of the desirability or feasibility of synthetic fuels, the SFC as an institution illustrates the type of extraordinary authority Congress was once willing to deploy to address energy security and stand up an entirely new industry. It operated outside of federal agencies, unencumbered by the normal bureaucracy and restrictions that apply to government.
Along with everything else created by the ESA, the Synthetic Fuels Corporation was financed by a windfall profits tax assessed on oil companies, essentially redistributing income from big oil toward its nascent competition. Both the law and the corporation had huge bipartisan support, to the tune of 317 votes for the ESA in the House compared to 93 against, and 78 to 12 in the Senate.
The Synthetic Fuels Corporation was meant to be a public catalyst where private investment was unlikely to materialize on its own. Investors feared that oil prices could fall, or that OPEC might deliberately flood the market to undercut synthetic fuels before they ever reached scale. Synthetic fuel projects were also technically complex, capital-intensive undertakings, with each plant costing several billion dollars, requiring up to a decade to plan and build.
To address this, Congress equipped the corporation with an unusually broad set of tools. The corporation could offer loans, loan guarantees, price guarantees, purchase agreements, and even enter joint ventures — forms of support meant to make first-of-a-kind projects bankable. It could assemble financing packages that traditional lenders viewed as too risky. And while the corporation was being stood up, the president was temporarily authorized to use Defense Production Act powers to initiate early synthetic fuel projects. Taken together, these authorities amounted to a federal attempt to build an entirely new energy industry.
While the ESA gave the private sector the first shot at creating a synthetic fuels industry, it also created opportunities for the federal government to invest. The law authorized the Synthetic Fuels Corporation to undertake and retain ownership over synthetic fuels construction projects if private investment was insufficient to meet production targets. The SFC was also allowed to impose conditions on loans and financial assistance to private developers that gave it a share of project profits and intellectual property rights arising out of federally-funded projects. Congress was not willing to let the national imperative of energy security rise or fall on the whims of the market, nor to let the private sector reap publicly-funded windfalls.
Employing logic that will be familiar to many today, Carter was particularly concerned that alternative fuel sources would be unduly delayed by permitting rules and proposed an Energy Mobilization Board to streamline the review process for energy projects. Congress ultimately refused to create it, worried it would trample state authority and environmental protections. But the impulse survived elsewhere. At a time when the National Environmental Policy Act was barely 10 years old and had become the central mechanism for scrutinizing major federal actions, Congress provided an exemption for all projects financed by the Synthetic Fuels Corporation, although other technologies supported in the law — like geothermal energy — were still required to go through NEPA review. The contrast is revealing — a reminder that when lawmakers see an energy technology as strategically essential, they have been willing not only to fund it but also to redesign the permitting system around it.
Another forgotten feature of the corporation is how far Congress went to ensure it could actually hire top-tier talent. Lawmakers concluded that the federal government’s standard pay scales were too low and too rigid for the kind of financial, engineering, and project development expertise the Synthetic Fuels Corporation needed. So it gave the corporation unusual salary flexibility, allowing it to pay above normal civil service rates to attract people with the skills to evaluate multibillion-dollar industrial projects. In today’s debates about whether federal agencies have the capacity to manage complex clean energy investments, this detail is striking. Congress once knew that ambitious industrial policy requires not just money, but people who understand how deals get done.
But the Energy Security Act never had the chance to mature. The corporation was still getting off the ground when Carter lost the 1980 election to Ronald Reagan. Reagan’s advisers viewed the project as a distortion of free enterprise — precisely the kind of government intervention they believed had fueled the broader malaise of the 1970s. While Reagan had campaigned on abolishing the Department of Energy, the corporation proved an easier and more symbolic target. His administration hollowed it out, leaving it an empty shell until Congress defunded it entirely in 1986.
At the same time, the crisis atmosphere that had justified the Energy Security Act began to wane. Oil prices fell nearly 60% during Reagan’s first five years, and with them the political urgency behind alternative fuels. Drained of its economic rationale, the synthetic fuels industry collapsed before it ever had a chance to prove whether it could succeed under more favorable conditions. What had looked like a wartime mobilization suddenly appeared to many lawmakers to be an expensive overreaction to a crisis that had passed.
Yet the ESA’s legacy is more than an artifact of a bygone moment; it offers lessons that remain strikingly relevant today.
As we now scramble to make up for lost time, today’s clean energy push requires institutions that can survive electoral swings. Nearly half a century after the ESA, we must find our way back to that type of institutional imagination to meet the energy challenges we still face.