Money is pouring in — and deadlines are approaching fast.
There’s no quick fix for decarbonizing medium- and long-distance flights. Batteries are typically too heavy, and hydrogen fuel takes up too much space to offer a practical solution, leaving sustainable aviation fuels made from plants and other biomass, recycled carbon, or captured carbon as the primary options. Traditionally, this fuel is much more expensive — and the feedstocks for it much more scarce — than conventional petroleum-based jet fuel. But companies are now racing to overcome these barriers, and in recent months backers have thrown hundreds of millions of dollars behind a series of emerging but promising solutions.
Today, most SAF is made from feedstocks such as used cooking oil and animal fats, by companies such as Neste and Montana Renewables. But this supply is limited by, well, the amount of cooking oil and fats that restaurants and food processing facilities generate, and is thus projected to meet only about 10% of total SAF demand by 2050, according to a 2022 report by the Mission Possible Partnership. Beyond that, companies would have to start growing new crops just to turn into fuel.
That creates an opportunity for developers of second-generation SAF technologies, which involve making jet fuel out of captured carbon or alternative biomass sources, such as forest waste. These methods are not yet mature enough to make a significant dent in 2030 targets, such as the EU's mandate to use 6% SAF and the U.S. government’s goal of producing 3 billion gallons of SAF per year domestically. But this tech will need to be a big part of the equation in order to meet the aviation sector’s overall goal of net zero emissions by 2050, as well as the EU’s sustainable fuels mandate, which increases to 20% by 2035 and 70% by 2050 for all flights originating in the bloc.
“That’s going to be a massive jump because currently, SAF uptake is about 0.2% of fuel,” Nicole Cerulli, a research associate for transportation and logistics at the market research firm Cleantech Group, told me. The head of the airline industry’s trade association, Willie Walsh, said in December at a media day event, "We’re not making as much progress as we’d hoped for, and we’re certainly not making as much progress as we need.” While global SAF production doubled to 1 million metric tons in 2024, that fell far below the trade group’s projection of 1.5 million metric tons, made at the end of 2023.
Producing SAF requires making hydrocarbons that mirror those used in traditional jet fuel. We know how to do that, but the processes required — electrolysis, gasification, and the series of chemical reactions known as Fischer-Tropsch synthesis — are energy intensive. So finding a way to power all of this sustainably while simultaneously scaling to meet demand is a challenging and expensive task.
Aamir Shams, a senior associate at the energy think tank RMI whose work focuses on driving demand for SAF, told me that while sustainable fuel is undeniably more expensive than traditional fuel, airlines and corporations have so far been willing to pay the premium. “We feel that the lag is happening because we just don’t have the fuel today,” Shams said. “Whatever fuel shows up, it just flies off the shelves.”
Twelve, a Washington-based SAF producer, thinks its e-fuels can help make a dent. The company is looking to produce jet fuel initially by recycling the CO2 emitted from the ethanol, pulp, and paper industries. In September, the company raised $645 million to complete the buildout of its inaugural SAF facility in Washington state, support the development of future plants, and pursue further R&D. The funding includes $400 million in project equity from the impact fund TPG Rise Climate, $200 million in Series C financing led by TPG, Capricorn Investment Group, and Pulse Fund, and $45 million in loans. The company has also previously partnered with the Air Force to explore producing fuel on demand in hard-to-reach areas.
Nicholas Flanders, Twelve’s CEO, told me that the company is starting with ethanol, pulp, and paper because the CO2 emissions from these facilities are relatively concentrated and thus cheaper to capture. And unlike, say, coal power plants, these industries aren’t going anywhere fast, making them a steady source of carbon. To turn the captured CO2 into sustainable fuel, the company needs just one more input — water. Renewable-powered electrolyzers then break apart the CO2 and H2O into their constituent parts, and the resulting carbon monoxide and hydrogen are combined to create a syngas. That then gets put through a chemical reaction known as “Fischer-Tropsch synthesis,” where the syngas reacts with catalysts to form hydrocarbons, which are then processed into sustainable jet fuel and ultimately blended with conventional fuel.
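For readers who want to see the chemistry, the route from CO2 and water to jet fuel can be written in idealized, textbook form as follows. This is a simplified sketch of the general electrolysis-plus-Fischer-Tropsch pathway, not a description of Twelve’s proprietary electrolyzer chemistry.

```latex
% Idealized sketch of the CO2-to-fuel route (not Twelve's proprietary chemistry).
% Step 1: electrolysis splits CO2 and water into carbon monoxide, hydrogen, and oxygen.
\mathrm{CO_2 \;\longrightarrow\; CO + \tfrac{1}{2}\,O_2}
\qquad
\mathrm{H_2O \;\longrightarrow\; H_2 + \tfrac{1}{2}\,O_2}

% Step 2: Fischer-Tropsch synthesis chains the resulting syngas (CO + H2)
% into long hydrocarbons of the kind found in jet fuel, releasing water.
\mathrm{n\,CO + (2n+1)\,H_2 \;\longrightarrow\; C_nH_{2n+2} + n\,H_2O}
```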
Twelve says its proprietary CO2 electrolyzer can break apart CO2 at much lower temperatures than would typically be required for this molecule, which simplifies the whole process, making it easier to ramp the electrolyzers up and down to match the output of intermittent renewables. (How does it do this? The company didn’t respond when I asked.) Twelve’s first plant, which sources carbon from a nearby ethanol facility, is set to come online next year, producing 50,000 gallons of SAF annually once it’s fully scaled, with electrolyzers that will run on hydropower.
While Europe may have stricter, actually enforceable SAF requirements than the U.S., Flanders told me there’s a lot of promise in domestic production. “I think the U.S. has an exciting combination of relatively low-cost green electricity, lots of biogenic CO2 sources, a lot of demand for the product we’re making, and then the Inflation Reduction Act and state-level incentives can further enhance the economics.” Currently, the IRA provides SAF producers with a baseline $1.25 tax credit per gallon produced, which gradually increases the greener the fuel gets. Of course, whether the next Congress will rescind this is anybody’s guess.
Down the line, incentives and mandates will matter a great deal, because making SAF simply costs far more than producing jet fuel the standard way, by refining crude oil. But in the meantime, Twelve is setting up cost-sharing partnerships between airlines that want to reduce their direct emissions (scope 1) and large corporations that want to reduce their indirect emissions (scope 3), which include employee business travel.
For example, Twelve has offtake agreements with Seattle-based Alaska Airlines and Microsoft for the fuel produced at its initial Washington plant. Microsoft, which aims to reduce emissions from its employees’ flights, will essentially cover the cost premium associated with Twelve’s more expensive SAF, making it cost-effective for Alaska to use in its fleet. Twelve has a similar agreement with Boston Consulting Group and an unnamed airline.
Eventually, Flanders told me, the company expects to source carbon via direct air capture, but doing so today would be prohibitively expensive. “If there were a customer who wanted to pay the additional amount to use DAC today, we'd be very happy to do that,” Flanders said. “But our perspective is it will maybe be another decade before that cost starts to converge.”
No sustainable fuel is even close to cost parity yet — Cerulli told me that it generally comes with a “roughly 250% to over 800%” cost premium over conventional jet fuel. So while voluntary uptake by companies such as Microsoft and BCG is helping drive the nascent market today, that won’t be nearly enough to decarbonize the industry. “At the simplest level, the cost of not using SAF has to be higher than using it,” Cerulli told me.
Pathway Energy thinks that by incorporating carbon sequestration into its process, it can help the world get there. The sustainable fuels company, which emerged from stealth just last month, is pursuing what CEO Steve Roberts told me is “probably the most cost-efficient long-term pathway from a decarbonization perspective.” The company is building a $2 billion SAF plant in Port Arthur, Texas, designed to produce about 30 million gallons of jet fuel annually — enough to power about 5,000 carbon-neutral 10-hour flights — while also permanently sequestering more than 1.9 million tons of CO2.
Pathway, a subsidiary of the investment and advisory firm Nexus Holdings, has partnered with the UK-based renewable energy company Drax, which will supply the company with 1 million metric tons of wood pellets, to be turned into fuel using a series of well-established technologies. The first step is to gasify the biomass by heating the pellets to high temperatures in the absence of oxygen to produce a syngas. Then, just as Twelve does, it puts the syngas through the Fischer-Tropsch process to form the hydrocarbons that become SAF.
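The gasification step can likewise be written in simplified form. Real gasifiers yield a messier mix that also contains CO2 and methane, so treat the reaction below as an idealization of what happens to the carbon in the pellets, not a description of Pathway’s specific plant design.

```latex
% Idealized steam gasification of the carbon in biomass into syngas:
\mathrm{C + H_2O \;\xrightarrow{\ \text{heat}\ }\; CO + H_2}
% The CO and H2 then feed the same Fischer-Tropsch step sketched above.
```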
The competitive advantage here is capturing the emissions from the fuel production process itself and storing them permanently underground. Because Pathway is burying CO2 that the trees behind its wood pellets already pulled from the atmosphere, its SAF would be carbon-negative, in theory, while the best Twelve and similar companies can hope for is carbon neutrality, assuming all of their captured carbon goes into making fuel.
The choice of Drax as a feedstock partner is not without controversy, however, as the BBC revealed that the company sources much of its wood from rare old-growth forests. Though this is technically legal, it’s also ecologically disruptive. Roberts told me Drax’s sourcing methodologies have been verified by third parties, and Pathway isn’t concerned. “I don't think any of that controversy has yielded any actually significant changes to their sourcing program at all, because we believe that they're compliant,” Roberts told me. “We are 100% certain that they’re meeting all the standards and expectations.”
Pathway has big growth plans, which depend on the legitimacy of its sustainability cred. Beyond the Port Arthur facility, which Roberts told me will begin production by the end of 2029 or early 2030, the company has a pipeline of additional facilities along the Gulf Coast in the works. It also has global ambitions. “When you have a fuel that is this negative, it really opens up a global market, because you can transport fuel out of Texas, whether that be into the EU, Africa, Asia, wherever it may be,” Roberts said, explaining that even substantial transportation-related emissions would be offset by the carbon-negativity of the fuel.
But alternative feedstocks such as forestry biomass are finite resources, too. That’s why many experts think that within the SAF sector, e-fuels such as Twelve’s that could one day source carbon via direct air capture and then electrolyze it have the greatest potential for growth. “It’s extremely dependent on getting sustainable CO2 and cheap electricity prices so that you can make cheap green hydrogen,” Shams told me. “But theoretically, it is unlimited in terms of what your total cap on production would be.”
In the meantime, airlines are focused on making their planes more aerodynamic and their engines more efficient so that they don’t consume as much fuel in the first place. They’re also exploring other technical pathways to decarbonization — because after all, SAF will only be a portion of the solution, as many short- and medium-length flights could likely be powered by batteries or hydrogen fuel. RMI forecasts that by 2050, 45% of global emissions reduction in the aviation sector will come from improvements in fuel efficiency, 37% will come from SAF deployment, 7% from hydrogen, and 3.5% from electrification.
If you do the mental math, you’ll notice these numbers add up to 92.5% — not 100%. “What we have done is, let's look at what we are actually doing today and for the past three, four, five years, and let's see if we get to net zero or not. And the answer is, no. We don't get to net zero by 2050,” Shams told me. And while getting to 92.5% is nothing to scoff at, it means the aviation sector would still be emitting about 700 million metric tons of CO2 equivalent by that time.
So what’s to be done? “The financing sector needs to step up its game and take a little bit more of a risk than they are used to,” Shams told me, noting that one of RMI’s partners, the Mission Possible Partnership, estimates that getting the aviation sector to net zero will require an investment of around $170 billion per year, a total of about $4.5 trillion by 2050. These numbers take a variety of factors into account beyond strictly SAF production, such as airport infrastructure for new fuels and the buildout of direct air capture plants.
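As a rough sanity check on those figures (my arithmetic, not the Mission Possible Partnership’s), sustaining that annual spend from now through mid-century gets you to roughly the stated total:

```latex
% $170 billion per year, sustained from roughly 2024 through 2050:
\$170\ \text{billion/yr} \times \sim 26\ \text{yr} \;\approx\; \$4.4\ \text{trillion}
```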
But any way you cut it, it’s a boatload of money that certainly puts Pathway’s $2 billion SAF facility and Twelve’s $645 million funding round in perspective. And it’s far from certain that we can get there. “Increasingly, that goal of the 2050 net-zero target looks really difficult to achieve,” Shams put it simply. “Commitments are always going up, but more can be done.”
Why regional transmission organizations as we know them might not survive the data center boom.
As the United States faces its first significant increase in electricity demand in decades, the grid itself is not only aging, but also straining against the financial, logistical, and legal barriers to adding new supply. It’s enough to make you wonder: What’s the point of an electricity market, anyway?
That’s the question some stakeholders in the PJM Interconnection, America’s largest electricity market, started asking loudly and in public in response to the grid operator’s proposal that new large energy users could become “non-capacity backed load,” i.e., be forced to turn off whenever PJM deems it necessary.
PJM, which covers 13 states from the Mid-Atlantic to the Midwest, has been America’s poster child for the struggle to get new generation online as data center development surges. PJM has warned that it will have “just enough generation to meet its reliability requirement” in 2026 and 2027, and its independent market monitor has said that the costs associated with serving that new and forecast demand have already reached the billions, translating to higher retail electricity rates in several PJM states.
As Heatmap has covered, however, basically no one in the PJM system — transmission owners, power producers, and data center developers — was happy with the details of PJM’s plan to deal with the situation. In public comments on the proposed rule, many brought up a central conflict between utilities’ historic duty to serve and the realities of the modern power market. More specifically, electricity markets like PJM are supposed to deal with wholesale electricity sales, not the kind of core questions of who gets served and when, which are left to the states.
On the power producer side, major East Coast supplier Talen Energy wrote, “The NCBL proposal exceeds PJM’s authority by establishing a regime where PJM holds the power to withhold electric service unlawfully from certain categories of large load.” The utility Exelon added that owners of transmission “have a responsibility to serve all customers—large, small, and in between. We are obligated to provide both retail and wholesale electric service safely and reliably.” And last but far from least, Microsoft, which has made itself into a leader in artificial intelligence, argued, “A PJM rule curtailing non-capacity-backed load would not only unlawfully intrude on state authority, but it would also fundamentally undercut the very purpose of PJM’s capacity market.”
This is just one small piece of a debate that’s been heating up for years, however, as more market participants, activists, and scholars question whether the markets that govern much of the U.S. electric grid are delivering power as cheaply and abundantly as promised. Some have even suggested letting PJM utilities build their own power plants again, effectively reversing the market structure of the past few decades.
But questioning whether all load must be served would be an even bigger change.
The “obligation to serve all load has been a core tenet of electricity policy,” Rob Gramlich, the president of Grid Strategies LLC, told me. “I don’t recall ever seeing that be questioned or challenged in any fundamental way” — an illustration of how dire things have become.
The U.S. electricity system was designed for abundance. Utilities would serve any user, and the per-user costs of developing the fixed infrastructure necessary to serve them would drop as more users signed up.
But the planned rush of data center investments threatens to stick all ratepayers with the cost of new transmission and generation driven overwhelmingly by one class of customer. There is already a brewing local backlash to new data centers, and electricity prices have been rising faster than inflation. New data center load could also have climate consequences if utilities decide to keep aging coal plants online and build new natural gas-fired power plants over and above their pre-data center boom (and pre-Trump) plans.
“AI has dramatically raised the stakes, along with enhancing worries that heightened demand will mean more burning of fossil fuels,” law professors Alexandra Klass of the University of Michigan and Dave Owen of the University of California write in a preprint paper to be published next year.
In an interview, Klass told me, “There are huge economic and climate implications if we build a whole lot of gas and keep coal on, and then demand is lower because the chips are better,” referring to the possibility that data centers and large language models could become dramatically more energy efficient, rendering the additional fossil fuel-powered supply unnecessary. Even if the projects are not fully built out or utilized, the country could face a situation where “ratepayers have already paid for [grid infrastructure], whether it’s through those wholesale markets or through their utilities in traditionally regulated states,” she said.
The core tension between AI development and the power grid, Klass and Owen argue, is the “duty to serve,” or “universal service” principle that has underlain modern electricity markets for over a century.
“The duty to serve — to meet need at pretty much all times — worked for utilities because they got to pass through their costs, and it largely worked for consumers because they didn’t have to deal very often with unpredictable blackouts,” Owen told me.
“Once you knew how to build transmission lines and build power plants,” Klass added, “there was no sense that you couldn’t continue to build to serve all customers. We could build power plants, and the regulatory regime came up in a context where we could always build enough to meet demand.”
How and why that came to be goes back to the earliest days of electrification.
As the power industry developed in the late 19th and early 20th centuries, the regulated utility model emerged, under which monopoly utilities built both power plants and the transmission and distribution infrastructure necessary to deliver that power to customers. So that they could achieve the economies of scale required to serve those customers efficiently and affordably, regulators allowed them to establish monopolies over certain service territories, with the requirement that they serve anyone and everyone within them.
With a secure base of ratepayers, utilities could raise money from investors to build infrastructure, which could then be put into a “rate base” and recouped from ratepayers over time at a fixed return. In exchange, the utilities accepted regulation from state governments over their pricing and future development trajectories.
That vertically integrated system began to crack, however, as ratepayers revolted over the high costs of utilities’ capital investments, especially in nuclear power plants. Following the deregulation of industries such as trucking and air travel, federal regulators began trying to separate the generation business from the transmission and distribution side of the industry. In 1999, after some states and regions had already begun to restructure their electricity markets, the Federal Energy Regulatory Commission encouraged the creation of regional transmission organizations like PJM.
Today, the electricity markets in some 35 states are partially or entirely restructured, with Texas operating its own isolated market beyond the reach of federal regulation. In PJM and other RTOs, electricity is (more or less) sold competitively on a wholesale basis by independent power producers to utilities, which then serve customers.
But the system as it’s constructed now may, critics argue, expose retail customers to unacceptable cost increases — and greenhouse gas emissions — as it attempts to grapple with serving new data center load.
Klass and Owen, for their part, point to other markets, such as those governing natural gas or even Western water rights, as models for how electricity could work without the assumptions of plentiful supply that electricity markets have historically made.
Interruptions of natural gas service became more common starting in the 1970s, when some natural gas services were underpriced thanks to price caps, leading to an imbalance between supply and demand. In response, regulators “established a national policy of curtailment based on end use,” Klass and Owen write, with residential users getting priority “because of their essential heating needs, followed by firm industrial and commercial customers, and finally, interruptible customers.” Natural gas was deregulated in the late 1970s and 1980s, with curtailment becoming more market-based, which also allowed natural gas customers to trade capacity with each other.
Western water rights, meanwhile, are notoriously opaque and contested — but, importantly, they are based on scarcity, and thus may provide lessons in an era of limited electricity supply. The “prior appropriation” system water markets use is, “at its core, a set of mechanisms for allocating shortage,” the authors write. Water users have “senior” and “junior” rights, with senior users “entitled to have their rights fulfilled before the holders of newer, or more ’junior,’ water rights.” These rights can be transferred, and junior users have found ways to work with what water they can get, with the authors citing extensive conservation efforts in Southern California compared to the San Francisco Bay area, which tends to have more senior rights.
With these models in mind, Klass and Owen propose a system called “demand side connect-and-manage,” whereby new loads would not necessarily get transmission and generation service at all times, and where utilities could curtail users and electricity customers would have the ability “to use trading to hedge against the risk of curtailments.”
“We can connect you now before we build a whole lot of new generation, but when we need to, we’re going to curtail you,” Klass said, describing her and Owen’s proposal.
Tyler Norris, a Duke University researcher who has published concept-defining work on data center flexibility, called the paper “one of the most important contributions yet toward the re-examination of basic assumptions of U.S. electricity law that’s urgently needed as hyperscale load growth pushes our existing regulatory system beyond its limits.”
While electricity may not be literally drying up, he told me, “when you are supply side constrained while demand is growing, you have this challenge of, how do you allocate scarcity?”
Unlike the PJM proposals, “Our paper was very focused on state law,” Klass told me. “And that was intentional, because I think this is trickier at the federal level.”
Some states are already embracing similar ideas. Ohio regulators, for instance, established a data center tariff that tries to protect customers from higher costs by forcing data centers to make minimum payments regardless of their actual electricity use. Texas also passed a law that would allow for some curtailment of large loads and reforms of the interconnection process to avoid filling up the interconnection queue with speculative projects that could result in infrastructure costs but not real electricity demand.
Klass and Owen write that their idea may be more of “a temporary bridging strategy, primarily for periods when peak demand outstrips supply or at least threatens to do so.”
Even those who don’t think the principles underlying electricity markets need to be rethought see the need — at least in the short term — for new options for large new power users who may not get all the power they want all of the time.
“Some non-firm options are necessary in the short term,” Gramlich told me, referring to ideas like Klass and Owen’s, Norris’s, and PJM’s. “Some of them are going to have some legal infirmities and jurisdictional problems. But I think no matter what, we’re going to see some non-firm options. A lot of customers, a lot of these large loads, are very interested, even if it’s a temporary way to get connected while they try to get the firm service later.”
If electricity markets have worked for over one hundred years on the principle that more customers could bring down costs for everyone, going forward, we may have to get more choosy — or pay the price.
A judge has lifted the administration’s stop-work order against Revolution Wind.
A federal court has lifted the Trump administration’s order to halt construction on the Revolution Wind farm off the coast of New England. The decision marks the renewables industry’s first major legal victory against a federal war on offshore wind.
The Interior Department ordered Orsted — the Danish company developing Revolution Wind — to halt construction on August 22, asserting in a one-page letter that it was “seeking to address concerns related to the protection of national security interests of the United States and prevention of interference with reasonable uses of the exclusive economic zone, the high seas, and the territorial seas.”
In a two-page ruling issued Monday, U.S. District Judge Royce Lamberth found that Orsted is likely to prevail in its legal challenge against the stop-work order, and that the company is “likely to suffer irreparable harm in the absence of an injunction,” which led him to lift the Trump administration’s directive.
Orsted previously claimed in legal filings that delays from the stop-work order could put the entire project in jeopardy by pushing its timeline beyond the terms of existing power purchase agreements, and that the company installing cable for the project had only a few months left to work on Revolution Wind before it had to move on to other client obligations through mid-2028. The company has also argued that the Trump administration is deliberately mischaracterizing discussions between the federal government and the company that took place before the project was fully approved.
It’s still unclear whether the Trump administration will appeal the decision. We’re also still waiting on the outcome of a separate legal challenge brought by Democratic-led states against Trump’s anti-wind Day One executive order.
Harmonizing data across federal agencies will go a long, long way toward simplifying environmental reviews.
Comprehensive permitting reform remains elusive.
In spite of numerous promising attempts — the Fiscal Responsibility Act of 2023, for instance, which delivered only limited improvements, and the failed Manchin-Barrasso bill of last year — the U.S. has repeatedly failed to overhaul its clogged federal infrastructure approval process. Even now there are draft bills and agreements in principle, but the Trump administration’s animus towards renewable energy has undermined Democratic faith in any deal. Less obvious but no less important, key Republicans are quietly disengaged, hesitant to embrace the federal transmission reform that negotiators see as essential to the package.
Despite this grim prognosis, Congress could still improve implementation of a key permitting barrier, the National Environmental Policy Act, by fixing the federal government’s broken systems for managing and sharing NEPA documentation and data. These opaque and incompatible systems frustrate essential interagency coordination, contributing immeasurably to NEPA’s delays and frustrations. But it’s a problem with clear, available, workable solutions — and at low political cost.
Both of us saw these problems firsthand. Marc helped manage NEPA implementation at the Environmental Protection Agency, observing the federal government’s slow and often flailing attempts to use technology to improve internal agency processes. Elizabeth, meanwhile, spent two years overcoming NEPA’s atomized data ecosystem to create a comprehensive picture of NEPA litigation.
Even so, it’s difficult to illustrate the scope of the problem without experiencing it. Some agencies have bespoke systems to house crucial and unique geographic information on project areas. Other agencies lack ready access to that information, even as they examine project impacts another agency may have already studied. Similarly, there is no central database of scientific studies undertaken in support of environmental reviews. Some agencies maintain repositories for their environmental assessments — arduous but less intense environmental reviews than the environmental impact statements NEPA requires when a federal agency action substantially impacts the environment. But there’s still no unified, cross-agency EA database. This leaves agencies unable to efficiently find and leverage work that could inform their own reviews. Indeed, agencies may be duplicating or re-duplicating tedious, time-consuming efforts.
NEPA implementation also relies on interagency cooperation. There, too, agencies’ divergent ways of classifying and communicating about project data throw up impediments. Agencies rely on arcane data formats and often incompatible platforms. (For the tech-savvy, one agency might have a PDF-only repository while another uses XML-based data formats.) With few exceptions, it’s difficult for cooperating agencies to even know the status of a given review. All of this also produces a comedy of errors for agencies trying to recruit and develop younger, tech-savvy staff. Your workplace might use something like Asana or Trello to guide your workflow, a common language all teams use to communicate. The federal government has a bureaucratic Tower of Babel.
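To make the interoperability gap concrete, here is a purely hypothetical sketch, in Python, of what a minimal shared record for a single review might look like if agencies converged on one machine-readable schema. The field names are illustrative only; they are not drawn from CEQ’s actual data and technology standard.

```python
from dataclasses import dataclass, field, asdict
from datetime import date
import json

# Hypothetical, illustrative schema -- not CEQ's actual NEPA data standard.
@dataclass
class NepaReviewRecord:
    project_id: str                        # identifier shared across agencies
    lead_agency: str                       # e.g., "USFS"
    cooperating_agencies: list[str] = field(default_factory=list)
    review_level: str = "EA"               # "CE", "EA", or "EIS"
    status: str = "in_progress"            # visible to every cooperating agency
    date_initiated: date | None = None
    geographic_area_id: str | None = None  # pointer to shared GIS data, not a one-off PDF map
    supporting_studies: list[str] = field(default_factory=list)  # reusable study references

# If every agency exported records like this, a cooperating agency could check a
# review's status or reuse prior studies without requesting PDFs over email.
record = NepaReviewRecord(
    project_id="EXAMPLE-0001",
    lead_agency="USFS",
    cooperating_agencies=["EPA", "FWS"],
    review_level="EIS",
    date_initiated=date(2025, 1, 15),
)
print(json.dumps(asdict(record), default=str, indent=2))
```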
Yet another problem, symptomatic of inadequate transparency, is that we have only limited data on the thousands of NEPA court cases. To close the gap, we sought to understand — using data — just how sprawling and unwieldy post-review NEPA litigation had become. We read every available district and appellate opinion that mentioned NEPA from 2013 to 2022 (over 2,000 cases), screened out those without substantive NEPA claims, and catalogued their key characteristics — plaintiffs, court timelines and outcomes, agencies, project types, and so on. Before we did this work, no national NEPA litigation database provided policymakers with actionable, data-driven insights into court outcomes for America’s most-litigated environmental statute. But even our painstaking efforts couldn’t unearth a full dataset that included, for example, decisions taken by administrative judges within agencies.
We can’t manage what we can’t measure. And every study in this space, including ours, struggles with this type of sample bias. Litigated opinions are neither random nor representative; they skew toward high-stakes disputes with uncertain outcomes and underrepresent cases that settle on clear agency error or are dismissed early for weak claims. Our database illuminates litigation patterns and timelines. But like the rest of the literature, it cannot offer firm conclusions about NEPA’s effectiveness. We need a more reliable universe of all NEPA reviews to have any chance — even a flawed one — at assessing the law’s outcomes.
In the meantime, NEPA policy debates often revolve unproductively around assumptions and anecdotes. For example, Democrats can point to instances when early and robust public engagement appeared essential for bringing projects to completion. But in the absence of hard data to support this view, GOP reformers often prefer to limit public participation in the name of speeding the review process. The rebuttal to that approach is persuasive: Failing to engage potential project opponents on their legitimate concerns merely drives them to interfere with the project outside the NEPA process. Yet this rebuttal relies on assumptions, not evidence. Only transparent data can resolve the dispute.
Some of the necessary repair work is already underway at the Council on Environmental Quality, the White House entity that coordinates and guides agencies’ NEPA implementation. In May, CEQ published a “NEPA and Permitting Data and Technology Standard” so that agencies could voluntarily align on how to communicate NEPA information with each other. Then in June, after years of using a lumbering Excel file containing agencies’ categorical exclusions — the types of projects that don’t need NEPA review, as determined by law or regulation — CEQ unveiled a searchable database called the Categorical Exclusion Explorer. The Pacific Northwest National Laboratory’s PermitAI has leveraged the EPA’s repository of environmental impact statements and, more recently, environmental review documents from other agencies to create an AI-powered queryable database. The FAST-41 Dashboard has brought transparency and accountability to a limited number of EISs.
But across all these efforts, huge gaps in data, resources, and enforcement authority remain. President Trump has issued directives to agencies to speed environmental reviews, evincing an interest in filling the gaps. But those directives don’t and can’t compel the full scope of necessary technological changes.
Some members of Congress are tuned in and trying to do something about this. Representatives Scott Peters, a Democrat from California, and Dusty Johnson, Republican of South Dakota, deserve credit for introducing the bipartisan ePermit Act to address all of these challenges. They’ve identified key levers to improve interagency communication, track litigation, and create a common and publicly accessible storehouse of NEPA data. Crucially, they recognize the make-or-break role of agency Chief Information Officers who are accountable for information security. Our own attempts to upgrade agency technology taught us that the best way to do so is by working with — not around — CIOs who have a statutory mandate.
The ePermit Act would also lay the groundwork for more extensive and innovative deployment of artificial intelligence in NEPA processes. Despite AI’s continuing challenges around information accuracy and traceability, large language models may eventually be able to draft the majority of an EIS on their own, with humans overseeing the process.
AI can also address hidden pain points in the NEPA process. It can hasten the laborious work of summarizing and incorporating public comment, reducing the legal and practical risk that agencies miss crucial feedback. It can help determine whether sponsor applications are complete, frequently a point of friction between sponsors and agencies. It can assess whether projects could be adapted to fit a categorical exclusion, removing unnecessary reviews entirely. And finally, AI tools are a concession to the rapid turnover of NEPA personnel and depleted institutional knowledge — an acute problem of late.
Comprehensive, multi-agency legislation like the ePermit Act will take time to implement — Congress may want or even need to reform NEPA before we get the full benefit of technology improvements. But that does not diminish the urgency or value of this effort. Even Representative Jared Huffman of California, a key Democrat on the House Natural Resources Committee with impeccable environmental credentials, offered words of support for the ePermit Act, while opposing other NEPA reforms.
Regardless of what NEPA looks like in coming years, this work must begin at some point. Under every flavor of NEPA reform, agencies will need to share data, coordinate across platforms, and process information. That remains true even as court-driven legal reforms and Trump administration regulatory changes wreak havoc with NEPA’s substance and implementation. Indeed, whether or not courts, Congress, or the administration reduce NEPA’s reach, even truncated reviews would still be handicapped by broken systems. Fixing the technology infrastructure now is a way to future-proof NEPA.
The solution won’t be as simple as getting agencies to use Microsoft products. It’s long past time to give agencies the tools they need — an interoperable, government-wide platform for NEPA data and project management, supported by large language models. This is no simple task. To reap the full benefits of these solutions will require an act of Congress that both provides funding for multi-agency software and requires all agencies to act in concert. This mandate is necessary to induce movement from actors within agencies who are slow to respond to non-binding CEQ directives that take time away from statutorily required work, or those who resist discretionary changes to agency software as cybersecurity risks, no matter how benign those changes may be. Without appropriated money or congressional edict, the government’s efforts in this area will lack the resources and enforcement levers to ensure reforms take hold.
Technology improvements won’t cure everything that ails NEPA. This bill won’t fix the deep uncertainty unleashed by the legal chaos of the last year. But addressing these issues is a no-regrets move with bipartisan and potentially even White House support. Let it be done.