Harmonizing data across federal agencies will go a long, long way toward simplifying environmental reviews.

Comprehensive permitting reform remains elusive.
In spite of numerous promising attempts — the Fiscal Responsibility Act of 2023, for instance, which delivered only limited improvements, and the failed Manchin-Barrasso bill of last year — the U.S. has repeatedly failed to overhaul its clogged federal infrastructure approval process. Even now there are draft bills and agreements in principle, but the Trump administration’s animus towards renewable energy has undermined Democratic faith in any deal. Less obvious but no less important, key Republicans are quietly disengaged, hesitant to embrace the federal transmission reform that negotiators see as essential to the package.
Despite this grim prognosis, Congress could still improve the implementation of a key permitting law, the National Environmental Policy Act, by fixing the federal government’s broken systems for managing and sharing NEPA documentation and data. These opaque and incompatible systems frustrate essential interagency coordination, contributing in ways no one can fully measure to NEPA’s delays and frustrations. But it’s a problem with clear, workable, readily available solutions — and at low political cost.
Both of us saw these problems firsthand. Marc helped manage NEPA implementation at the Environmental Protection Agency, observing the federal government’s slow and often flailing attempts to use technology to improve internal agency processes. Elizabeth, meanwhile, spent two years overcoming NEPA’s atomized data ecosystem to create a comprehensive picture of NEPA litigation.
Even so, it’s difficult to convey the scope of the problem without experiencing it. Some agencies have bespoke systems that house crucial, unique geographic information on project areas. Other agencies lack ready access to that information, even as they examine project impacts another agency may have already studied. Similarly, there is no central database of the scientific studies undertaken in support of environmental reviews. Some agencies maintain repositories for their environmental assessments, or EAs — reviews that are arduous, though less intensive than the environmental impact statements (EISs) NEPA requires when a federal action significantly affects the environment. But there is still no unified, cross-agency EA database. This leaves agencies unable to efficiently find and leverage work that could inform their own reviews. Indeed, agencies may be duplicating or re-duplicating tedious, time-consuming efforts.
NEPA implementation also relies on interagency cooperation. There, too, agencies’ divergent ways of classifying and communicating project data throw up impediments. Agencies rely on arcane data formats and often incompatible platforms. (For the tech-savvy: one agency might keep a PDF-only repository while another relies on XML-based data formats.) With few exceptions, it’s difficult for cooperating agencies even to know the status of a given review. The patchwork also produces a comedy of errors for agencies trying to recruit and develop younger, tech-savvy staff. Your workplace might use something like Asana or Trello to guide your workflow, a common language all teams use to communicate. The federal government has a bureaucratic Tower of Babel.
Yet another problem, symptomatic of inadequate transparency, is that we have only limited data on the thousands of NEPA court cases. To close the gap, we sought to understand — using data — just how sprawling and unwieldy post-review NEPA litigation had become. We read every available district and appellate opinion that mentioned NEPA from 2013 to 2022 (over 2,000 cases), screened out those without substantive NEPA claims, and catalogued their key characteristics — plaintiffs, court timelines and outcomes, agencies, project types, and so on. Before we did this work, no national NEPA litigation database provided policymakers with actionable, data-driven insights into court outcomes for America’s most-litigated environmental statute. But even our painstaking efforts couldn’t unearth a full dataset that included, for example, decisions taken by administrative judges within agencies.
We can’t manage what we can’t measure. And every study in this space, including ours, struggles with this type of sample bias. Litigated opinions are neither random nor representative; they skew toward high-stakes disputes with uncertain outcomes and underrepresent cases that settle on clear agency error or are dismissed early for weak claims. Our database illuminates litigation patterns and timelines. But like the rest of the literature, it cannot offer firm conclusions about NEPA’s effectiveness. We need a more reliable universe of all NEPA reviews to have any chance — even a flawed one — at assessing the law’s outcomes.
In the meantime, NEPA policy debates often revolve unproductively around assumptions and anecdotes. For example, Democrats can point to instances when early and robust public engagement appeared essential for bringing projects to completion. But in the absence of hard data to support this view, GOP reformers often prefer to limit public participation in the name of speeding the review process. The rebuttal to that approach is persuasive: Failing to engage potential project opponents on their legitimate concerns merely drives them to interfere with the project outside the NEPA process. Yet this rebuttal relies on assumptions, not evidence. Only transparent data can resolve the dispute.
Some of the necessary repair work is already underway at the Council on Environmental Quality, the White House entity that coordinates and guides agencies’ NEPA implementation. In May, CEQ published a “NEPA and Permitting Data and Technology Standard” so that agencies could voluntarily align on how to communicate NEPA information with one another. Then in June, after years of relying on a lumbering Excel file of agencies’ categorical exclusions — the types of projects that don’t need NEPA review, as determined by law or regulation — CEQ unveiled a searchable database called the Categorical Exclusion Explorer. The Pacific Northwest National Laboratory’s PermitAI has leveraged the EPA’s repository of environmental impact statements and, more recently, environmental review documents from other agencies to create an AI-powered, queryable database. And the FAST-41 Dashboard has brought transparency and accountability to a limited number of EISs.
But across all these efforts, huge gaps in data, resources, and enforcement authority remain. President Trump has issued directives to agencies to speed environmental reviews, evincing an interest in filling the gaps. But those directives don’t and can’t compel the full scope of necessary technological changes.
Some members of Congress are tuned in and trying to do something about this. Representatives Scott Peters, a Democrat from California, and Dusty Johnson, a Republican from South Dakota, deserve credit for introducing the bipartisan ePermit Act to address all of these challenges. They’ve identified key levers to improve interagency communication, track litigation, and create a common, publicly accessible storehouse of NEPA data. Crucially, they recognize the make-or-break role of agency Chief Information Officers, who are accountable for information security. Our own attempts to upgrade agency technology taught us that the best way to do so is by working with — not around — CIOs who have a statutory mandate.
The ePermit Act would also lay the groundwork for more extensive and innovative deployment of artificial intelligence in NEPA processes. Despite AI’s continuing challenges with accuracy and traceability, large language models may eventually be able to draft the majority of an EIS on their own, with humans overseeing the work.
AI can also address hidden pain points in the NEPA process. It can hasten the laborious summarization and incorporation of public comments, reducing the legal and practical risk that agencies miss crucial feedback. It can help determine whether sponsor applications are complete, frequently a point of friction between sponsors and agencies, and assess whether a project could be adapted to fit a categorical exclusion, removing unnecessary reviews entirely. Finally, AI tools can help offset the rapid turnover of NEPA personnel and the loss of institutional knowledge, an acute problem of late.
Comprehensive, multi-agency legislation like the ePermit Act will take time to implement — Congress may want or even need to reform NEPA before we get the full benefit of technology improvements. But that does not diminish the urgency or value of this effort. Even Representative Jared Huffman of California, a key Democrat on the House Natural Resources Committee with impeccable environmental credentials, offered words of support for the ePermit Act, while opposing other NEPA reforms.
Regardless of what NEPA looks like in coming years, this work must begin at some point. Under every flavor of NEPA reform, agencies will need to share data, coordinate across platforms, and process information. That remains true even as court-driven legal reforms and Trump administration regulatory changes wreak havoc with NEPA’s substance and implementation. Indeed, whether or not courts, Congress, or the administration reduce NEPA’s reach, even truncated reviews would still be handicapped by broken systems. Fixing the technology infrastructure now is a way to future-proof NEPA.
The solution won’t be as simple as getting agencies to use Microsoft products. It’s long past time to give agencies the tools they need: an interoperable, government-wide platform for NEPA data and project management, supported by large language models. This is no simple task. Reaping the full benefits of these solutions will require an act of Congress that both funds multi-agency software and requires all agencies to act in concert. That mandate is necessary to move officials who are slow to respond to non-binding CEQ directives because they pull time away from statutorily required work, as well as those who resist discretionary changes to agency software as cybersecurity risks, however benign the changes may be. Without appropriated money or a congressional edict, the government’s efforts in this area will lack the resources and enforcement levers to ensure reforms take hold.
Technology improvements won’t cure everything that ails NEPA. This bill won’t fix the deep uncertainty unleashed by the legal chaos of the last year. But addressing these issues is a no-regrets move with bipartisan and potentially even White House support. Let it be done.
According to a new analysis shared exclusively with Heatmap, coal’s equipment-related outage rate is about twice as high as wind’s.
The Trump administration wants “beautiful clean coal” to return to its place of pride on the electric grid because, it says, wind and solar are just too unreliable. “If we want to keep the lights on and prevent blackouts from happening, then we need to keep our coal plants running. Affordable, reliable and secure energy sources are common sense,” Energy Secretary Chris Wright said on X in July, part of a steady drumbeat from an administration that has sought to subsidize coal and put a regulatory straitjacket around solar and (especially) wind.
This has meant real money spent in support of existing coal plants. The administration’s emergency order to keep Michigan’s J.H. Campbell coal plant open (“to secure grid reliability”), for example, has cost ratepayers served by Michigan utility Consumers Energy some $80 million all on its own.
But … how reliable is coal, actually? According to an analysis by the Environmental Defense Fund of data from the North American Electric Reliability Corporation, a nonprofit that oversees reliability standards for the grid, coal has the highest “equipment-related outage rate” — essentially, the percentage of time a generator isn’t working because of some kind of mechanical or other issue related to its physical structure — among coal, hydropower, natural gas, nuclear, and wind. Coal’s outage rate was over 12%. Wind’s was about 6.6%.
“When EDF’s team isolated just equipment-related outages, wind energy proved far more reliable than coal, which had the highest outage rate of any source NERC tracks,” EDF told me in an emailed statement.
Coal’s reliability has, in fact, been decreasing, Oliver Chapman, a research analyst at EDF, told me.
NERC has attributed this falling reliability to the changing role of coal in the energy system. Reliability “negatively correlates most strongly to capacity factor,” or how often the plant is running compared to its peak capacity. The data also “aligns with industry statements indicating that reduced investment in maintenance and abnormal cycling that are being adopted primarily in response to rapid changes in the resource mix are negatively impacting baseload coal unit performance.” In other words, coal is struggling to keep up with its changing role in the energy system. That’s due not just to the growth of solar and wind energy, which are inherently (but predictably) variable, but also to natural gas’s increasing prominence on the grid.
“When coal plants are having to be a bit more varied in their generation, we're seeing that wear and tear of those plants is increasing,” Chapman said. “The assumption is that that's only going to go up in future years.”
The issue for any plan to revitalize the coal industry, Chapman told me, is that the forces driving coal into this secondary role — namely the economics of running aging plants compared to natural gas and renewables — do not seem likely to reverse themselves any time soon.
Coal has been “sort of continuously pushed a bit more to the sidelines by renewables and natural gas being cheaper sources for utilities to generate their power. This increased marginalization is going to continue to lead to greater wear and tear on these plants,” Chapman said.
But with electricity demand increasing across the country, coal is being forced into a role that it might not be able to easily — or affordably — play, all while leading to more emissions of sulfur dioxide, nitrogen oxide, particulate matter, mercury, and, of course, carbon dioxide.
The coal system has been beset by a number of high-profile outages recently, including at the largest new coal plant in the country, Sandy Creek in Texas, which could be offline until early 2027, according to ERCOT, the Texas grid operator, and the Institute for Energy Economics and Financial Analysis.
In at least one case, coal’s reliability issues were cited as a reason to keep another coal generating unit open past its planned retirement date.
Last month, Colorado Representative Jeff Hurd wrote a letter to the Department of Energy asking for emergency action to keep Unit 2 of the Comanche coal plant in Pueblo, Colorado, open past its scheduled retirement at the end of this year. Hurd cited “mechanical and regulatory constraints” on the larger Unit 3 as justification for keeping Unit 2 open to fill the generation gap left by the larger unit. A filing from Xcel and several Colorado state energy officials, also requesting a delay to Unit 2’s retirement, disclosed that Unit 3 “experienced an unplanned outage and is offline through at least June 2026.”
Reliability issues aside, high electricity demand may turn into short-term profits at all levels of the coal industry, from the miners to the power plants.
At the same time the Trump administration is pushing coal plants to stay open past their scheduled retirement, the Energy Information Administration is forecasting that natural gas prices will continue to rise, which could lead to increased use of coal for electricity generation. The EIA forecasts that the 2025 average price of natural gas for power plants will rise 37% from 2024 levels.
Analysts at S&P Global Commodity Insights project “a continued rebound in thermal coal consumption throughout 2026 as thermal coal prices remain competitive with short-term natural gas prices encouraging gas-to-coal switching,” S&P coal analyst Wendy Schallom told me in an email.
“Stronger power demand, rising natural gas prices, delayed coal retirements, stockpiles trending lower, and strong thermal coal exports are vital to U.S. coal revival in 2025 and 2026.”
And we’re all going to be paying the price.
Rural Marylanders have asked for the president’s help to oppose the data center-related development — but so far they haven’t gotten it.
A transmission line in Maryland is pitting rural conservatives against Big Tech in a way that highlights the growing political sensitivities of the data center backlash. Opponents of the project want President Trump to intervene, but they’re worried he’ll ignore them — or even side with the data center developers.
The Piedmont Reliability Project would connect the Peach Bottom nuclear plant in southern Pennsylvania to electricity customers in northern Virginia, most likely data centers. To get from A to B, the power line would have to crisscross agricultural lands between Baltimore, Maryland, and the Washington, D.C., area.
As we chronicle time and time again in The Fight, residents in farming communities are fighting back aggressively – protesting, petitioning, suing and yelling loudly. Things have gotten so tense that some are refusing to let representatives for Piedmont’s developer, PSEG, onto their properties, and a court battle is currently underway over giving the company federal marshal protection amid threats from landowners.
Exacerbating the situation is a quirk we don’t often deal with in The Fight. Unlike energy generation projects, which are usually subject to local review, transmission sits entirely under the purview of Maryland’s Public Service Commission, a five-member board consisting entirely of Democrats appointed by current Governor Wes Moore – a rumored candidate for the 2028 Democratic presidential nomination. It’s going to be months before the PSC formally considers the Piedmont project, and it likely won’t issue a decision until 2027 – a date convenient for Moore, as it’s right after he’s up for re-election. Moore last month expressed “concerns” about the project’s development process, but has brushed aside calls to take a personal position on whether it should ultimately be built.
Enter a potential Trump card that could force Moore’s hand. In early October, commissioners and state legislators representing Carroll County – one of the farm-heavy counties in Piedmont’s path – sent Trump a letter requesting that he intervene in the case before the commission. The letter followed previous examples of Trump coming in to kill planned projects, including the Grain Belt Express transmission line and a Tennessee Valley Authority gas plant in Tennessee that was relocated after lobbying from a country rock musician.
One of the letter’s lead signatories was Kenneth Kiler, president of the Carroll County Board of Commissioners, who told me this lobbying effort will soon expand beyond Trump to the Agriculture and Energy Departments. He’s hoping federal regulators will weigh in with PJM, the regional grid operator that oversees the Mid-Atlantic states. “We’re hoping they go to PJM and say, ‘You’re supposed to be managing the grid, and if you were properly managing the grid you wouldn’t need to build a transmission line through a state you’re not giving power to.’”
Part of the reason why these efforts are expanding, though, is that it’s been more than a month since they sent their letter, and they’ve heard nothing but radio silence from the White House.
“My worry is that I think President Trump likes and sees the need for data centers. They take a lot of water and a lot of electric [power],” Kiler, a Republican, told me in an interview. “He’s conservative, he values property rights, but I’m not sure that he’s not wanting data centers so badly that he feels this request is justified.”
Kiler told me the plan to kill the transmission line hinges on delaying development long enough that interest rates, inflation, and rising demand for electricity make it too painful and inconvenient to build it through his resentful community. It’s easy to believe the federal government flexing its muscle here would help with that, either by drawing out the decision-making or by employing some other, as yet unforeseen, stall tactic. “That’s why we’re doing this second letter to the Secretary of Agriculture and Secretary of Energy asking them for help. I think they may be more sympathetic than the president,” Kiler said.
At the moment, Kiler thinks the odds of Piedmont’s construction come down to a coin flip – 50-50. “They’re running straight through us for data centers. We want this project stopped, and we’ll fight as well as we can, but it just seems like ultimately they’re going to do it,” he confessed to me.
Thus is the predicament of the rural Marylander. On the one hand, Kiler’s situation represents a great opportunity for a GOP president to come in and stand with his base against a would-be presidential candidate. On the other, data center development and artificial intelligence represent one of the president’s few economic bright spots, and he has dedicated copious policy attention to expanding growth in this precise avenue of the tech sector. It’s hard to imagine something less “energy dominance” than killing a transmission line.
The White House did not respond to a request for comment.
Plus more of the week’s most important fights around renewable energy.
1. Wayne County, Nebraska – The Trump administration fined Orsted during the government shutdown for allegedly killing bald eagles at two of its wind projects, the first sign of financial penalties for energy companies under Trump’s wind industry crackdown.
2. Ocean County, New Jersey – Speaking of wind, I broke news earlier this week that one of the nation’s largest renewable energy projects is now deceased: the Leading Light offshore wind project.
3. Dane County, Wisconsin – The fight over a ginormous data center development out here is turning into perhaps one of the nation’s most important local conflicts over AI and land use.
4. Hardeman County, Texas – It’s not all bad news today for renewable energy – because it never really is.