Jesse teaches Rob all about where solar and wind energy come from.
The two fastest-growing sources of electricity generation in the world represent a radical break with the energy technologies that came before them. That’s not just because their fuels are the wind and the sun.
This is our third episode of Shift Key Summer School, a series of “lecture conversations” about the basics of energy, electricity, and the power grid. This week, we dive into the history and mechanics of wind turbines and solar panels, the two linchpin technologies of the energy transition. What do solar panels have in common with semiconductors? Why did it take so long for them to achieve scale? And what’s an inverter, and why is it so important for the grid of the future?
Shift Key is hosted by Jesse Jenkins, a professor of energy systems engineering at Princeton University, and Robinson Meyer, Heatmap’s executive editor.
Subscribe to “Shift Key” and find this episode on Apple Podcasts, Spotify, Amazon, YouTube, or wherever you get your podcasts.
You can also add the show’s RSS feed to your podcast app to follow us directly.
Here is an excerpt from our conversation:
Jesse Jenkins: And so then the other thing, of course, that helps is putting it at a place that’s sunnier, right? In addition to pointing it at the sun, you need to have the sun in the first place. If you go from a cloudy northern latitude to a sunny southern latitude, you’re going to get more production. That variation isn’t as large as you might think, though, from the best site in, say, Arizona and New Mexico to the worst 10th percentile sites in northern Maine or Portland, Oregon, where I grew up, where it’s very cloudy. That difference in solar resource potential is only about a factor of two. So I get about twice as much solar output from an ideally placed panel in Arizona as I do in Portland, Oregon, or Portland, Maine. That’s a lot, but we can find much better resources much closer to Portland, Maine, and Portland, Oregon, right?
And so this is why it doesn’t really make sense to build a giant solar farm in Arizona and then send all that power everywhere else in the country — because the transmission lines are so expensive and the efficiency gain is not that huge, it doesn’t make sense to send power that far away. It might make sense to put my solar panels on the east side of the Cascade Mountains and send that power to Portland, Oregon, but not to go all the way to Arizona. Because the variation in solar potential is much more gradual across different locations and doesn’t span quite as much of a range as wind power, which we can talk about.
Robinson Meyer: I was going to say, this idea that solar only varies by, it sounds like, about 100% in its efficiency.
Jenkins: Or capacity factor.
Meyer: Yeah. I suspect, in fact, from previous conversations that this is going to be an important tool that comes back later — this idea that solar only really varies by 100% in its resource potential, that Arizona solar is only twice as good as Maine solar, is going to be really important after we talk about wind.
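To put rough numbers on that factor of two, here is a minimal back-of-the-envelope sketch in Python. The capacity factors are illustrative assumptions, not figures from the episode:

```python
# Back-of-the-envelope comparison of annual solar output at two sites.
# The capacity factors are illustrative assumptions, not measured values.

HOURS_PER_YEAR = 8760

def annual_output_mwh(capacity_mw: float, capacity_factor: float) -> float:
    """Annual energy from a plant with the given nameplate capacity and capacity factor."""
    return capacity_mw * capacity_factor * HOURS_PER_YEAR

arizona = annual_output_mwh(capacity_mw=1.0, capacity_factor=0.25)    # sunny desert site
portland = annual_output_mwh(capacity_mw=1.0, capacity_factor=0.125)  # cloudy northern site

print(f"Arizona:  {arizona:,.0f} MWh per year")
print(f"Portland: {portland:,.0f} MWh per year")
print(f"Ratio:    {arizona / portland:.1f}x")  # roughly the factor of two discussed above
```

The same arithmetic underlies the transmission point: doubling output is a real gain, but it is rarely worth long-distance lines when a decent site sits much closer to the load.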
Mentioned:
How Solar Energy Became Cheap, by Gregory F. Nemet
More on what wind energy has to do with Star Trek
This episode of Shift Key is sponsored by …
Accelerate your clean energy career with Yale’s online certificate programs. Gain real-world skills, build strong networks, and keep working while you learn. Explore the year-long Financing and Deploying Clean Energy program or the 5-month Clean and Equitable Energy Development program. Learn more here.
Music for Shift Key is by Adam Kromelow.
Why regional transmission organizations as we know them might not survive the data center boom.
As the United States faces its first significant increase in electricity demand in decades, the grid itself is not only aging, but also straining against the financial, logistical, and legal barriers to adding new supply. It’s enough to make you wonder: What’s the point of an electricity market, anyway?
That’s the question some stakeholders in the PJM Interconnection, America’s largest electricity market, started asking loudly and in public in response to the grid operator’s proposal that new large energy users could become “non-capacity-backed load,” i.e., be forced to turn off whenever PJM deems it necessary.
PJM, which covers 13 states from the Mid-Atlantic to the Midwest, has been America’s poster child for the struggle to get new generation online as data center development surges. PJM has warned that it will have “just enough generation to meet its reliability requirement” in 2026 and 2027, and its independent market monitor has said that the costs associated with serving that new and forecast demand have already reached the billions, translating to higher retail electricity rates in several PJM states.
As Heatmap has covered, however, basically no one in the PJM system — transmission owners, power producers, and data center developers — was happy with the details of PJM’s plan to deal with the situation. In public comments on the proposed rule, many brought up a central conflict between utilities’ historic duty to serve and the realities of the modern power market. More specifically, electricity markets like PJM are supposed to deal with wholesale electricity sales, not the kind of core questions of who gets served and when, which are left to the states.
On the power producer side, major East Coast supplier Talen Energy wrote, “The NCBL proposal exceeds PJM’s authority by establishing a regime where PJM holds the power to withhold electric service unlawfully from certain categories of large load.” The utility Exelon added that owners of transmission “have a responsibility to serve all customers—large, small, and in between. We are obligated to provide both retail and wholesale electric service safely and reliably.” And last but far from least, Microsoft, which has made itself into a leader in artificial intelligence, argued, “A PJM rule curtailing non-capacity-backed load would not only unlawfully intrude on state authority, but it would also fundamentally undercut the very purpose of PJM’s capacity market.”
This is just one small piece of a debate that’s been heating up for years, however, as more market participants, activists, and scholars question whether the markets that govern much of the U.S. electric grid are delivering power as cheaply and abundantly as they were promised to. Some have even suggested letting PJM utilities build their own power plants again, effectively reversing the market structure of the past few decades.
But questioning whether all load must be served would be an even bigger change.
The “obligation to serve all load has been a core tenet of electricity policy,” Rob Gramlich, the president of Grid Strategies LLC, told me. “I don’t recall ever seeing that be questioned or challenged in any fundamental way” — an illustration of how dire things have become.
The U.S. electricity system was designed for abundance. Utilities would serve any user, and the per-user costs of developing the fixed infrastructure necessary to serve them would drop as more users signed up.
But the planned rush of data center investments threatens to stick all ratepayers with the cost of new transmission and generation driven overwhelmingly by one class of customer. There is already a brewing local backlash to new data centers, and electricity prices have been rising faster than inflation. New data center load could also have climate consequences if utilities decide to leave aging coal online and build out new natural gas-fired power plants over and above their pre-data center boom (and pre-Trump) plans.
“AI has dramatically raised the stakes, along with enhancing worries that heightened demand will mean more burning of fossil fuels,” law professors Alexandra Klass of the University of Michigan and Dave Owen at the University of California write in a preprint paper to be published next year.
In an interview, Klass told me, “There are huge economic and climate implications if we build a whole lot of gas and keep coal on, and then demand is lower because the chips are better,” referring to the possibility that data centers and large language models could become dramatically more energy efficient, rendering the additional fossil fuel-powered supply unnecessary. Even if the projects are not fully built out or utilized, the country could face a situation where “ratepayers have already paid for [grid infrastructure], whether it’s through those wholesale markets or through their utilities in traditionally regulated states,” she said.
The core tension between AI development and the power grid, Klass and Owen argue, is the “duty to serve,” or “universal service” principle that has underlain modern electricity markets for over a century.
“The duty to serve — to meet need at pretty much all times — worked for utilities because they got to pass through their costs, and it largely worked for consumers because they didn’t have to deal very often with unpredictable blackouts,” Owen told me.
“Once you knew how to build transmission lines and build power plants,” Klass added, “there was no sense that you couldn’t continue to build to serve all customers. We could build power plants, and the regulatory regime came up in a context where we could always build enough to meet demand.”
How and why that came to be goes back to the earliest days of electrification.
As the power industry developed in the late 19th and early 20th centuries, the regulated utility model emerged: monopoly utilities would build both power plants and the transmission and distribution infrastructure necessary to deliver that power to customers. So that they could achieve the economies of scale required to serve those customers efficiently and affordably, regulators allowed utilities to establish monopolies over defined service territories, with the requirement that they serve anyone and everyone in them.
With a secure base of ratepayers, utilities could raise money from investors to build infrastructure, which could then be put into a “rate base” and recouped from ratepayers over time at a fixed return. In exchange, the utilities accepted regulation from state governments over their pricing and future development trajectories.
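As a rough illustration of that bargain (the numbers below are entirely hypothetical), the traditional cost-of-service arithmetic works like this: the utility’s allowed revenue is its operating expenses plus a fixed return on its rate base, recovered across every kilowatt-hour it sells. Whether a big new customer raises or lowers everyone else’s rates depends on how much new infrastructure it requires relative to the sales it adds.

```python
# Simplified cost-of-service rate-making with hypothetical numbers.
# Allowed revenue = operating expenses + (rate base x allowed rate of return),
# recovered across every kilowatt-hour of annual sales.

def average_rate_cents_per_kwh(rate_base: float, allowed_return: float,
                               operating_expenses: float, annual_sales_kwh: float) -> float:
    revenue_requirement = operating_expenses + rate_base * allowed_return
    return revenue_requirement / annual_sales_kwh * 100

# A utility with $2 billion in rate base, a 9.5% allowed return,
# $500 million in annual expenses, and 10 billion kWh of annual sales.
base_case = average_rate_cents_per_kwh(2e9, 0.095, 5e8, 1e10)

# The same utility after adding $1 billion of new infrastructure for a large
# new load that brings in only 1 billion kWh of additional sales.
with_new_load = average_rate_cents_per_kwh(3e9, 0.095, 5.5e8, 1.1e10)

print(f"Base case:     {base_case:.1f} cents/kWh")
print(f"With new load: {with_new_load:.1f} cents/kWh")
```

Historically, new customers tended to push the second number below the first by spreading fixed costs over more sales; the worry with data centers is the reverse.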
That vertically integrated system began to crack, however, as ratepayers revolted over high costs from capital investments by utilities, especially from nuclear power plants. Following the deregulation of industries such as trucking and air travel, federal regulators began to try to break up the distribution and generation portions of the electricity industry. In 1999, after some states and regions had already begun to restructure their electricity markets, the Federal Energy Regulatory Commission encouraged the creation of regional transmission organizations like PJM.
Today, the electricity markets in some 35 states are partially or entirely restructured, with Texas operating its own, isolated market beyond the reach of federal regulation. In PJM and other RTOs, electricity is (more or less) sold competitively on a wholesale basis by independent power producers to utilities, which then serve customers.
But the system as it’s constructed now may, critics argue, expose retail customers to unacceptable cost increases — and greenhouse gas emissions — as it attempts to grapple with serving new data center load.
Klass and Owen, for their part, point to other markets as models for how electricity could work that don’t involve the same assumptions of plentiful supply that electricity markets historically have, such as those governing natural gas or even Western water rights.
Interruptions of natural gas service became more common starting in the 1970s, when some natural gas services were underpriced thanks to price caps, leading to an imbalance between supply and demand. In response, regulators “established a national policy of curtailment based on end use,” Klass and Owen write, with residential users getting priority “because of their essential heating needs, followed by firm industrial and commercial customers, and finally, interruptible customers.” Natural gas was deregulated in the late 1970s and 1980s, with curtailment becoming more market-based, which also allowed natural gas customers to trade capacity with each other.
Western water rights, meanwhile, are notoriously opaque and contested — but, importantly, they are based on scarcity, and thus may provide lessons in an era of limited electricity supply. The “prior appropriation” system water markets use is, “at its core, a set of mechanisms for allocating shortage,” the authors write. Water users have “senior” and “junior” rights, with senior users “entitled to have their rights fulfilled before the holders of newer, or more ‘junior,’ water rights.” These rights can be transferred, and junior users have found ways to work with what water they can get, with the authors citing extensive conservation efforts in Southern California compared to the San Francisco Bay Area, which tends to have more senior rights.
With these models in mind, Klass and Owen propose a system called “demand side connect-and-manage,” whereby new loads would not necessarily get transmission and generation service at all times, and where utilities could curtail users and electricity customers would have the ability “to use trading to hedge against the risk of curtailments.”
“We can connect you now before we build a whole lot of new generation, but when we need to, we’re going to curtail you,” Klass said, describing her and Owen’s proposal.
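A minimal sketch of what allocating scarcity by service priority could look like, loosely borrowing the senior/junior logic described above. The load categories, priorities, and megawatt figures are hypothetical, not drawn from the paper or from PJM’s proposal:

```python
# Toy allocation of scarce supply by service priority, loosely modeled on the
# senior/junior logic of prior appropriation. All names and numbers are hypothetical.

from dataclasses import dataclass

@dataclass
class Load:
    name: str
    demand_mw: float
    priority: int  # lower number = more "senior" (firmer) service

def allocate(loads: list[Load], available_mw: float) -> dict[str, float]:
    """Serve loads in priority order; whatever supply can't cover gets curtailed."""
    served = {}
    remaining = available_mw
    for load in sorted(loads, key=lambda l: l.priority):
        allocation = min(load.demand_mw, remaining)
        served[load.name] = allocation
        remaining -= allocation
    return served

loads = [
    Load("residential", demand_mw=6000, priority=0),
    Load("firm industrial", demand_mw=2500, priority=1),
    Load("non-firm data center", demand_mw=1500, priority=2),
]

for name, mw in allocate(loads, available_mw=8800).items():
    curtailed = next(l.demand_mw for l in loads if l.name == name) - mw
    print(f"{name}: {mw:,.0f} MW served, {curtailed:,.0f} MW curtailed")
```

Klass and Owen’s proposal adds a layer the sketch omits: letting curtailable customers trade with one another to hedge that risk.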
Tyler Norris, a Duke University researcher who has published concept-defining work on data center flexibility, called the paper “one of the most important contributions yet toward the re-examination of basic assumptions of U.S. electricity law that’s urgently needed as hyperscale load growth pushes our existing regulatory system beyond its limits.”
While electricity may not be literally drying up, he told me, “when you are supply side constrained while demand is growing, you have this challenge of, how do you allocate scarcity?”
Unlike the PJM proposals, “Our paper was very focused on state law,” Klass told me. “And that was intentional, because I think this is trickier at the federal level.”
Some states are already embracing similar ideas. Ohio regulators, for instance, established a data center tariff that tries to protect customers from higher costs by forcing data centers to make minimum payments regardless of their actual electricity use. Texas also passed a law that would allow for some curtailment of large loads and reforms of the interconnection process to avoid filling up the interconnection queue with speculative projects that could result in infrastructure costs but not real electricity demand.
Klass and Owen write that their idea may be more of “a temporary bridging strategy, primarily for periods when peak demand outstrips supply or at least threatens to do so.”
Even those who don’t think the principles underlying electricity markets need to be rethought see the need — at least in the short term — for new options for large new power users who may not get all the power they want all of the time.
“Some non-firm options are necessary in the short term,” Gramlich told me, referring to ideas like Klass and Owen’s, Norris’s, and PJM’s. “Some of them are going to have some legal infirmities and jurisdictional problems. But I think no matter what, we’re going to see some non-firm options. A lot of customers, a lot of these large loads, are very interested, even if it’s a temporary way to get connected while they try to get the firm service later.”
If electricity markets have worked for over one hundred years on the principle that more customers could bring down costs for everyone, going forward, we may have to get more choosy — or pay the price.
A judge has lifted the administration’s stop-work order against Revolution Wind.
A federal court has lifted the Trump administration’s order to halt construction on the Revolution Wind farm off the coast of New England. The decision marks the renewables industry’s first major legal victory against a federal war on offshore wind.
The Interior Department ordered Orsted — the Danish company developing Revolution Wind — to halt construction on August 22, asserting in a one-page letter that it was “seeking to address concerns related to the protection of national security interests of the United States and prevention of interference with reasonable uses of the exclusive economic zone, the high seas, and the territorial seas.”
In a two-page ruling issued Monday, U.S. District Judge Royce Lamberth found that Orsted would likely prevail in its legal challenge to the stop-work order, and that the company is “likely to suffer irreparable harm in the absence of an injunction,” which led him to lift the Trump administration’s directive.
Orsted previously claimed in legal filings that delays from the stop-work order could put the entire project in jeopardy by pushing its timeline beyond the terms of existing power purchase agreements, and that the company installing cable for the project had only a few months left to work on Revolution Wind before it had to move on to other client obligations through mid-2028. The company has also argued that the Trump administration is deliberately mischaracterizing discussions between the federal government and the company that took place before the project was fully approved.
It’s still unclear whether the Trump administration will appeal the decision. We’re also waiting on the outcome of a separate legal challenge brought by Democratic-controlled states against Trump’s anti-wind Day One executive order.
Harmonizing data across federal agencies will go a long, long way toward simplifying environmental reviews.
Comprehensive permitting reform remains elusive.
In spite of numerous promising attempts — the Fiscal Responsibility Act of 2023, for instance, which delivered only limited improvements, and the failed Manchin-Barrasso bill of last year — the U.S. has repeatedly failed to overhaul its clogged federal infrastructure approval process. Even now there are draft bills and agreements in principle, but the Trump administration’s animus towards renewable energy has undermined Democratic faith in any deal. Less obvious but no less important, key Republicans are quietly disengaged, hesitant to embrace the federal transmission reform that negotiators see as essential to the package.
Despite this grim prognosis, Congress could still improve the implementation of a key permitting barrier, the National Environmental Policy Act, by fixing the federal government’s broken systems for managing and sharing NEPA documentation and data. These opaque and incompatible systems frustrate essential interagency coordination, contributing immeasurably to NEPA’s delays and frustrations. But it’s a problem with clear, available, workable solutions — and fixing it carries little political cost.
Both of us saw these problems firsthand. Marc helped manage NEPA implementation at the Environmental Protection Agency, observing the federal government’s slow and often flailing attempts to use technology to improve internal agency processes. Elizabeth, meanwhile, spent two years overcoming NEPA’s atomized data ecosystem to create a comprehensive picture of NEPA litigation.
Even so, it’s difficult to illustrate the scope of the problem without experiencing it. Some agencies have bespoke systems that house crucial and unique geographic information on project areas. Other agencies lack ready access to that information, even as they examine project impacts another agency may have already studied. Similarly, there is no central database of scientific studies undertaken in support of environmental reviews. Some agencies maintain repositories for their environmental assessments — reviews that are arduous but less intensive than the environmental impact statements NEPA requires when a federal agency action significantly affects the environment. But there’s still no unified, cross-agency EA database. This leaves agencies unable to efficiently find and leverage work that could inform their own reviews. Indeed, agencies may be duplicating or re-duplicating tedious, time-consuming efforts.
NEPA implementation also relies on interagency cooperation. There, too, agencies’ divergent ways of classifying and communicating about project data throw up impediments. Agencies rely on arcane data formats and often incompatible platforms. (For the tech-savvy: one agency might keep a PDF-only repository while another uses XML-based data formats.) With few exceptions, it’s difficult for cooperating agencies to even know the status of a given review. All of this produces a comedy of errors for agencies trying to recruit and develop younger, tech-savvy staff. Your workplace might use something like Asana or Trello to guide your workflow, a common language all teams use to communicate. The federal government has a bureaucratic Tower of Babel.
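To make the interoperability problem concrete, here is a small, hypothetical sketch of the kind of normalization work a shared data standard would eliminate. The field names and formats are invented for illustration and do not correspond to any agency’s actual system:

```python
# Hypothetical illustration: two agencies describe the same review in different
# formats, and anyone combining them must normalize the records first.
# Field names and formats are invented for illustration.

import json
import xml.etree.ElementTree as ET

AGENCY_A_XML = """<review>
  <projectTitle>Example Transmission Line</projectTitle>
  <reviewStage>Draft EIS</reviewStage>
  <noticeDate>2024-03-01</noticeDate>
</review>"""

AGENCY_B_JSON = '{"name": "Example Transmission Line", "status": "DEIS", "published": "03/01/2024"}'

def normalize_agency_a(xml_text: str) -> dict:
    """Pull the fields we care about out of agency A's XML record."""
    root = ET.fromstring(xml_text)
    return {
        "project": root.findtext("projectTitle"),
        "stage": root.findtext("reviewStage"),
        "date": root.findtext("noticeDate"),  # already ISO 8601
    }

def normalize_agency_b(json_text: str) -> dict:
    """Map agency B's JSON record onto the same common schema."""
    record = json.loads(json_text)
    month, day, year = record["published"].split("/")
    return {
        "project": record["name"],
        "stage": "Draft EIS" if record["status"] == "DEIS" else record["status"],
        "date": f"{year}-{month}-{day}",
    }

print(normalize_agency_a(AGENCY_A_XML))
print(normalize_agency_b(AGENCY_B_JSON))
```

Multiply that by dozens of agencies, each with its own formats and platforms, and the cost of the status quo starts to come into view.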
Yet another problem, symptomatic of inadequate transparency, is that we have only limited data on the thousands of NEPA court cases. To close the gap, we sought to understand — using data — just how sprawling and unwieldy post-review NEPA litigation had become. We read every available district and appellate opinion that mentioned NEPA from 2013 to 2022 (over 2,000 cases), screened out those without substantive NEPA claims, and catalogued their key characteristics — plaintiffs, court timelines and outcomes, agencies, project types, and so on. Before we did this work, no national NEPA litigation database provided policymakers with actionable, data-driven insights into court outcomes for America’s most-litigated environmental statute. But even our painstaking efforts couldn’t unearth a full dataset that included, for example, decisions taken by administrative judges within agencies.
We can’t manage what we can’t measure. And every study in this space, including ours, struggles with this type of sample bias. Litigated opinions are neither random nor representative; they skew toward high-stakes disputes with uncertain outcomes and underrepresent cases that settle on clear agency error or are dismissed early for weak claims. Our database illuminates litigation patterns and timelines. But like the rest of the literature, it cannot offer firm conclusions about NEPA’s effectiveness. We need a more reliable universe of all NEPA reviews to have any chance — even a flawed one — at assessing the law’s outcomes.
In the meantime, NEPA policy debates often revolve unproductively around assumptions and anecdotes. For example, Democrats can point to instances when early and robust public engagement appeared essential for bringing projects to completion. But in the absence of hard data to support this view, GOP reformers often prefer to limit public participation in the name of speeding the review process. The rebuttal to that approach is persuasive: Failing to engage potential project opponents on their legitimate concerns merely drives them to interfere with the project outside the NEPA process. Yet this rebuttal relies on assumptions, not evidence. Only transparent data can resolve the dispute.
Some of the necessary repair work is already underway at the Council on Environmental Quality, the White House entity that coordinates and guides agencies’ NEPA implementation. In May, CEQ published a “NEPA and Permitting Data and Technology Standard” so that agencies could voluntarily align on how to communicate NEPA information with each other. Then in June, after years using a lumbering Excel file containing agencies’ categorical exclusions — the types of projects that don’t need NEPA review, as determined by law or regulation — CEQ unveiled a searchable database called the Categorical Exclusion Explorer. The Pacific Northwest National Laboratory’s PermitAI has leveraged the EPA’s repository of environmental impact statements and, more recently, environmental review documents from other agencies to create an AI-powered queryable database. The FAST-41 Dashboard has brought transparency and accountability to a limited number of EISs.
But across all these efforts, huge gaps in data, resources, and enforcement authority remain. President Trump has issued directives to agencies to speed environmental reviews, evincing an interest in filling the gaps. But those directives don’t and can’t compel the full scope of necessary technological changes.
Some members of Congress are tuned in and trying to do something about this. Representatives Scott Peters, a Democrat from California, and Dusty Johnson, Republican of South Dakota, deserve credit for introducing the bipartisan ePermit Act to address all of these challenges. They’ve identified key levers to improve interagency communication, track litigation, and create a common and publicly accessible storehouse of NEPA data. Crucially, they recognize the make-or-break role of agency Chief Information Officers who are accountable for information security. Our own attempts to upgrade agency technology taught us that the best way to do so is by working with — not around — CIOs who have a statutory mandate.
The ePermit Act would also lay the groundwork for more extensive and innovative deployment of artificial intelligence in NEPA processes. Despite AI’s continuing challenges around information accuracy and traceability, large language models may eventually be able to draft the majority of an EIS on their own, with humans overseeing the output.
AI can also address hidden pain points in the NEPA process. It can hasten the laborious summarization and incorporation of public comment, reducing the legal and practical risk that agencies miss crucial feedback. It can help determine whether sponsor applications are complete, frequently a point of friction between sponsors and agencies, and it can assess whether a project could be adapted to qualify for a categorical exclusion, removing unnecessary reviews entirely. Finally, AI tools are a concession to the rapid turnover of NEPA personnel and depleted institutional knowledge — an acute problem of late.
Comprehensive, multi-agency legislation like the ePermit Act will take time to implement — Congress may want or even need to reform NEPA before we get the full benefit of technology improvements. But that does not diminish the urgency or value of this effort. Even Representative Jared Huffman of California, a key Democrat on the House Natural Resources Committee with impeccable environmental credentials, offered words of support for the ePermit Act, while opposing other NEPA reforms.
Regardless of what NEPA looks like in coming years, this work must begin at some point. Under every flavor of NEPA reform, agencies will need to share data, coordinate across platforms, and process information. That remains true even as court-driven legal reforms and Trump administration regulatory changes wreak havoc with NEPA’s substance and implementation. Indeed, whether or not courts, Congress, or the administration reduce NEPA’s reach, even truncated reviews would still be handicapped by broken systems. Fixing the technology infrastructure now is a way to future-proof NEPA.
The solution won’t be as simple as getting agencies to use Microsoft products. It’s long past time to give agencies the tools they need — an interoperable, government-wide platform for NEPA data and project management, supported by large language models. This is no simple task. To reap the full benefits of these solutions will require an act of Congress that both provides funding for multi-agency software and requires all agencies to act in concert. This mandate is necessary to induce movement from actors within agencies who are slow to respond to non-binding CEQ directives that take time away from statutorily required work, or those who resist discretionary changes to agency software as cybersecurity risks, no matter how benign those changes may be. Without appropriated money or congressional edict, the government’s efforts in this area will lack the resources and enforcement levers to ensure reforms take hold.
Technology improvements won’t cure everything that ails NEPA. This bill won’t fix the deep uncertainty unleashed by the legal chaos of the last year. But addressing these issues is a no-regrets move with bipartisan and potentially even White House support. Let it be done.