Why regional transmission organizations as we know them might not survive the data center boom.
As the United States faces its first significant increase in electricity demand in decades, the grid itself is not only aging, but also straining against the financial, logistical, and legal barriers to adding new supply. It’s enough to make you wonder: What’s the point of an electricity market, anyway?
That’s the question some stakeholders in the PJM Interconnection, America’s largest electricity market, started asking loudly and in public in response to the grid operator’s proposal that new large energy users could become “non-capacity backed load,” i.e., be forced to turn off whenever PJM deems it necessary.
PJM, which covers 13 states from the Mid-Atlantic to the Midwest, has been America’s poster child for the struggle to get new generation online as data center development surges. PJM has warned that it will have “just enough generation to meet its reliability requirement” in 2026 and 2027, and its independent market monitor has said that the costs associated with serving that new and forecast demand have already reached the billions, translating to higher retail electricity rates in several PJM states.
As Heatmap has covered, however, basically no one in the PJM system — transmission owners, power producers, and data center developers — was happy with the details of PJM’s plan to deal with the situation. In public comments on the proposed rule, many brought up a central conflict between utilities’ historic duty to serve and the realities of the modern power market. More specifically, electricity markets like PJM are supposed to deal with wholesale electricity sales, not core questions of who gets served and when, which are left to the states.
On the power producer side, major East Coast supplier Talen Energy wrote, “The NCBL proposal exceeds PJM’s authority by establishing a regime where PJM holds the power to withhold electric service unlawfully from certain categories of large load.” The utility Exelon added that owners of transmission “have a responsibility to serve all customers—large, small, and in between. We are obligated to provide both retail and wholesale electric service safely and reliably.” And last but far from least, Microsoft, which has made itself into a leader in artificial intelligence, argued, “A PJM rule curtailing non-capacity-backed load would not only unlawfully intrude on state authority, but it would also fundamentally undercut the very purpose of PJM’s capacity market.”
This is just one small piece of a debate that’s been heating up for years, however, as more market participants, activists, and scholars question whether the markets that govern much of the U.S. electric grid are delivering power as cheaply and abundantly as they were promised to. Some have even suggested letting PJM utilities build their own power plants again, effectively reversing the market structure of the past few decades.
But questioning whether all load must be served would be an even bigger change.
The “obligation to serve all load has been a core tenet of electricity policy,” Rob Gramlich, the president of Grid Strategies LLC, told me. “I don’t recall ever seeing that be questioned or challenged in any fundamental way” — an illustration of how dire things have become.
The U.S. electricity system was designed for abundance. Utilities would serve any user, and the per-user costs of developing the fixed infrastructure necessary to serve them would drop as more users signed up.
But the planned rush of data center investments threatens to stick all ratepayers with the cost of new transmission and generation driven overwhelmingly by one class of customer. There is already a brewing local backlash to new data centers, and electricity prices have been rising faster than inflation. New data center load could also have climate consequences if utilities decide to leave aging coal online and build out new natural gas-fired power plants over and above their pre-data center boom (and pre-Trump) plans.
“AI has dramatically raised the stakes, along with enhancing worries that heightened demand will mean more burning of fossil fuels,” law professors Alexandra Klass of the University of Michigan and Dave Owen of the University of California write in a preprint paper to be published next year.
In an interview, Klass told me, “There are huge economic and climate implications if we build a whole lot of gas and keep coal on, and then demand is lower because the chips are better,” referring to the possibility that data centers and large language models could become dramatically more energy efficient, rendering the additional fossil fuel-powered supply unnecessary. Even if the projects are not fully built out or utilized, the country could face a situation where “ratepayers have already paid for [grid infrastructure], whether it’s through those wholesale markets or through their utilities in traditionally regulated states,” she said.
The core tension between AI development and the power grid, Klass and Owen argue, is the “duty to serve,” or “universal service” principle that has underlain modern electricity markets for over a century.
“The duty to serve — to meet need at pretty much all times — worked for utilities because they got to pass through their costs, and it largely worked for consumers because they didn’t have to deal very often with unpredictable blackouts,” Owen told me.
“Once you knew how to build transmission lines and build power plants,” Klass added, “there was no sense that you couldn’t continue to build to serve all customers. We could build power plants, and the regulatory regime came up in a context where we could always build enough to meet demand.”
How and why that came to be goes back to the earliest days of electrification.
As the power industry developed in the late 19th and early 20th centuries, the regulated utility model emerged: utilities would build both the power plants and the transmission and distribution infrastructure necessary to deliver that power to customers. To achieve the economies of scale required to serve those customers efficiently and affordably, regulators granted utilities monopolies over defined service territories, with the requirement that they serve any and everyone within them.
With a secure base of ratepayers, utilities could raise money from investors to build infrastructure, which could then be put into a “rate base” and recouped from ratepayers over time at a fixed return. In exchange, the utilities accepted regulation from state governments over their pricing and future development trajectories.
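To make that cost-of-service arithmetic concrete, here is a minimal sketch in Python. The dollar figures, rate of return, and customer counts are entirely hypothetical; real rate cases involve depreciation schedules, fuel costs, and far more detail. The point is simply that a fixed revenue requirement spread over more customers means a lower bill per customer, which is the abundance logic described above.

```python
# Hypothetical illustration of cost-of-service ratemaking. All numbers are
# invented for the example; this is not any particular utility's rate case.

def revenue_requirement(rate_base, allowed_return, operating_costs, depreciation):
    """Annual revenue a regulated utility is allowed to collect."""
    return rate_base * allowed_return + operating_costs + depreciation

req = revenue_requirement(
    rate_base=5_000_000_000,       # $5 billion of grid infrastructure in the rate base
    allowed_return=0.09,           # 9% authorized rate of return
    operating_costs=1_200_000_000,
    depreciation=300_000_000,
)

# The same fixed costs spread over more customers means a smaller bill per customer.
for customers in (1_000_000, 1_500_000, 2_000_000):
    print(f"{customers:>9,} customers -> ${req / customers:,.0f} per customer per year")
```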
That vertically integrated system began to crack, however, as ratepayers revolted over the high costs of utilities’ capital investments, especially in nuclear power plants. Following the deregulation of industries such as trucking and air travel, federal regulators began trying to split power generation off from the wires business of transmission and distribution. In 1999, after some states and regions had already begun to restructure their electricity markets, the Federal Energy Regulatory Commission encouraged the creation of regional transmission organizations like PJM.
Today, some 35 states have partially or entirely restructured their electricity markets, with Texas operating its own, isolated market beyond the reach of federal regulation. In PJM and other RTOs, electricity is (more or less) sold competitively on a wholesale basis by independent power producers to utilities, which then serve customers.
But the system as it’s constructed now may, critics argue, expose retail customers to unacceptable cost increases — and greenhouse gas emissions — as it attempts to grapple with serving new data center load.
Klass and Owen, for their part, point to other markets, such as those governing natural gas or even Western water rights, as models for how electricity could work without the assumptions of plentiful supply that electricity markets have historically rested on.
Interruptions of natural gas service became more common starting in the 1970s, when some natural gas services were underpriced thanks to price caps, leading to an imbalance between supply and demand. In response, regulators “established a national policy of curtailment based on end use,” Klass and Owen write, with residential users getting priority “because of their essential heating needs, followed by firm industrial and commercial customers, and finally, interruptible customers.” Natural gas was deregulated in the late 1970s and 1980s, with curtailment becoming more market-based, which also allowed natural gas customers to trade capacity with each other.
Western water rights, meanwhile, are notoriously opaque and contested — but, importantly, they are based on scarcity, and thus may provide lessons in an era of limited electricity supply. The “prior appropriation” system water markets use is, “at its core, a set of mechanisms for allocating shortage,” the authors write. Water users have “senior” and “junior” rights, with senior users “entitled to have their rights fulfilled before the holders of newer, or more ’junior,’ water rights.” These rights can be transferred, and junior users have found ways to work with what water they can get, with the authors citing extensive conservation efforts in Southern California compared to the San Francisco Bay area, which tends to have more senior rights.
With these models in mind, Klass and Owen propose a system called “demand side connect-and-manage,” whereby new loads would not necessarily get transmission and generation service at all times, and where utilities could curtail users and electricity customers would have the ability “to use trading to hedge against the risk of curtailments.”
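To make the seniority idea concrete, here is a minimal sketch in Python of allocating a shortage according to how senior each load is, loosely modeled on the prior-appropriation analogy above. The load names, megawatt figures, and priority rule are all hypothetical, and the trading piece of Klass and Owen’s proposal is not modeled here.

```python
# Hypothetical sketch of "allocating shortage" by seniority. Names, megawatt
# figures, and the priority rule are invented for illustration; the actual
# connect-and-manage proposal also contemplates trading, which is not shown.

from dataclasses import dataclass

@dataclass
class Load:
    name: str
    demand_mw: float
    seniority: int  # lower number = more senior (connected earlier)

def allocate(loads: list[Load], available_mw: float) -> dict[str, float]:
    """Serve senior loads in full before junior loads get anything."""
    served: dict[str, float] = {}
    remaining = available_mw
    for load in sorted(loads, key=lambda l: l.seniority):
        grant = min(load.demand_mw, remaining)
        served[load.name] = grant
        remaining -= grant
    return served

loads = [
    Load("residential_and_commercial", demand_mw=9_000, seniority=1),
    Load("legacy_industrial", demand_mw=2_000, seniority=2),
    Load("new_data_center_campus", demand_mw=1_500, seniority=3),
]

# With 11,000 MW available, the newest (most junior) load is curtailed first.
print(allocate(loads, available_mw=11_000))
# -> {'residential_and_commercial': 9000, 'legacy_industrial': 2000, 'new_data_center_campus': 0}
```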
“We can connect you now before we build a whole lot of new generation, but when we need to, we’re going to curtail you,” Klass said, describing her and Owen’s proposal.
Tyler Norris, a Duke University researcher who has published concept-defining work on data center flexibility, called the paper “one of the most important contributions yet toward the re-examination of basic assumptions of U.S. electricity law that’s urgently needed as hyperscale load growth pushes our existing regulatory system beyond its limits.”
While electricity may not be literally drying up, he told me, “when you are supply side constrained while demand is growing, you have this challenge of, how do you allocate scarcity?”
Unlike the PJM proposals, “Our paper was very focused on state law,” Klass told me. “And that was intentional, because I think this is trickier at the federal level.”
Some states are already embracing similar ideas. Ohio regulators, for instance, established a data center tariff that tries to protect customers from higher costs by forcing data centers to make minimum payments regardless of their actual electricity use. Texas also passed a law that allows for some curtailment of large loads and reforms the interconnection process to keep the interconnection queue from filling up with speculative projects that could impose infrastructure costs without delivering real electricity demand.
Klass and Owen write that their idea may be more of “a temporary bridging strategy, primarily for periods when peak demand outstrips supply or at least threatens to do so.”
Even those who don’t think the principles underlying electricity markets need to be rethought see the need — at least in the short term — for new options for large new power users who may not get all the power they want all of the time.
“Some non-firm options are necessary in the short term,” Gramlich told me, referring to ideas like Klass and Owen’s, Norris’s, and PJM’s. “Some of them are going to have some legal infirmities and jurisdictional problems. But I think no matter what, we’re going to see some non-firm options. A lot of customers, a lot of these large loads, are very interested, even if it’s a temporary way to get connected while they try to get the firm service later.”
If electricity markets have worked for over one hundred years on the principle that more customers could bring down costs for everyone, going forward, we may have to get more choosy — or pay the price.
The administration seems to be pursuing a “some of the above” strategy with little to no internal logic.
Last week, the Department of Energy justified terminating hundreds of congressionally mandated grants the Biden administration had issued for clean energy projects (including one for a backup battery at a children’s hospital) by arguing that they were bad investments for the American people.
“Following a thorough, individualized financial review, DOE determined that these projects did not adequately advance the nation’s energy needs, were not economically viable, and would not provide a positive return on investment of taxpayer dollars,” the agency’s press release said.
It’s puzzling, then, that the Trump administration is pouring vast government resources into saving aging coal plants and expediting advanced nuclear projects — two sources of energy that are famously financial black holes.
The Energy Department announced it would invest $625 million to “reinvigorate and expand America’s coal industry” in late September. Earlier this year, the agency also made $900 million available to “unlock commercial deployment of American-made small modular reactors.”
It’s hard to imagine what economic yardsticks would warrant funding to keep coal plants open. The cost of operating a coal plant in the U.S. has increased by nearly 30% since 2021 — faster than inflation — according to research by Energy Innovation. Driving that increase is the cost of coal itself, as well as the fact that the nation’s coal plants are simply getting very old and more expensive to maintain. “You can put all the money you want into a clunker, but at the end of the day, it’s really old, and it’s just going to keep getting more expensive over time, even if you have a short term fix,” Michelle Solomon, a program manager at Energy Innovation who authored the research, told me.
Keeping these plants online — even if they only operate some of the time — inevitably raises electricity bills. That’s because in many of the country’s electricity markets, the cost of power on any given day is determined by the most expensive plant running. On a hot summer day when everyone’s air conditioners are working hard and the grid operator has to tell a coal plant to switch on to meet demand, every electron delivered in the region will suddenly cost the same as coal, even if it was generated essentially for free by the sun or wind.
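As a rough illustration of that marginal-pricing dynamic, here is a minimal sketch in Python. The plants, capacities, and offer prices are invented, and real wholesale markets layer on transmission congestion, reserves, and other products that this ignores.

```python
# Hypothetical illustration of merit-order dispatch and marginal pricing.
# Plant names, capacities, and offer prices are invented for the example.

def clearing_price(offers, demand_mw):
    """Dispatch the cheapest offers first; the last (most expensive) unit
    needed to meet demand sets the price paid to every dispatched generator."""
    met = 0.0
    price = 0.0
    for offer_price, capacity_mw in sorted(offers):
        if met >= demand_mw:
            break
        met += min(capacity_mw, demand_mw - met)
        price = offer_price
    return price

offers = [
    (0.0, 3_000),   # $/MWh offer, MW capacity: solar and wind offer near zero
    (25.0, 2_000),  # nuclear
    (40.0, 3_000),  # efficient gas
    (90.0, 2_000),  # old coal plant
]

# Mild day: demand is met without the coal plant, so the price stays low.
print(clearing_price(offers, demand_mw=7_000))   # 40.0
# Hot day: the coal plant is needed, and every megawatt-hour clears at $90.
print(clearing_price(offers, demand_mw=9_000))   # 90.0
```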
The Trump administration has also based its support for coal plants on the idea that they are needed for reliability. In theory, coal generation should be available around the clock. But in reality, the plants aren’t necessarily up to the task — and not just because they’re old. Sandy Creek in Texas, which began operating in 2013 and is the newest coal plant in the country, experienced a major failure this past April and is now expected to stay offline until 2027, according to the region’s grid operator. In a report last year, the North American Electric Reliability Corporation warned that outage rates for coal plants are increasing. This is in part due to wear and tear from the way these plants cycle on and off to accommodate renewable energy sources, the report said, but it’s also due to reduced maintenance as plant operators plan to retire the facilities.
“You can do the deferred maintenance. It might keep the plant operating for a bit longer, but at the end of the day, it’s still not going to be the most efficient source of energy, or the cheapest source of energy,” Solomon said.
The contradictions snowball from there. On September 30, the DOE opened a $525 million funding opportunity for coal plants titled “Restoring Reliability: Coal Recommissioning and Modernization,” inviting coal-fired power plants that are scheduled for retirement before 2032, or that are located in rural areas, to apply for grants that will help keep them open. The grant paperwork states that grid capacity challenges “are especially acute in regions with constrained transmission and sustained load growth.” Two days later, however, as part of the agency’s mass termination of grants, it canceled more than $1.3 billion in awards from the Grid Deployment Office to upgrade and install new transmission lines to ease those constraints.
The new funding opportunity may ultimately just shuffle awards around from one coal plant to another, or put previously awarded projects through the time-and-money-intensive process of reapplying for the same funding under a new name. Up to $350 million of the total will go to as many as five coal plants, with initial funding to restart closed plants or to modernize old ones, and later phases designated for carbon capture, utilization, and storage retrofits. The agency said it will use “unobligated” money from three programs that were part of the 2021 Infrastructure Investment and Jobs Act: the Carbon Capture Demonstration Projects Program, the Carbon Capture Large-Scale Pilot Projects, and the Energy Improvements in Rural or Remote Areas Program.
In a seeming act of cognitive dissonance, however, the agency has canceled two awards that the Biden administration made under those same programs for coal-fired power plants. One, a $6.5 million grant to Navajo Transitional Energy Company, a tribal-owned entity that owns a stake in New Mexico’s Four Corners Generating Station, would have funded a study to determine whether adding carbon capture and storage to the plant was economically viable. The other, a $50 million grant to TDA Research that would have helped the company validate its CCS technology at Dry Fork Station, a coal plant in Wyoming, was terminated in May.
Two more may be out the window. A new internal agency list of grants labeled “terminate” that circulated this week included an $8 million grant for the utility Duke Energy to evaluate the feasibility of capturing carbon from its Edwardsport plant in Indiana, and $350 million for Project Tundra, a carbon capture demonstration project at the Milton R. Young Station in North Dakota.
“It’s not internally consistent,” Jack Andreason Cavanaugh, a global fellow at Columbia University’s Carbon Management Research Initiative, told me. “You’re canceling coal grants, but then you’re giving $630 million to keep them open. You’re also investing a ton of time and money into nuclear — which is great, to be clear — but these small modular reactors haven’t been deployed in the United States, and part of the reason is that they’re currently not economically viable.”
The closest any company has come thus far to deploying a small modular reactor in the U.S. is NuScale, a company that planned to build its first-of-a-kind reactors in Idaho and had secured agreements to sell the power to a group of public utilities in Utah. But between 2015, when it was first proposed, and late 2023, when it died, the project’s budget tripled from $3 billion to more than $9 billion, while its scale was reduced from 600 megawatts to 462 megawatts. Not all of that was inevitable — costs rose dramatically in the final few years due to inflation. The reason NuScale ultimately pulled out of the project is that the cost of electricity it generated was going to be too high for the market to bear.
It’s unclear how heavily the DOE will weigh project financials in the application process for the $900 million for nuclear reactors. In its funding announcement, it specified that the awards would be made “solely based on technical merit.” The agency’s official solicitation paperwork, however, names “financial viability” as one of the key review criteria. Regardless, the Trump administration appears to recognize the value in funding first-of-a-kind, risky technologies when it comes to nuclear, but is not applying the same standards to direct air capture or hydrogen plants.
I asked the Department of Energy to share the criteria it used in the project review process to determine economic viability. In response, spokesperson Ben Dietderich encouraged me to read Energy Secretary Chris Wright’s memorandum from May describing the review process. The memo outlines what types of documentation the agency will evaluate to reach a decision, but not the criteria for making that decision.
Solomon agreed that advanced nuclear might one day meet the grid’s growing power needs, but not anytime soon. “Hopefully in the long term, this technology does become a part of our electricity system. But certainly relying on it in the short term has real risks to electricity costs,” she said. “And also reliability, in the sense that the projects might not materialize.”
The collateral damage from the Lava Ridge wind project might now include a proposed 285-mile transmission line initially approved by federal regulators in the 1990s.
The same movement that got Trump to kill the Lava Ridge wind farm now appears to have derailed a longstanding transmission project that’s supposed to connect sought-after areas for wind energy in Idaho to power-hungry places out West.
The Southwest Intertie Project-North, also known as SWIP-N, is a proposed 285-mile transmission line initially approved by federal regulators in the 1990s. If built, SWIP-N is supposed to feed power from the wind-swept plains of southern Idaho to the Southwest, while shooting electrons – at least some generated from solar power – back up north into Idaho from Nevada, Utah, and Arizona. In California, regulators have identified the line as crucial for getting cleaner wind energy into the state’s grid to meet climate goals.
But on Tuesday, SWIP-N suddenly faced a major setback: The three-person commission representing Jerome County, Idaho – directly in the path of the project – voted to revoke its special use permit, stating that the company still lacked proper documentation to meet the terms and conditions of the approval. SWIP-N had the wind at its back as recently as last year, when LS Power expected it to connect to Lava Ridge and other wind farms that have since been delayed by Trump’s federal permitting freeze on renewable energy. But now the transmission line has stuttered along with that potential generation.
At a hearing Tuesday evening, county commissioners said Great Basin Transmission, a subsidiary of LS Power developing the line, would now suddenly need new input, including the blessing of the local highway district and potential feedback from the Federal Aviation Administration. Jerome County Commissioner Charles Howell explained to me Wednesday afternoon that there will still need to be formal steps remanding the permit, and the process will go back to local zoning officials. Great Basin Transmission will then at minimum need to get the sign-offs from local highway officials to satisfy his concerns, as well as those of the other commissioner who voted to rescind the permit, Ben Crouch.
The permit was many years old, and there are outstanding questions about what will happen next procedurally, including what Great Basin Transmission is actually able to do to fight this choice by the commissioners. At minimum, staff for the commission will write a formal decision explaining the reasoning and remand the permit. After that, it’ll be up to Great Basin Transmission to produce the documents that commissioners want. “Even our attorney and staff didn’t have those answers when we asked that after the vote,” Howell said, adding that he hopes the issues can be resolved. “I was on the county commission about when they decided where to site the towers, where to site the right-of-ways. That’s all been there a long time.”
This is the part where I bring up how Jerome County’s decision followed a months-long fight by aggrieved residents who opposed the SWIP-N line, including homeowners who say they didn’t know their properties were in the path of the project. There’s also a significant anti-wind undercurrent, as many who are fighting this transmission line previously fought LS Power’s Lava Ridge wind project, which was blocked by an executive order from President Donald Trump on his first day in office. Jerome County itself passed an ordinance in May requiring any renewable energy facility to get all federal, state, and local approvals before it would sign off on new projects.
Opposition to SWIP-N comes from a similar place as the “Stop Lava Ridge” campaign. Along with viewshed anxieties and property value impacts, SWIP-N, like Lava Ridge, would be within single-digit miles of the Minidoka National Historic Site, a former prison camp that held Japanese-Americans during World War II. In the eyes of its staunchest critics, constructing the wind farm would have ruined the experience of visiting the site, filling the surroundings of what is otherwise a serene, somber scene with turbines. Descendants of Minidoka detainees lobbied politicians at all levels to oppose Lava Ridge, a cause that was ultimately championed by Republican politicians in their fight against the project.
These same descendants of Japanese-American detainees have fought the transmission line, arguing that its construction would inevitably lead to new wind projects. “If approved, the SWIP-N line would enable LS Power and other renewable energy companies to build massive wind projects on federal land in and around Jerome County in future years,” wrote Dan Sakura, the son of a Minidoka prisoner, in a September 15 letter to the commission.
Sakura had been a leading voice in the fight against Lava Ridge. When I asked why he was weighing in on SWIP-N, he told me over text message, “The Lava Ridge wind project poisoned the well for renewable energy projects on federal land in Southern Idaho.”
LS Power did not respond to a request for comment.
It’s worth noting that efforts have already been made to avoid SWIP-N’s impacts on the Minidoka National Historic Site. In 2010, Congress required the Interior Secretary to redo the review process for the transmission line, which at the time was proposed to go through the historic site. The route rejected by Jerome County would go around it.
There is also no guarantee that wind energy will flock to southern Idaho any time soon. Yes, there’s a Trump permitting freeze, and federal wind energy tax credits are winding down. That’s almost certainly why the developers of small nuclear reactors have reportedly coveted the Lava Ridge site for future projects. But there’s also considerable pent-up hostility toward wind, driven partly by the now-defunct LS Power project. In Lincoln County, for instance, officials now have an emergency moratorium banning wind energy while they develop a more permanent restrictive ordinance.
Howell made no bones about his own views on wind farms, telling me he prefers battery storage and nuclear power. “As I stand here in my backyard, if they put up windmills, that’s all I’m going to see for 40 miles,” he said.
But Howell did confess to me that he thinks SWIP-N will ultimately be built – if the company is able to get these new sign-offs. What kind of energy flows through a transmission line cannot ultimately affect the decision on the special use permit because, he said, “there are rules.” On top of that, Idaho is going to ultimately need more power no matter what, and at the very least, the state will have to get electrons from elsewhere.
Howell’s “non-political” answer to the fate of SWIP-N, as he put it to me, is that “We live on power, so we gotta have more power.”
The week’s most important news around renewable project fights.
1. Western Nevada — The Esmeralda 7 solar mega-project may be no more.
2. Washoe County, Nevada – Elsewhere in Nevada, the Greenlink North transmission line has been delayed by at least another month.
3. Oconto County, Wisconsin – Solar farm town halls are now sometimes getting too scary for developers to show up at.
4. Apache County, Arizona – In brighter news, this county looks like it will give its first-ever conditional use permit for a large solar farm, EDF Renewables’ Juniper Spring project.
5. Putnam County, Indiana – After hearing about what happened here this week, I’m fearful for any solar developer trying to work in Indiana.
6. Tippecanoe County, Indiana – Two counties north of Putnam lies a test case for the impact a backlash against solar energy can have on data centers.