
Electricity Markets Aren’t Working Anymore

Why regional transmission organizations as we know them might not survive the data center boom.


As the United States faces its first significant increase in electricity demand in decades, the grid itself is not only aging, but also straining against the financial, logistical, and legal barriers to adding new supply. It’s enough to make you wonder: What’s the point of an electricity market, anyway?

That’s the question some stakeholders in the PJM Interconnection, America’s largest electricity market, started asking loudly and in public in response to the grid operator’s proposal that new large energy users could become “non-capacity backed load,” i.e., loads that can be forced to shut off whenever PJM deems it necessary.

PJM, which covers 13 states from the Mid-Atlantic to the Midwest, has been America’s poster child for the struggle to get new generation online as data center development surges. PJM has warned that it will have “just enough generation to meet its reliability requirement” in 2026 and 2027, and its independent market monitor has said that the costs associated with serving that new and forecast demand have already reached the billions, translating to higher retail electricity rates in several PJM states.

As Heatmap has covered, however, basically no one in the PJM system — transmission owners, power producers, and data center developers — was happy with the details of PJM’s plan to deal with the situation. In public comments on the proposed rule, many brought up a central conflict between utilities’ historic duty to serve and the realities of the modern power market. More specifically, electricity markets like PJM are supposed to deal with wholesale electricity sales, not the kind of core questions of who gets served and when, which are left to the states.

On the power producer side, major East Coast supplier Talen Energy wrote, “The NCBL proposal exceeds PJM’s authority by establishing a regime where PJM holds the power to withhold electric service unlawfully from certain categories of large load.” The utility Exelon added that owners of transmission “have a responsibility to serve all customers—large, small, and in between. We are obligated to provide both retail and wholesale electric service safely and reliably.” And last but far from least, Microsoft, which has made itself into a leader in artificial intelligence, argued, “A PJM rule curtailing non-capacity-backed load would not only unlawfully intrude on state authority, but it would also fundamentally undercut the very purpose of PJM’s capacity market.”

This is just one small piece of a debate that’s been heating up for years, however, as more market participants, activists, and scholars question whether the markets that govern much of the U.S. electric grid are delivering power as cheaply and abundantly as they were promised to. Some have even suggested letting PJM utilities build their own power plants again, effectively reversing the market structure of the past few decades.

But questioning whether all load must be served would be an even bigger change.

The “obligation to serve all load has been a core tenet of electricity policy,” Rob Gramlich, the president of Grid Strategies LLC, told me. “I don’t recall ever seeing that be questioned or challenged in any fundamental way” — an illustration of how dire things have become.

The U.S. electricity system was designed for abundance. Utilities would serve any user, and the per-user costs of developing the fixed infrastructure necessary to serve them would drop as more users signed up.

But the planned rush of data center investments threatens to stick all ratepayers with the cost of new transmission and generation driven overwhelmingly by one class of customer. There is already a brewing local backlash against new data centers, and electricity prices have been rising faster than inflation. New data center load could also have climate consequences if utilities decide to keep aging coal online and build new natural gas-fired power plants over and above their pre-data center boom (and pre-Trump) plans.

“AI has dramatically raised the stakes, along with enhancing worries that heightened demand will mean more burning of fossil fuels,” law professors Alexandra Klass of the University of Michigan and Dave Owen of the University of California write in a preprint paper to be published next year.

In an interview, Klass told me, “There are huge economic and climate implications if we build a whole lot of gas and keep coal on, and then demand is lower because the chips are better,” referring to the possibility that data centers and large language models could become dramatically more energy efficient, rendering the additional fossil fuel-powered supply unnecessary. Even if the projects are not fully built out or utilized, the country could face a situation where “ratepayers have already paid for [grid infrastructure], whether it’s through those wholesale markets or through their utilities in traditionally regulated states,” she said.

The core tension between AI development and the power grid, Klass and Owen argue, is the “duty to serve,” or “universal service” principle that has underlain modern electricity markets for over a century.

“The duty to serve — to meet need at pretty much all times — worked for utilities because they got to pass through their costs, and it largely worked for consumers because they didn’t have to deal very often with unpredictable blackouts,” Owen told me.

“Once you knew how to build transmission lines and build power plants,” Klass added, “there was no sense that you couldn’t continue to build to serve all customers. We could build power plants, and the regulatory regime came up in a context where we could always build enough to meet demand.”

How and why that happened goes back to the earliest days of electrification.

As the power industry developed in the late 19th and early 20th centuries, a regulated utility model emerged in which monopoly utilities built both power plants and the transmission and distribution infrastructure necessary to deliver that power to customers. To let utilities achieve the economies of scale required to serve customers efficiently and affordably, regulators granted them monopolies over defined service territories, on the condition that they serve anyone and everyone within them.

With a secure base of ratepayers, utilities could raise money from investors to build infrastructure, which could then be put into a “rate base” and recouped from ratepayers over time at a fixed return. In exchange, the utilities accepted regulation from state governments over their pricing and future development trajectories.
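The arithmetic behind that bargain is simple, and a rough sketch makes clear why “serve everyone” worked: the utility’s allowed annual collections are fixed by its rate base and approved return, so the per-customer share falls as the customer base grows. All figures below are hypothetical, chosen only to illustrate the mechanism.

```python
# Illustrative sketch of regulated-utility cost recovery. All numbers are
# hypothetical; the formula is the textbook revenue-requirement structure:
# rate base * allowed return, plus depreciation and operating costs.

def revenue_requirement(rate_base, allowed_return, depreciation, operating_costs):
    """Annual amount the utility is permitted to collect from ratepayers."""
    return rate_base * allowed_return + depreciation + operating_costs

def per_customer_cost(total_requirement, num_customers):
    """Average fixed-cost burden per customer."""
    return total_requirement / num_customers

# A $2 billion rate base earning a 9.5% return, with $80M depreciation
# and $150M in operating costs, yields a $420M annual requirement.
req = revenue_requirement(2_000_000_000, 0.095, 80_000_000, 150_000_000)

# The per-customer share of fixed costs falls as the customer base grows --
# the economies-of-scale logic of the traditional monopoly model.
for customers in (500_000, 1_000_000, 2_000_000):
    print(customers, per_customer_cost(req, customers))  # 840.0, 420.0, 210.0
```

The same arithmetic run in reverse is the data center worry: a large new slug of rate-base investment spread over an unchanged customer count raises every customer’s share.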

That vertically integrated system began to crack, however, as ratepayers revolted over high costs from capital investments by utilities, especially from nuclear power plants. Following the deregulation of industries such as trucking and air travel, federal regulators began to try to break up the distribution and generation portions of the electricity industry. In 1999, after some states and regions had already begun to restructure their electricity markets, the Federal Energy Regulatory Commission encouraged the creation of regional transmission organizations like PJM.

Today, electricity markets in some 35 states are partially or entirely restructured, with Texas operating its own, isolated electricity market beyond the reach of federal regulation. In PJM and other RTOs, electricity is (more or less) sold competitively on a wholesale basis by independent power producers to utilities, which then serve customers.

But the system as it’s constructed now may, critics argue, expose retail customers to unacceptable cost increases — and greenhouse gas emissions — as it attempts to grapple with serving new data center load.

Klass and Owen, for their part, point to other markets, such as those governing natural gas or even Western water rights, as models for how electricity could work without the assumptions of plentiful supply that electricity markets have historically made.

Interruptions of natural gas service became more common starting in the 1970s, when some natural gas services were underpriced thanks to price caps, leading to an imbalance between supply and demand. In response, regulators “established a national policy of curtailment based on end use,” Klass and Owen write, with residential users getting priority “because of their essential heating needs, followed by firm industrial and commercial customers, and finally, interruptible customers.” Natural gas was deregulated in the late 1970s and 1980s, with curtailment becoming more market-based, which also allowed natural gas customers to trade capacity with each other.

Western water rights, meanwhile, are notoriously opaque and contested — but, importantly, they are based on scarcity, and thus may provide lessons in an era of limited electricity supply. The “prior appropriation” system water markets use is, “at its core, a set of mechanisms for allocating shortage,” the authors write. Water users have “senior” and “junior” rights, with senior users “entitled to have their rights fulfilled before the holders of newer, or more ‘junior,’ water rights.” These rights can be transferred, and junior users have found ways to work with what water they can get, with the authors citing extensive conservation efforts in Southern California compared to the San Francisco Bay Area, which tends to hold more senior rights.

With these models in mind, Klass and Owen propose a system called “demand side connect-and-manage,” whereby new loads would not necessarily get transmission and generation service at all times, and where utilities could curtail users and electricity customers would have the ability “to use trading to hedge against the risk of curtailments.”
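A seniority-based allocation of this kind is mechanically straightforward. The sketch below, loosely modeled on the prior-appropriation analogy and the natural gas end-use priority tiers described above, serves the most senior loads first and curtails the most junior when supply falls short. The load names, tiers, and megawatt figures are all invented for illustration; this is not PJM’s or the authors’ actual mechanism.

```python
# Hypothetical sketch of seniority-based curtailment: serve senior loads
# first, curtail the most junior when supply falls short. All data invented.

from dataclasses import dataclass

@dataclass
class Load:
    name: str
    demand_mw: float
    priority: int        # lower = more senior tier (0 = residential/firm,
                         # 1 = firm industrial, 2 = interruptible/new load)
    connected_year: int  # earlier connection = more senior within a tier

def allocate(loads, available_mw):
    """Return {name: MW served}, filling senior loads before junior ones."""
    served = {}
    # Sort by priority tier, then by connection date, and grant each load
    # as much of its demand as remaining supply allows.
    for load in sorted(loads, key=lambda l: (l.priority, l.connected_year)):
        grant = min(load.demand_mw, available_mw)
        served[load.name] = grant
        available_mw -= grant
    return served

loads = [
    Load("residential", 600, 0, 1950),
    Load("factory", 200, 1, 1995),
    Load("data_center_a", 300, 2, 2023),
    Load("data_center_b", 300, 2, 2025),
]

# With 1,000 MW available against 1,400 MW of demand, the newest
# non-capacity-backed load is curtailed entirely, and the next-newest
# is partially served.
print(allocate(loads, 1000))
```

The trading piece of Klass and Owen’s proposal would sit on top of an ordering like this: a junior load facing curtailment could buy a more senior position, or capacity, from a load willing to sell it.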

“We can connect you now before we build a whole lot of new generation, but when we need to, we’re going to curtail you,” Klass said, describing her and Owen’s proposal.

Tyler Norris, a Duke University researcher who has published concept-defining work on data center flexibility, called the paper “one of the most important contributions yet toward the re-examination of basic assumptions of U.S. electricity law that’s urgently needed as hyperscale load growth pushes our existing regulatory system beyond its limits.”

While electricity may not be literally drying up, he told me, “when you are supply side constrained while demand is growing, you have this challenge of, how do you allocate scarcity?”

Unlike the PJM proposals, “Our paper was very focused on state law,” Klass told me. “And that was intentional, because I think this is trickier at the federal level.”

Some states are already embracing similar ideas. Ohio regulators, for instance, established a data center tariff that tries to protect customers from higher costs by forcing data centers to make minimum payments regardless of their actual electricity use. Texas also passed a law that allows for some curtailment of large loads and reforms the interconnection process to keep the interconnection queue from filling up with speculative projects that could generate infrastructure costs without real electricity demand.

Klass and Owen write that their idea may be more of “a temporary bridging strategy, primarily for periods when peak demand outstrips supply or at least threatens to do so.”

Even those who don’t think the principles underlying electricity markets need to be rethought see the need — at least in the short term — for new options for large new power users who may not get all the power they want all of the time.

“Some non-firm options are necessary in the short term,” Gramlich told me, referring to ideas like Klass and Owen’s, Norris’s, and PJM’s. “Some of them are going to have some legal infirmities and jurisdictional problems. But I think no matter what, we’re going to see some non-firm options. A lot of customers, a lot of these large loads, are very interested, even if it’s a temporary way to get connected while they try to get the firm service later.”

If electricity markets have worked for over a hundred years on the principle that more customers bring down costs for everyone, going forward we may have to get choosier — or pay the price.
