Why Microsoft, Talen Energy, the Data Center Coalition, and everyone else who objected to PJM’s proposal kinda has a point.

You could be forgiven for thinking data center load flexibility was the wave of the future.
With electricity prices rising — in some cases directly because of substantial new investments to support data centers — and data center developers desperate for power, a new consensus has seemed to form around a way to solve both problems using the existing grid: simply asking data centers to ramp down their energy use at times of peak demand. The whole thing looks like a win-win-win. Researchers have argued that even relatively low levels of curtailment could make room for almost 100 gigawatts of new load on the grid. Goldman Sachs released a report praising data center flexibility, and Google even negotiated a contract to enable flexibility with a Midwestern utility.
So everyone is on board with curtailment, right?
Well, no, at least not in the largest electricity market in the country — and the one that has become the poster child for backlash to data center development.
PJM Interconnection, the 13-state electricity market that spans the Mid-Atlantic and Midwest, has a data center problem. Costs associated with data centers ballooned to over $9 billion, a 174% increase, in its latest capacity auction, where generators get paid for their ability to stay online, according to PJM’s independent market monitor.
The system operator has been working on a process to try to balance getting data centers online without risking the reliability of the grid, and in August unveiled an outline for so-called “Non-Capacity-Backed Load,” describing how new large loads like data centers could have their power curtailed.
“PJM expects that there will be a transitional period where NCBL will be necessary as a result of the significant integration of large loads,” the presentation read. “Participation would ideally be voluntary,” but new loads could be assigned NCBL status “on a mandatory basis if needed.” In other words, new data centers could, under the proposal, be essentially forced to shut down from time to time.
PJM then asked for feedback from its stakeholders. What it got wasn’t positive.
The proposal “clearly intrudes upon state jurisdiction and exceeds the Commission’s authority,” a representative from Microsoft said in a public comment on the proposal. Not only that, it would “fundamentally undercut the very purpose of PJM’s capacity market.” In the end, “the proposed rule won’t solve the problem.”
Multiply that sentiment across nearly 200 pages and imagine it coming from nearly every large company involved in the generation, transmission, and consumption of electricity in one of the most populous markets in the U.S. and you’ll begin to understand just how not positive the reaction truly was.
Several commenters, including data center developers, focused on the proposal’s singling out of particular large loads for special treatment, which they argued ran afoul of what regional transmission organizations like PJM are allowed to do in structuring electricity markets. The Data Center Coalition, a trade group of data center developers, said that PJM “has not provided a defensible rationale for creating this new class of service, and on its face the proposal is unduly discriminatory.”
Like several other stakeholders, the DCC questioned whether PJM was the right actor to create new classes of rates, arguing that type of action “fall[s] squarely within state jurisdiction.” Talen Energy, an independent power company with a significant PJM footprint, also said that the proposal “lies outside of [PJM’s] power to impose.”
Talen, like other power producers, would benefit from a more traditional RTO process, whereby new load induces new demand for energy and capacity, which it could meet (for a price).
“Instead of discriminating against a single form of demand, PJM should focus on improving load forecasting and a market-based solution that encourages more generation supply to be built so that the ‘golden age for American manufacturing and technological dominance’ can be achieved,” the company wrote in its submission.
Even Tyler Norris, the Duke University researcher who has done some of the most widely cited and influential work on data center flexibility, critiqued the proposal, writing on X that there was “much room for improvement” and that it didn’t offer any “defined speed-to-power benefit” for data centers by participating.
The backlash from data center developers shouldn’t be surprising, explained Abraham Silverman, a former lawyer for the New Jersey Board of Public Utilities and an assistant research scholar at Johns Hopkins. “The existing rules are financially very favorable to the data centers. And the reason for that is because both transmission and generation costs are being spread over every customer in the PJM footprint.”
Traditionally, the infrastructure costs of bringing on new load are spread across all customers as a fixed cost, with the idea being that with more customers, over time the fixed costs of the grid go down on a per-customer basis. To the developers and other commenters on PJM’s proposal, this is just how electricity markets and utilities work. Generators and transmission owners don’t ask what the power is being used for, they just supply it. If more generation needs to come online to make sure they can meet that supply, that can happen through the capacity market, where utilities pay generators to be available when demand rises.
But that system may be breaking down as new data centers impose large upfront costs on the whole system that then show up in huge rate increases paid by everyone — to the tune of about 25% in transmission costs for PJM customers since 2020, according to Silverman’s research. That new load must receive reliable service, leading to a bonanza for existing and potential new generators, who can collect growing capacity payments.
“PJM recognizes that it’s between a rock and a hard place, where it potentially has more load coming onto its system than it could reliably serve,” Silverman told me. “They are recognizing they need to have a plan for rationing and allocating available capacity on the electric grid.”
PJM itself may be at risk if data center development leads to higher costs, its independent market monitor argued in a memo: “It is not an overstatement to assert that the ongoing addition of large data center loads will put PJM competitive markets at risk unless there is a solution that requires large data center loads to pay for the costs that they would otherwise impose on other customers.”
While the cranky commenters’ arguments may seem pretextual, or at least self-interested, they aren’t entirely off base, Silverman told me.
“I think there is both a legal and a moral problem here,” Silverman explained. “The moral problem is pretty clear cut: I don’t think anybody really thinks that grandma should be paying higher electric rates because of big tech data centers. The legal question is a little bit harder to answer, and I do think there are legitimate issues on both sides.”
Many of the stakeholder complaints center around the idea that treating large loads or data centers differently is discriminatory in a way that runs afoul of federal energy law. But just because the states may have to get involved in order to put data centers in a special class of electricity customer doesn’t mean that the substantive issues aren’t real.
Some states and regional transmission organizations have started to address the effects of data centers on other users of the grid, most notably Texas, which recently passed a law setting up a mandatory curtailment program for large loads, plus a voluntary demand response program, while Ohio utility AEP reached a deal to make sure data center developers cover the cost of new infrastructure by establishing minimum monthly payments.
PJM will hold another meeting on the proposal later this month and aims to have a proposal ready to present to the Federal Energy Regulatory Commission by the end of the year, although some stakeholders cast doubt on whether PJM could get its act together in time. The Data Center Coalition argued in its comments that the current schedule “does not realistically permit” the “level of deliberation and stakeholder vetting” necessary.
But even if the developers, transmission owners, and generators are able to push off this plan, the conflicts around data center expansion, reliability, and high electricity prices won’t go away.
“At what point do we seriously as a society talk about the trade-offs?” Silverman asked. “I think there are a lot of people who are financially incented to push off that tough conversation.”
Current conditions: Severe storms are bombarding a broad swath of the central U.S. stretching from South Texas to Chicago, spawning more than two dozen tornadoes so far • The thunderstorms pummeling Puerto Rico and the U.S. Virgin Islands are expected to stretch into the weekend • Kigali is also in the midst of a days-long stretch of heavy storms, testing the Rwandan capital’s recent wetland overhaul.
SunZia Wind, the largest renewable energy project of its kind ever built in the U.S., has started generating electricity, nearly capping off a two-decade effort to supply Californians with wind power generated in New Mexico. The developer has begun testing the project’s 916 turbines ahead of planned full-scale commercial operations later this quarter, unnamed sources told E&E News. The project includes 3.5 gigawatts of wind and 550 miles of transmission line to funnel the electricity west from the desert state to the coast. “The impact is already evident,” the newswire wrote. “California broke its record for wind generation eight times in the last four weeks.”
When Heatmap’s Robinson Meyer visited SunZia’s construction site in August 2024, he observed that, once it started running at full blast, the project would “generate roughly 1% of the country’s electricity needs.” Its success in the face of the Trump administration’s attacks on wind could “lay the model” for a new paradigm in which “clean energy builders and environmental protectors work together to find the best solution for the environment and the climate,” Rob wrote. “We will need many more success stories like it if America is to meet its climate goals — 99 more, to be exact.”
The U.S. Senate voted 50-49 on Thursday to repeal a mining ban on land near Minnesota’s Boundary Waters Canoe Area Wilderness, declaring what Heatmap’s Jeva Lange called “open season” on public lands. In what the public lands news site Public Domain called “an unprecedented use of the Congressional Review Act,” the vote slashes protections for the iconic nature preserve. Inspiring even fiercer political pushback is the fact that Republicans championed the effort largely to benefit an overseas corporation: Twin Metals Minnesota, a subsidiary of the Chilean mining conglomerate Antofagasta, which has for years sought to establish a copper-nickel mine on national forest land near the wilderness area. “The Boundary Waters belong to everyone,” Julie Goodwin, a senior attorney at Earthjustice, said in a statement. “They should be protected and enjoyed by all, not jeopardized to benefit a wealthy foreign company.”
At the same time, global demand for both nickel and copper is surging — and a successful effort to decarbonize the world economy through greater electrification will require a lot more of both metals.
The good news: The Department of Energy is allowing the Direct Air Capture hub program started under the Biden administration to move forward. In documents submitted to Congress this week, the agency listed as approved the up to $1.2 billion the program awarded to two projects: Occidental Petroleum’s South Texas DAC Hub, and Climeworks and Heirloom’s joint Project Cypress in Louisiana. As Heatmap’s Emily Pontecorvo noted: “This fate was far from certain.” After the Energy Department cut funding for 10 of the original 21 projects last fall, a leaked list of projects suggested the Louisiana and Texas hubs would be targeted in a second wave of rescissions. The bad news: Last week, Rob had a scoop that Microsoft — whose carbon removal buying made up roughly 80% of the industry — was pausing its purchases. And as he wrote yesterday, even if it’s just temporary, the pause will ripple through the nascent market.
Other technologies that once seemed like science fiction are, in fact, moving forward. In an exclusive for Heatmap, I reported that Clean Core Thorium Energy, a Chicago-based company designing thorium fuel bundles that work in existing reactors, inked a deal to manufacture its first four units. In addition to assembling the bundles, Canadian Nuclear Laboratories will supply the small amount of a special kind of uranium fuel that must be blended into Clean Core’s mix, where it serves as a spark plug for the reaction.
Last October, the Energy Department asked the Federal Energy Regulatory Commission to set rules for connecting data centers, advanced factories, and other large loads to the grid. The move, as Utility Dive reported at the time, sparked controversy over whether it represented a Washington power grab, given that the landmark Federal Power Act gives states jurisdiction over retail electricity interconnections. Now FERC has said it plans to respond. On Thursday, Robin Millican, a researcher at Columbia University’s Center on Global Energy Policy, posted on X that FERC announced a notice of intent to act on the Energy Department’s request, with a ruling expected in June. “Good,” she wrote. “Ensuring interconnection costs from data centers, advanced manufacturing, and big electrification projects aren’t passed to retail customers is overdue.”
Back in January, I told you that two geothermal startups raised a combined $212 million: Zanskar, which uses artificial intelligence to hunt down previously undetected conventional geothermal resources underground; and Sage Geosystems, a next-generation startup using fracking technology to drill for geothermal heat in places that conventional resources can’t tap. This week we saw two geothermal companies once again net a nine-digit number. Once again, Zanskar — considered by the experts Heatmap surveyed to be one of the most promising climate-tech companies in the game right now, and for a reason, it seems — announced the closing of another $40 million fundraise. Just Capital and Spring Lane Capital led the round, with an additional investment from Tierra Adentro Growth Capital. Zanskar said the round was a development capital facility, a type of deal that usually involves equity or debt to fund a company’s growth. It is “among the first ever structured for early-stage geothermal development, drawing on the best practices from the renewables and natural resource sectors,” the company said Thursday in a press release. The financing will help establish a revolving line of credit “designed to accelerate project development.”
On Wednesday, another competitor in the next-generation geothermal space, Mazama Energy, pulled in a fresh round of capital. The Frisco, Texas-based company, which last year boasted a system that reached hotter temperatures than any other geothermal company had achieved, just raised $100 million, according to Axios.

San Diego, once the poster child for a drought-parched Southern Californian city, is now looking to become a water exporter, The Wall Street Journal reported. North America’s largest desalination plant is producing so much freshwater for the San Diego County Water Authority that the city is working on a deal to sell millions of gallons to Arizona and Nevada. The Claude “Bud” Lewis Carlsbad Desalination Plant, which opened in 2015 and is owned by an infrastructure investment firm, may produce water that’s more expensive than average, but “it is important to note that it is more reliable than other sources,” Keith R. Solar, a water attorney from the seaside neighborhood of Point Loma, wrote in the Voice of San Diego last year. “Its value as insurance against disruption of supplies from other sources makes it a critical part of our future.”
Though the tech giant did not say its purchasing pause is permanent, the change will have lasting ripple effects.
What does an industry do when it’s lost 80% of its annual demand?
The carbon removal business is trying to figure that out.
For the past few years, Microsoft has been the buyer of first and last resort for any company that sought to pull carbon dioxide from the atmosphere. In order to achieve an aggressive internal climate goal, the software company purchased more than 70 million metric tons of carbon removal credits, 40 times more than anyone else.
Now, it’s pulling back. Microsoft has informed suppliers and partners that it is pausing carbon removal buying, Heatmap reported last week. Bloomberg and Carbon Herald soon followed. The news has rippled through the nascent industry, convincing executives and investors that lean years may be on the way after a period of rapid growth.
“For a lot of these companies, their business model was, ‘And then Microsoft buys,’” said Julio Friedmann, the chief scientist at Carbon Direct, a company that advises and consults with companies — including, yes, Microsoft — on their carbon management projects, in an interview. “It changes their business model significantly if Microsoft does not buy.”
Microsoft told me this week that it has not ended the purchasing program. It still aims to become carbon negative by 2030, meaning that it must remove more climate pollution from the atmosphere than it produces in that year, according to its website. Its ultimate goal is to eliminate all 45 years of its historic carbon emissions from electricity use by 2050.
“At times, we may adjust the pace or volume of our carbon removal procurement as we continue to refine our approach toward sustainability goals,” Melanie Nakagawa, Microsoft’s chief sustainability officer, said in a statement. “Any adjustments we make are part of our disciplined approach — not a change in ambition.”
Yet even a partial pullback will alter the industry. Over the past five years, carbon removal companies have raised more than $3.6 billion, according to the independent data tracker CDR.fyi. Startups have invested that money into research and equipment, expecting that voluntary corporate buyers — and, eventually, governments — will pay to clean up carbon dioxide in the air.
Although many companies have implicitly promised to buy carbon removal credits — they’re all but implied in any commitment to “net zero” — nobody bought more than Microsoft. The software company purchased 45 million tons of carbon removal last year alone, according to its own data.
The next biggest buyer of carbon removal credits — Frontier, a coalition of large companies led by the payments processing firm Stripe — has bought 1.8 million tons total since launching in 2022.
With such an outsize footprint, Microsoft’s carbon removal team became the de facto regulator for the early industry — setting prices, analyzing projects, and publishing in-house standards for public consumption.
It bought from virtually every kind of carbon removal company, purchasing from large-scale, factory-style facilities that use industrial equipment to suck carbon from the air, as well as smaller and more natural solutions that rely on photosynthesis. One of its largest deals was with the city-owned utility for Stockholm, Sweden, which is building a facility to capture the carbon released when plant matter is burned for energy.
That it would some day stop buying shouldn’t be seen as a surprise, Hannah Bebbington, the head of deployment at the carbon-removal purchasing coalition Frontier, told me. “It will be inevitable for any corporate buyer in the space,” she said. “Corporate budgets are finite.”
Frontier’s members include Google, McKinsey, and Shopify. The coalition remains “open for business,” she said. “We are always open to new buyers joining Frontier.”
But Frontier — and, certainly, Microsoft — understands that the real point of voluntary purchasing programs is to prime the pump for government policy. That’s both because governments play a central role in spurring along new technologies — and because, when you get down to it, governments already handle disposal for a number of different kinds of waste, and carbon dioxide in the air is just another kind of waste. (On a per ton basis, carbon removal may already be price-competitive with municipal trash pickup.)
“The end game here is government support in the long-term period,” Bebbington said. “We will need a robust set of policies around the world that provide permanent demand for high-quality, durable CDR funds.”
“The voluntary market plays a critical role right now, but it won’t scale, and we don’t expect it will scale to the size of the problem,” she added.
Only a handful of companies had the size and scale to sell carbon credits to Microsoft, which tended to place orders in the millions of tons, Jack Andreasen Cavanaugh, a researcher at the Center on Global Energy Policy at Columbia University, told me on a recent episode of Heatmap’s podcast, Shift Key. Those companies will now be competing with fledgling firms for a market that’s 80% smaller than it used to be.
“Fundamentally, what it will mean is just an acceleration of something that was going to happen anyway, which is consolidation and bankruptcies or dissolutions,” Cavanaugh told me. “This was always going to happen at this moment because we don’t have supportive policy.”
Friedmann agreed with the dour outlook. “We will see the best companies and the best projects make it. But a lot of companies will fail, and a lot of projects will fail,” he told me.
To some degree, Microsoft planned for that eventuality in its purchase scheme. The company signed long-term offtake contracts with companies to “pay on delivery,” meaning that it will only pay once tons are actually shown to be durably dealt with. That arrangement will protect Microsoft’s shareholders if companies or technologies fail, but means that it could conceivably keep paying out carbon removal firms for the next 10 years, Noah Deich, a former Biden administration energy official, told me.
The pause, in other words, spells an end to new dealmaking, but it does not stop the flow of revenue to carbon removal companies that have already signed contracts with Microsoft. “The big question now is not who will the next buyer be in 2026,” Deich said. “It is who is actually going to deliver credits and do so at scale, at cost, and on time.”
Deich, who ran the Energy Department’s carbon management programs, added that Microsoft has been as important to building the carbon removal industry as Germany was to creating the modern solar industry. That country’s feed-in tariff, which started in 2000, is credited with driving so much demand for solar panels that it spurred a worldwide wave of factory construction and manufacturing innovation.
“The idea that a software company could single-handedly make the market for a climate technology makes about as much sense as the country of Germany — with the same annual solar insolation as Alaska — making the market for solar photovoltaic panels,” Deich said, referencing the comparatively low amount of sunlight that it receives. “But they did it. Climate policy seems to defy Occam’s razor a lot, and this is a great example of that.”
History also shows what could happen if the government fails to step up. In the 1980s, the U.S. government — which had up to that point been the world’s No. 1 developer of solar panel technology — ended its advance purchase program. Many American solar firms sold their patents and intellectual property to Japanese companies.
Those sales led to something of a lost decade for solar research worldwide and ultimately paved the way for East Asian manufacturing companies — first in Japan, and then in China — to dominate the solar trade, Deich said. If the U.S. government doesn’t step up soon, then the same thing could happen to carbon removal.
The climate math still relied upon by global governments to guide their national emissions targets assumes that carbon removal technology will exist and be able to scale rapidly in the future. The Intergovernmental Panel on Climate Change says that many outcomes where the world holds global temperatures to 1.5 or 2 degrees Celsius by the end of the century will involve some degree of “overshoot,” where carbon removal is used to remove excess carbon from the atmosphere.
By one estimate, the world will need to remove 7 billion to 9 billion tons of carbon from the atmosphere by the middle of the century in order to hold to Paris Agreement goals. You could argue that any scenario where the world meets “net zero” will require some amount of carbon removal because the word “net” implies humanity will be cleaning up residual emissions with technology. (Climate analysts sometimes distinguish “net zero” pathways from the even-more-difficult “real zero” pathway for this reason.)
Whether humanity has the technologies that it needs to eliminate emissions then will depend on what governments do now, Deich said. After all, the 2050s are closer to today than the 1980s are.
“It’s up to policymakers whether they want to make the relatively tiny investments in technology that make sure we can have net-zero 2050 and not net-zero 2080,” Deich said.
Congress has historically supported carbon removal more than other climate-critical technologies. The bipartisan infrastructure law of 2021 funded a new network of industrial hubs specializing in direct air capture technology, and previous budget bills created new first-of-a-kind purchasing programs for carbon removal credits. Even the Republican-authored One Big Beautiful Bill Act preserved tax incentives for some carbon removal technologies.
But the Trump administration has been far more equivocal about those programs. The Department of Energy initially declined to spend some funds authorized for carbon removal schemes, and in some cases redirected the funds — potentially illegally — to other purposes. (Carbon removal advocates got good news on Wednesday when the Energy Department reinstated $1.2 billion in grants to the direct air capture hubs.)
Those freezes and reallocations fit into the Trump administration’s broader war on federal climate policy. In part, Trump officials have seemed reluctant to signal that carbon might be a public problem — or something that needs to be “removed” or “managed” — in the first place.
Other countries have started preliminary carbon management programs: Norway, the United Kingdom, and Canada have all launched pilots in recent years. The European carbon market will also soon publish rules guiding how carbon removal credits can be used to offset pollution.
But in the absence of a large-scale federal program in the U.S., lean years are likely coming, observers said.
“I am optimistic that [carbon removal] will continue to scale, but not like it was,” Friedmann said. “Microsoft is a symptom of something that was coming.”
“The need for carbon removal has not changed,” he added.
What happens when one of energy’s oldest bottlenecks meets its newest demand driver?
Often the biggest impediment to building renewable energy projects or data center infrastructure isn’t getting government approvals, it’s overcoming local opposition. When it comes to the transmission that connects energy to the grid, however, companies and politicians of all stripes are used to being most concerned about those at the top – the politicians and regulators at every level who can’t seem to get their acts together.
What will happen when the fiery fights on each end of the wire meet the broken, unplanned spaghetti monster of grid development our country struggles with today? Nothing great.
The transmission fights of the data center boom have only just begun. Utilities will have to spend lots of money on getting energy from Point A to Point B – at least $500 billion over the next five years, to be precise. That’s according to a survey of earnings information published by think tank Power Lines on Tuesday, which found roughly half of all utility infrastructure spending will go toward the grid.
But big wires aren’t very popular. When Heatmap polled Americans on various types of energy projects last September, we found that self-identified Democrats and Republicans were mostly neutral on large-scale power lines. Independent voters, though? Transmission was their second least preferred technology, ranking above only coal power.
Making matters far more complex, grid planning is spread out across decision-makers. At the regional level, governance is split into 10 areas overseen by regional transmission organizations, known as RTOs, or independent system operators, known as ISOs. RTOs and ISOs plan transmission projects, often proposing infrastructure to keep the grid resilient and functional. These bodies are also tasked with planning the future of their own grids, or at least they are supposed to be – many observers have decried RTOs and ISOs as outmoded and slow to respond. Utilities and electricity co-ops also do this planning at various scales. And each of these bodies must navigate federal regulators and permitting processes and the utility commissions of each state they touch, on top of the usual raft of local authorities.
The mid-Atlantic region is overseen by PJM Interconnection, a body now under pressure from state governors in the territory to ensure the data center boom doesn’t unnecessarily drive up costs for consumers. The irony, though, is that these governors are going to be under incredible pressure to have their states act against individual transmission projects in ways that will eventually undercut affordability.
Virginia, for instance – known now as Data Center Alley – is flanked by states that are politically diverse. West Virginia is now a Republican stronghold, but was long a Democratic bastion. Maryland had a Republican governor only a few years ago. Virginia and Pennsylvania regularly change party control. These dynamics are among the many drivers of the opposition to the Piedmont Reliability Project, which would run from a nuclear plant in Pennsylvania to northern Virginia, cutting across spans of Maryland farmland ripe for land use conflict. The timeline for the project is currently unclear due to administrative delays.
Another major fight is brewing with NextEra’s Mid-Atlantic Resiliency Link, or MARL project. Spanning four states – and therefore four utility commissions – the MARL was approved by PJM Interconnection to meet rising electricity demand across West Virginia, Virginia, Maryland and Pennsylvania. It still requires approval from each state utility commission, however. Potentially affected residents in West Virginia are hopping mad about the project, and state Democratic lawmakers are urging the utility commission to reject it.
In West Virginia, as well as Virginia and Maryland, NextEra has applied for a certificate of public convenience and necessity to build the MARL project, a permit that opponents have claimed would grant it the authority to exercise eminent domain. (NextEra has said it will do what it can to work well with landowners. The company did not respond to a request for comment.)
“The biggest problem facing transmission is that there’s so many problems facing transmission,” said Liza Reed, director of climate and energy at the Niskanen Center, a policy think tank. “You have multiple layers of approval you have to go through for a line that is going to provide broader benefits in reliability and resilience across the system.”
Hyperlocal fracases certainly do matter. Reed explained to me that “often folks who are approving the line at the state or local level are looking at the benefits they’re receiving – and that’s one of the barriers transmission can have.” That is, when one state utility commission looks at a power line project, it’s essentially forced to evaluate the costs and benefits of just the portion within its borders.
She pointed to the example of a Transource line proposed by PJM almost 10 years ago to send excess capacity from Pennsylvania to Maryland. It wasn’t delayed by protests over the line itself – the Pennsylvania Public Utilities Commission opposed the project because it thought the result would be net higher electricity bills for folks in the Keystone State, regardless of the revenue from selling the electricity to Maryland or the consumer benefits for their southern neighbors. The lesson: Whoever feels they’re getting the short end of the line will likely try to block it, and there’s little to nothing anyone else can do about it.
These hyperlocal fears about projects with broader regional benefits can be easy targets for conservation-focused environmental advocates. Not only could they take your land, the argument goes, they’re also branching out to states with dirtier forms of energy that could pollute your air.
“We do need more energy infrastructure to move renewable energy,” said Julie Bolthouse, director of land use for the Virginia conservation group Piedmont Environmental Council, after I asked her why she’s opposing lots of the transmission in Virginia. “This is pulling away from that investment. This is eating up all of our utility funding. All of our money is going to these massive transmission lines to give this incredible amount of power to data centers in Virginia when it could be used to invest in solar, to invest in transmission for renewables we can use. Instead it’s delivering gas and coal from West Virginia and the Ohio River Valley.”
Daniel Palken of Arnold Ventures, who previously worked on major pieces of transmission reform legislation in the U.S. Senate, said when asked if local opposition was a bigger problem than macro permitting issues: “I do not think local opposition is the main thing holding up transmission.”
But then he texted me to clarify. “What’s unique about transmission is that in order for local opposition to even matter, there has to be a functional planning process that gets transmission lines to the starting line. And right now, only about half the country has functional regional planning, and none of the country has functional interregional planning.”
It’s challenging to fathom a solution to such a fragmented, nauseating puzzle. One answer could come from Congress, where climate hawks and transmission reform champions want to give the Federal Energy Regulatory Commission primacy over transmission line approvals, as it has over gas pipelines. That would at the very least contain any conflicts over transmission lines to one deciding body.
“It’s an old saw: Depending on the issue, I’ll tell you that I’m supportive of states’ rights,” Representative Sean Casten told me last December. “[I]t makes no sense that if you want to build a gas pipeline across multiple states in the U.S., you go to FERC and they are the sole permitting authority and they decide whether or not you get a permit. If you go to the same corridor and build an electric transmission that has less to worry about because there’s no chance of leaks, you have a different permitting body every time you cross a state line.”
Another solution could come from the tech sector thinking on its feet. Google, for example, is investing in “advanced” transmission projects like reconductoring, which the company says will allow it to increase the capacity of existing power lines. Microsoft is also experimenting with smaller superconductor lines it claims deliver the same amount of power as traditional wires.
But this space is still in its infancy. “Getting into the business of transmission development is very complicated and takes a lot of time. That’s why we’ve seen data centers trying a lot of different tactics,” Reed said. “I think there’s a lot of interest, but turning that into specific projects and solutions is still to come. I think it’s also made harder by how highly local these decisions are.”