New rules governing how companies report their scope 2 emissions have pitted tech giant against tech giant and scholars against each other.

All summer, as the repeal of wind and solar tax credits and the surging power demands of data centers captured the spotlight, a more obscure but equally significant clean energy fight was unfolding in the background. Sustainability executives, academics, and carbon accounting experts have been sparring for months over how businesses should measure their electricity emissions.
The outcome could be just as consequential for shaping renewable energy markets and cleaning up the power grid as the aforementioned subsidies — perhaps even more so because those subsidies are going away. It will influence where and how — and potentially even whether — companies continue to voluntarily invest in clean energy. It has pitted tech heavyweights like Google and Microsoft against peers Meta and Amazon, all of which are racing each other to power their artificial intelligence operations without abandoning their sustainability commitments. And it could affect the pace of emissions reductions for decades to come.
In essence, the fight is over how to appraise the climate benefits of companies’ clean power purchases. The arena is the Greenhouse Gas Protocol, a nonprofit that creates voluntary emissions reporting standards. Companies use these standards to calculate emissions from their direct operations, from the electricity and gas that powers and heats their buildings, and from their supply chains. If you’ve ever seen a brand claim it “runs on 100% renewable energy,” that statement is likely backed by a Greenhouse Gas Protocol-sanctioned methodology.
For years, however, critics have poked holes in the group’s accounting rules and assumptions, charging it with enabling greenwashing. In response, the organization has decided to overhaul its standards, including for how companies should measure their electricity footprint, known as “scope 2” emissions.
The Greenhouse Gas Protocol first convened a technical working group to revise its Scope 2 Standard last September. By late June, the group had finalized a draft proposal with more rigorous criteria for clean energy claims, despite intense pushback on the underlying direction from companies and clean energy groups.
A flurry of op-eds, essays, and LinkedIn posts accused the working group of being on the “wrong track,” and called the proposal a “disaster” with “unintended consequences.” The Clean Energy Buyers Association, a trade group, penned a letter saying it was “inefficient and infeasible for most buyers and may curtail ambitious global climate action.” Similarly, the American Council on Renewable Energy warned that the plan “could unintentionally chill investment and growth in the clean energy sector.”
Next, the draft will face a 60-day public consultation period beginning in early October. “There’ll be pushback from every direction,” Matthew Brander, a professor of carbon accounting at the University of Edinburgh and a member of the Scope 2 Working Group, told me. Ultimately, it will be up to the Working Group, the Protocol’s Independent Standards Board, and its Steering Committee to decide whether the proposal will be adopted or significantly revised.
The challenge of creating a defensible standard begins with the fundamental physics of electricity. On the power grid, electrons from coal- and natural gas-fired power plants intermingle with those from wind and solar farms. There’s no way for companies hooking up to the grid to choose which electrons get delivered to their doors or opt out of certain resources. So if they want to reduce their carbon footprints, they can either decrease their energy consumption — by making their operations more efficient, say, or installing on-site solar panels — or they can turn to financial instruments such as renewable energy certificates, or RECs.
In general, a REC certifies that one megawatt-hour of clean power was generated, at some point, somewhere. The current Scope 2 Standard treats all RECs as interchangeable, but in reality, some RECs are far more effective than others at reducing emissions. The question now is how to improve the standard to account for these differences.
“There is no absolute truth,” Wilson Ricks, an engineering postdoctoral researcher at Princeton University and working group member, told me back in June. “I mean, there are more or less absolute truths about things like how much emissions are going into the atmosphere. But the system for how companies report a certain number, and what they’re able to claim about that number, is ultimately up to us.”
The current standard, finalized in 2015, instructs companies to report two numbers for their scope 2 emissions, based on two different methodologies. The formula for the first is straightforward: multiply the amount of electricity your facilities consume in a given year by the average emissions produced by the local power grids where you operate. This “location-based” number is a decent approximation of the carbon emitted as a result of the company’s actual energy use.
If the company buys RECs or similar market-based instruments, it can also calculate its “market-based” emissions. Under the 2015 standard, if a company consumed 100 megawatt-hours in a year and bought 100 megawatt-hours’ worth of certificates from a solar farm, it could report that its scope 2 emissions, under the market-based method, were zero. This is what enables companies to claim they “run on 100% renewable energy.”
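The two methods boil down to simple arithmetic. As a rough illustration, with invented numbers and function names (this is a sketch of the general logic, not an official GHG Protocol calculator):

```python
# Illustrative sketch of the two scope 2 methods under the 2015 standard.
# All figures and names are hypothetical.

def location_based_emissions(consumption_mwh: float,
                             grid_factor_tco2_per_mwh: float) -> float:
    """Annual consumption times the average emissions rate of the local grid."""
    return consumption_mwh * grid_factor_tco2_per_mwh

def market_based_emissions(consumption_mwh: float,
                           rec_mwh: float,
                           residual_factor_tco2_per_mwh: float) -> float:
    """MWh covered by RECs count as zero; any uncovered remainder is
    multiplied by an emissions factor for the residual grid mix."""
    uncovered_mwh = max(consumption_mwh - rec_mwh, 0.0)
    return uncovered_mwh * residual_factor_tco2_per_mwh

# A company consuming 100 MWh on a grid averaging 0.4 tCO2/MWh:
print(location_based_emissions(100, 0.4))     # 40.0 tCO2
# The same company holding 100 MWh of solar RECs reports zero
# market-based emissions, regardless of when the solar was generated:
print(market_based_emissions(100, 100, 0.4))  # 0.0 tCO2
```

The gap between those two numbers is exactly what lets a company claim it “runs on 100% renewable energy” while its buildings draw whatever mix the local grid delivers.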
RECs are fundamentally different from carbon offsets, in that they do not certify that any specific amount of emissions has been prevented. They can cut carbon indirectly by creating an additional revenue stream for renewable energy projects. But when a company buys RECs from a solar project in California, where the grid is saturated with solar, it will do less to reduce emissions than if it bought RECs from a solar project in Wyoming, where the grid is still largely powered by coal, or from a battery storage project in California, which can produce clean power at night.
There are other ways RECs can vary — for instance, companies can buy them directly from power producers by means of a long-term contract, or as one-off purchases on the spot market. Spot market REC purchases are generally less effective at displacing fossil fuels because they’re more likely to come from pre-existing wind and solar farms — sometimes ones that have been operating for years and would continue with or without REC sales. Long-term contracts, by contrast, can help get new clean energy projects built, because the guaranteed revenue helps developers secure financing. (There are exceptions to these rules, but these are broadly the dynamics.)
All this is to say that the current standard allows for two companies that consumed the same amount of power and bought the same number of RECs to report that they have “zero emissions,” even if one helped reduce emissions by a lot and the other did little to nothing. Almost everyone agrees the situation can be improved. The question is how.
The proposal set for public comment next month introduces more granularity to the rules around RECs. Instead of tallying up annual aggregate energy use, companies would have to tally it up by hour and location. And for purchased RECs to count toward lowering a company’s scope 2 footprint, they would have to be generated within the same grid region as the company’s operations and matched to a distinct hour of consumption. (This “hourly matching” approach may sound familiar to anyone who followed the fight over the green hydrogen tax credit rules.)
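In spirit, the change swaps the unit of account from one annual total to hour-by-hour pairs. A minimal sketch, with hypothetical data structures (the draft standard’s actual rules are far more detailed, and also require regional matching):

```python
# Sketch of annual vs. hourly matching. Data structures are hypothetical.

def annual_match_pct(consumption: dict, procurement: dict) -> float:
    """Current approach: compare yearly totals, ignoring time and place."""
    return min(sum(procurement.values()) / sum(consumption.values()), 1.0) * 100

def hourly_match_pct(consumption: dict, procurement: dict) -> float:
    """Proposed approach: credit clean MWh only up to what was consumed
    in the same hour; surplus in one hour can't cover a deficit in another."""
    matched = sum(min(mwh, procurement.get(hour, 0.0))
                  for hour, mwh in consumption.items())
    return matched / sum(consumption.values()) * 100

# A data center that runs around the clock but buys only solar RECs:
consumption = {"noon": 10.0, "midnight": 10.0}
solar_recs = {"noon": 20.0}  # all generation falls in daylight hours
print(annual_match_pct(consumption, solar_recs))  # 100.0
print(hourly_match_pct(consumption, solar_recs))  # 50.0 -- midnight unmatched
```

Under annual accounting the company looks fully matched; under hourly accounting, half its consumption has no clean counterpart, which is what would push buyers toward storage, geothermal, and other around-the-clock resources.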
Proponents see this as a way to make companies’ claims more credible — businesses would no longer be able to say they were using solar power at night, or wind power generated in Texas to supply a factory in Maine. While companies would still not be literally consuming the power from the RECs they buy, it would at least be theoretically possible that they could be. “It’s really, in my view, taking how we do electricity accounting back to some fundamentals of how the power system itself works,” Killian Daly, executive director of the nonprofit EnergyTag, which advocates for hourly matching, told me.
The granularity camp also argues that these rules create better incentives. Today, companies mostly buy solar RECs because they’re cheap and abundant. But solar alone can’t get us to zero emissions electricity, Ricks told me. Hourly matching will force companies to consider signing contracts with energy storage and geothermal projects, for example, or reducing their energy use during times when there’s less clean energy available. “It incentivizes the actions and investments in the technologies and business practices that will be needed to actually finish the job of decarbonizing grids,” he said.
While the standard is technically voluntary, companies that object to the revision will likely be stuck with it, as governments in California and Europe have started to integrate the Greenhouse Gas Protocol’s methodologies into their mandatory corporate disclosure rules.
The proposal’s critics, however, contend that time and location matching will be so costly and difficult to implement that it may lead companies to simply stop buying clean energy. One analysis by the electricity data science nonprofit WattTime found that the draft revision could increase emissions compared to the status quo if it causes a decline in corporate clean power procurement. “We’re looking at a potentially really catastrophic failure of the renewable energy market,” Gavin McCormick, the co-founder and executive director of WattTime, told me.
Another concern is that companies with operations in multiple regions could shift from signing long-term contracts for RECs, often called power purchase agreements, to relying on the spot market. These contracts generally need to be large to pencil out, because negotiating multiple offtake agreements for a single renewable energy project increases costs and risk for developers. Such deals may still make sense for big energy users like data centers, but a company like Starbucks, with cafes throughout the country, would have to source smaller volumes of RECs in more places to cover all the parts of the world where it operates.
The granularity fans assert that their proposal will not be as challenging or expensive as critics claim — and regardless, they argue, real decarbonization is difficult. It should be hard for companies to make bold claims like saying they are 100% clean, Daly told me. “We need to get to a place where companies can be celebrated for being like, I’m not 100% matched, but I will be in five years,” he said.
The proposal does include carve-outs allowing smaller companies to continue to use annual matching, and for legacy clean energy contracts, even if they don’t meet hourly or location requirements. But critics like McCormick argue that the whole point of revising the standard is to help catalyze greater emission reductions. Less participation in the market would hurt that goal. And more than that, they say, these accounting rules aren’t designed to measure emissions impact, let alone to maximize real-world emission reductions. You could still have one company that spends the time and money to invest in scarce resources at odd hours and achieves 60% clean power, while another achieves the same proportion by continuing to buy abundant solar RECs. Both would still get to claim the same sustainability laurels.
The biggest corporate defender of time and location matching is Google. On the other side are tech giants Meta and Amazon, among others, arguing for an approach more explicitly focused on emissions. They want the Greenhouse Gas Protocol to endorse a different accounting scheme that measures the fossil fuel emissions displaced by a given clean energy purchase and allows companies to subtract that amount from their total scope 2 footprint — much more akin to the way carbon offsets work.
If done right, this method would recognize the difference between a solar REC in California and one in Wyoming. It would give companies more flexibility, potentially deploying capital to less developed parts of the world that need help to decarbonize. It could also, eventually, encourage investment in less mature and therefore more expensive resources, like energy storage and geothermal — although perhaps not until there are solar panels on every corner of the globe.
This idea, too, is risky. Calculating the real-world emissions impact of a REC, which the scope 2 working group calls “consequential accounting,” is an exercise in counterfactuals. It requires making assumptions about what the world would have looked like if the REC hadn’t been purchased, in both the near term and the long term. Would the clean energy have been generated anyway?
McCormick, who is a proponent of this emissions-focused approach, argues that it’s possible to measure the counterfactual in the electricity market with greater certainty than with something like forestry carbon offsets. With electricity, he told me, “there's five minute-level data for almost every power plant in the world, as opposed to forests. If you're lucky, you measure some forests, once a year. It's like a factor of 10,000 times more data, so all the models are more accurate.”
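The core move in consequential accounting is to credit clean generation at the *marginal* emissions rate of whatever fossil plant it is assumed to displace in that hour and place, rather than the grid average. A sketch, with invented marginal rates (real estimates come from grid models and are themselves contested counterfactuals):

```python
# Sketch of consequential ("avoided emissions") accounting.
# Marginal rates below are invented for illustration only.

def avoided_emissions(generation_mwh: dict, marginal_rate: dict) -> float:
    """Sum, over each hour, clean generation times the emissions rate of
    the fossil generator assumed to be displaced in that hour."""
    return sum(mwh * marginal_rate[hour] for hour, mwh in generation_mwh.items())

# One MWh of midday solar on a coal-heavy grid displaces far more
# than the same MWh on a grid already saturated with solar:
print(avoided_emissions({"noon": 1.0}, {"noon": 0.9}))   # 0.9 tCO2 avoided
print(avoided_emissions({"noon": 1.0}, {"noon": 0.05}))  # 0.05 tCO2 avoided
```

The arithmetic is trivial; the fight is entirely over who estimates the marginal rates, and how anyone verifies them.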
Some granularity proponents, including Ricks, agree that consequential accounting is valuable and could have a place in corporate reporting, but worry that it’s ripe for abuse. “At the end of the day, you can't ever verify whether the system you're using to assign a given company a given number is right, because you can't observe that counterfactual world,” he said. “We need to be very cautious about how it’s designed, and also how companies actually report what they’re doing and what level of confidence is communicated.”
Both proposals are flawed, and both have potential to allow at least some companies to claim progress on paper while having little real-world impact. In some ways, the disagreement is more philosophical than scientific. What should this standard be trying to achieve? Should it be steering corporate dollars into clean energy, accuracy of claims be damned? Or should it be protecting companies from accusations of greenwashing? What impacts do we care about more, faster emissions reductions or strategic decarbonization?
“They’re actually not opposing views,” McCormick told me. “There’s these people making this point and there’s these people making this point. They’re running into each other, but they’re actually not saying opposite things.”
To Michael Gillenwater, executive director of the Greenhouse Gas Management Institute, a carbon accounting research and training nonprofit, people are attempting to hide policy questions within the logic and principles of accounting. “We’re asking the emissions inventories to do too much — to do more than they can — and therefore we end up with a mess,” he told me. Corporate disclosures serve many different purposes — helping investors assess risk, informing a company’s internal target setting and performance tracking, creating transparency for consumers. “A corporate inventory might be one little piece of that puzzle,” he said.
Gillenwater is among those who think the working group’s time- and location-matching proposal would stifle corporate investment in clean energy when the goal should be to foster it. But his preferred solution is to forget trying to come up with a single metric and to encourage companies to make multiple disclosures. Companies could publish their location-based greenhouse gas inventory and then use market-based accounting to make a separate “mitigation intervention statement.” To sum it up, Gillenwater said, “keep the emissions inventory clean.”
The risk there is that the public — or indeed anyone not deeply versed in these nuances — will not understand the difference. That’s why Brander, the Edinburgh professor, argues that regardless of how it all shakes out, the Greenhouse Gas Protocol itself needs to provide more explicit guidance on what these numbers mean and how companies are allowed to talk about them.
“At the moment, the current proposals don’t include any text on how to interpret the numbers,” he said. “It’s almost incredible, really, for an accounting standard to say, here’s a number, but we’re not going to tell you how to interpret it. It’s really problematic.”
All this pushback may prompt changes. After the upcoming comment period closes in late November or early December, the working group could decide to revise the proposal and send it out for public consultation again. The entire revision process isn’t estimated to be completed until the end of 2027 at the earliest.
With wind and solar tax credits scheduled to sunset around then, voluntary action by companies will take on even greater importance in shaping the clean energy transition. While in theory, the Greenhouse Gas Protocol solely develops accounting rules and does not force companies to take any particular action, it’s undeniable that its decisions will set the stage for the next chapter of decarbonization. That chapter could either be about solving for round-the-clock clean power, or just trying to keep corporate clean energy investment flowing and growing, hopefully with higher integrity.
Giving up on hourly matching by 2030 doesn’t mean giving up on climate ambition — necessarily.
Microsoft celebrated a “milestone achievement” earlier this year, when it announced that it had successfully matched 100% of its 2025 electricity usage with renewable energy. This past week, however, Bloomberg reported that the company was considering delaying or abandoning its next clean energy target set for 2030.
What comes after achieving 100% renewable energy, you might ask? What Microsoft did in 2025 was tally its annual energy consumption and purchase an equal amount of solar and wind power. By 2030, the company aspired to match every kilowatt-hour it consumes with carbon-free electricity, hour by hour. That means finding clean power for all the hours when the sun isn’t shining and the wind isn’t blowing.
The news that Microsoft is revisiting this goal could be read as the beginning of the end of corporate climate ambition. Microsoft has long been a pioneer on that front, setting increasingly difficult goals and then doing the groundwork to help others follow in its footsteps. Now it appears to be accepting defeat. It comes just weeks after my colleague Robinson Meyer broke the news that the company is also pausing its industry-leading carbon removal purchasing program.
Delaying or abandoning the clean energy target — the two options presented in the Bloomberg story — represent quite different scenarios, however.
“There’s going to be a big difference between them saying, We’re going to keep trying as hard as we can to go as far as we can, but acknowledge we may not hit it, versus saying, Well, we can’t hit this extremely ambitious goal we set for ourselves, therefore we’re just giving up on the overall mission,” Wilson Ricks, a manager in Clean Air Task Force’s electricity program, told me.
The goal was always going to be difficult, if not impossible, for Microsoft to hit, Ricks said. Yes, it’s gotten tougher as Microsoft’s electricity usage has surged with the rise of artificial intelligence, and as Congress killed subsidies for clean energy while the Trump administration has done its best to stall wind and solar development. But some of the technologies likely needed to achieve the goal, such as advanced nuclear and geothermal power plants, have yet to achieve commercial deployment, let alone reach meaningful scale, and probably won’t by 2030 — especially not across all the regions that Microsoft operates in.
Nonetheless, some clean energy advocates (including Ricks) argue that keeping hourly matching as a north star is paramount because it helps put the world on the path to fully decarbonized electric grids.
Google was the first to introduce a 24/7 carbon-free energy strategy in 2020, and for a moment, it seemed that the rest of the corporate world would follow. A handful of companies joined a coalition to support the goal, but to date, I’m aware of just two — Microsoft and the data storage company Iron Mountain — that have followed Google in committing to achieving it.
Most companies approach their clean energy claims with considerably less precision. The norm is to purchase “unbundled” renewable energy certificates, tradeable vouchers that say a certain amount of renewable energy has been generated somewhere, at some point, and that the certificate owner can lay claim to it. Many simply buy enough of these RECs to cover their annual electricity usage and call themselves “powered by 100% renewable energy.”
There’s a spectrum of quality in the RECs available for purchase, but the market is flooded with cheap, relatively meaningless certificates. A company that operates in a coal-heavy region like Indiana can buy RECs from a wind farm in Texas that was built a decade ago, which won’t do anything to change the makeup of the grid in either place.
Today, the gold standard for companies with capital to throw around is instead to seek out long-term contracts directly with wind and solar developers known as power purchase agreements. That doesn’t mean the wind and solar farms send power to the companies directly. But these types of contracts are more likely to bring new projects onto the grid by providing guaranteed future revenues, helping developers secure the financing they need to build.
Microsoft started buying unbundled RECs more than a decade ago, and in 2014, it reported it had matched all of its global electricity usage. In 2016, the company began setting goals for direct procurement of renewable energy. In 2020, it pledged to achieve 100% renewable this way by 2025 — but it wasn’t going to sign just any wind or solar agreements. It aimed to pursue contracts with projects that were in the same regions as the company’s operations and that wouldn’t have been built without the company’s support. “Where and how you buy matters,” it wrote in its 2020 sustainability report. “The closer the new wind or solar farm is to your data center, the more likely it is those zero carbon electrons are powering it.”
In 2021, Microsoft upped the ante again by establishing its 2030 hourly matching target, which it referred to as “100/100/0” — 100% of electrons, 100% of the time, zero-carbon energy.
Microsoft has never publicly reported its progress toward the 2030 goal. The company’s enthusiasm for the target has also appeared to wane. In 2020, before Microsoft even made the 100/100/0 commitment, it touted a solution it developed to track and match renewable energy generation and consumption on an hourly basis. In the years since, it has led its peers in investments in round-the-clock nuclear power, even signing a 20-year power purchase agreement with Constellation Energy to bring the shuttered Three Mile Island nuclear plant in Pennsylvania back online.
But Microsoft has stopped publicizing the goal in blog posts and press releases. It went unmentioned in the recent announcement about the 2025 renewable energy achievement, for instance. And a section in the company’s annual sustainability report listing its climate targets that had previously advertised the 2030 goal as “Replacing with 100/100/0 carbon-free energy” was re-written in 2025 as “Expanding carbon-free electricity,” fuzzier rhetoric that now reads as a harbinger of a softer approach.
Microsoft did not respond to questions about its progress toward the 2030 target. In an emailed statement, a spokesperson emphasized the company’s commitment to maintaining its annual matching goal — the one achieved in 2025. No doubt that will take a lot more investment in the years to come now that the company is gobbling up a lot more electricity for data centers — some of it directly from natural gas plants.
Microsoft also shared a statement from Melanie Nakagawa, Microsoft’s chief sustainability officer, emphasizing the company’s commitment to become carbon negative. “At times we may make adjustments to our approach toward our sustainability goals,” she said. “Any adjustments we make are part of our disciplined approach—not a change in our long-term ambition.”
Even if Microsoft axes its hourly matching target, the company might have to start reporting its clean electricity usage on an hourly basis anyway. The Greenhouse Gas Protocol, a nonprofit that sets standards for how companies should calculate their emissions, is currently considering adopting an hourly accounting requirement. While the protocol’s standards are voluntary, companies almost uniformly follow them, and they will soon become mandatory in much of the world, as governments in California and Europe plan to integrate them into corporate disclosure rules.
The accounting rule change is highly controversial, with many companies arguing that it will deter them from investing in clean energy altogether, since their purchases won’t look as good on paper. “I don’t think anybody is debating having rules and guidelines around how you do more narrow matching, we should have that,” Michael Leggett, the co-founder and chief product officer for Ever.Green, a company that sells high-impact RECs, told me. “I think the debate has largely been around, is that required?”
Leggett said he could see how Microsoft’s pullback could be twisted to support either side. Proponents of the hourly accounting method will say, “Aha! See? This is why we have to require it.” Opponents will say, “See, even Microsoft can’t do it, so how are you going to require all these other companies to do it?”
I spoke to Alex Piper, the head of U.S. policy and markets at EnergyTag, a nonprofit that advocates for reforms to enable 24/7 clean energy, who saw the news as vindicating.
“What we’re seeing right now is many of the hyperscale technology companies look to the fastest path to power, and whether it is or not, some of them are turning to gas as that solution,” he told me. Piper argued that companies are choosing natural gas in part because they can get away with clean energy claims under the protocol’s existing rules. “The proposed rules for the greenhouse gas protocol would require those companies to at least be transparent.”
But Microsoft walking back its hourly matching goal does not have to mean that it’s walking back its climate ambition. It’s possible for companies to achieve significant emissions reductions by focusing their clean energy purchases on the places where wind and solar will do the most to displace fossil fuels, rather than worrying about matching every hour. For a company that operates in California, for example, supporting the addition of solar power to a coal-heavy grid — even if it’s in a different part of the country or the world — will do more, faster, than helping to build solar locally or waiting for around-the-clock resources such as geothermal power to come online.
Critics of hourly accounting argue that it doesn’t give companies credit for this kind of approach. “What I would love to have happen is anything to incentivize, recognize, and reward companies signing 20-year contracts that enable new projects coming online,” Leggett said of the Greenhouse Gas Protocol’s forthcoming rule change.
Ricks, of Clean Air Task Force, rejects the idea that an hourly accounting requirement would deter these kinds of deals. “That doesn’t mean that they can’t report any other set of numbers they want to,” he said. “Many companies do report things that aren’t currently recognized in the Greenhouse Gas Protocol.”
Microsoft is a prime example. The company includes two measures of its renewable energy usage in its annual reports: “percentage of renewable electricity,” which includes the unbundled RECs Microsoft has continued to buy over the years, and “percentage of direct renewable electricity,” which tracks power purchase agreements and the renewable portion of the grid mix where its facilities are located. The former uses the Greenhouse Gas protocol’s current accounting method, under which Microsoft says it has hit 100% every year since 2014. But the latter is the company’s own bespoke calculation.
The company’s 2025 feat was based on this bespoke methodology, and it represents the first time Microsoft has announced to the world that it used 100% renewable energy. It never previously made such claims about its REC purchases, as far as I can tell. In other words, Microsoft’s standards for what it publicizes are far more rigorous than what the Greenhouse Gas Protocol requires.
Regardless of what the protocol decides, it will determine only what companies must report. It won’t prevent them from offering up their own, additional metrics of success.
PJM Interconnection has some ideas, as does the state of New Jersey.
We’ve already talked this week about Pennsylvania asking whether the modern “regulatory compact,” which grants utilities monopoly geographical franchises and regulated returns from their capital investments, is still suitable in this era of rising prices and data-center-driven load growth.
Now America’s biggest electricity market and one of that market’s biggest states are considering far-reaching, fundamental reforms that could alter how electricity infrastructure is planned and paid for on behalf of more than 65 million Americans.
New Jersey Governor Mikie Sherrill anchored her 2025 campaign on electricity prices, and for good reason — in the past four years, electricity prices in the state have gone up 48%, according to Heatmap and MIT’s Electricity Price Hub, while average bills have risen from $83 per month to $130. On her first day in office, Sherrill issued two executive orders acting on that promise, directing the state to make funds available to freeze rates and declaring a state of emergency to ease the way to building more generation.
Included in that first order was a review of utility business models to be carried out by state regulators. What that review will entail is now coming into focus.
On Wednesday, the New Jersey Board of Public Utilities issued a statement announcing that it will look specifically at “whether New Jersey’s century-old utility business model — one that rewards electric distribution companies (EDCs) for capital spending even when cheaper alternatives exist — should be replaced with a framework tied to performance, affordability, and long-term cost stability.” In case anyone was still unsure as to what the outcome of said study might be, the board added that it is “expected to drive the most significant restructuring of utility regulation in New Jersey in decades.”
The current system, the board’s president Christine Guhl-Savoy said at a hearing Thursday, “creates a structural incentive to favor capital-intensive solutions, even when lower-cost, non-wires, or demand-side alternatives may be available.”
This structure, she said, could help explain why “over the past decade, electric delivery charges in New Jersey have risen steadily.” Within the service territory of PSEG, one of the four major New Jersey utilities, distribution charges alone have risen from $19.24 per month in January 2020 (as far back as the Heatmap-MIT data goes) to $21.84 as of April, while transmission charges have risen from around $20 to just over $29 per month. Many critics of the utility business model point to high levels of local grid spending on distribution as a way that utilities pad their earnings with returns harvested from ratepayers.
In the system regulators explored at the hearing, new projects would get a more skeptical look, and utility payouts would be determined in part by whether utilities hit pre-defined service goals. NJBPU executive director Bob Brabston also indicated that the review process would take a close look at utilities’ regulated returns on equity — echoing his neighbor across the Delaware River, Pennsylvania Governor Josh Shapiro, who wrote in a letter to his state’s utilities earlier this week that these returns must be “transparent” and “justifiable,” and no longer be based on “educated guesses.”
“We want to make sure that the actual cost of equity and the returns on equity are close,” Brabston said Thursday. “We don’t want there to be a significant gap between the cost of equity that you all experience and the returns that the agency awards.”
Meanwhile, in Valley Forge, Pennsylvania, the framework within which New Jersey’s utilities exist is coming in for its own examination.
PJM Interconnection — the nation’s largest electricity market, which covers not just Pennsylvania and New Jersey but also part or all of 11 other states — released an almost 70-page paper Wednesday, in which the organization’s president David Mills wrote that “the current situation is not tenable.”
PJM has been the poster child for a host of issues plaguing the electricity markets across the country, including fast-rising prices, a failure to quickly bring on new generation, and an inability to assure the market’s preferred level of reserve reliability. This set of challenges, Mills said in the paper’s introduction, “reflects something more fundamental than a design that needs recalibration.” Instead, PJM must consider “whether the foundational assumptions of the market remain valid – and if not, what a valid set of assumptions would require.”
The problem with the electricity market, he argued, can be solved by more markets. Right now, when prices shoot up, governments intervene with price caps, suppressing the market signal necessary to bring on sufficient generation that would bring down prices.
To replace that system, the paper proposes three possible models. The first, which it calls “Stabilized Markets,” would allow capacity to be procured for several years at a time outside of the current auction system, so that utilities could make sure their basic needs were covered before going into the annual auctions. This would provide long-term security for new investment.
The second path would be a more fundamental reform. This “Differential Reliability” approach would do away with the “shared reliability compact,” under which all loads must be served by the system at all times. Instead, PJM would “develop the operational and commercial framework to explicitly differentiate reliability,” incentivizing approaches like bring-your-own-generation or curtailing power for new large sources of demand.
The third path is an “Energy Market Transition,” which might also be called the “Texas option.” Following this path, the capacity market would shrink as a portion of revenues earned by generators, and more revenue would come from real-time or near-real-time electricity sales.
While this path isn’t “full Texas” (ERCOT doesn’t have a capacity market at all), it would mean allowing higher real-time energy prices, a.k.a. “scarcity pricing,” arguably the defining feature of the ERCOT system (though even that was scaled back when prices got too high).
“The choices embedded in these paths involve genuine trade-offs, and those trade-offs affect different stakeholders uniquely,” the paper says. If PJM has learned anything in the past few years, it’s that it doesn’t get to make decisions on its own. Those stakeholders will get their say, one way or another.
Big fundraises for Nyobolt and Skeleton Technologies, plus more of the week’s biggest money moves.
Following a quiet week for new deals, the industry is back at it, with capital flowing into some of its most active areas. My colleague Alexander C. Kaufman already told you about one of the more buzzworthy announcements from data center-land in Wednesday’s AM newsletter: Wave energy startup Panthalassa raised $140 million in a round led by Peter Thiel to “perform AI inference computing at sea” using nodes powered by the ocean’s waves.
This week also saw fresh funding for more conventional data center infrastructure, as Nyobolt and Skeleton Technologies both announced later-stage rounds for data center backup power solutions. Meanwhile, it turns out Redwood Materials is not the only company bringing in significant capital for second-life EV battery systems — Moment Energy just raised $40 million to pursue a similar approach. Elsewhere, investors backed an effort to rebuild domestic magnesium production, and, in a glimmer of hope for a sector on the outs, gave a boost to green cement startup Terra CO2.
Cambridge-based startup Nyobolt has become the latest battery company to reach a $1 billion valuation, with its expansion into the data center market helping fuel excitement around its tech. Spun out of University of Cambridge research in 2019, the company develops ultra-fast-charging batteries based on a modified lithium-ion chemistry. Its core innovation is an anode made from niobium tungsten oxide, which Nyobolt says enables its batteries to charge to 80% in less than five minutes, with a cycle life that’s 10 times longer than conventional lithium-ion, all without the risk of fire.
The company has now raised a $60 million Series C, following what it describes as a period of “rapid commercial momentum,” with revenue increasing five-fold year-over-year as customers in the robotics and data center industries piled in. Symbotic, an autonomous robotics company and existing customer, led the latest round. While Symbotic previously relied on supercapacitors to power its robots, Nyobolt says its batteries provide six times more energy capacity in a lighter package, allowing its warehouse robots to work for retailers like Walgreens, Target, and Kroger around the clock.
Now the startup is targeting data center customers too, positioning its tech as a fast-acting fix for the sudden power surges common to large-scale artificial intelligence workloads, as well as a temporary backup power solution for outages. While it has no confirmed domestic data center customers to date, it does have a nonbinding agreement with the Indian state of Rajasthan to deploy over 100 megawatts of off-grid AI data center and power management infrastructure, part of a broader push to expand its presence across the country.
Notably, the press release made no mention of plans to sell its tech to electric vehicle automakers, though this appears to have been a central focus previously. As recently as last summer, executive vice president Ramesh Narasimhan told the BBC that he hoped Nyobolt’s batteries would “transform the experience of owning an EV.” But while its tech does enable extremely fast charging, its underlying chemistry is not optimized for long-range driving. A sports car built to test the company’s batteries had just a 155-mile range. So like many of its climate tech peers, the company appears to be betting that data centers now represent a more reliable opportunity.
This week brought additional news from another European player aiming to smooth out data center power surges. Estonia-based supercapacitor startup Skeleton Technologies raised $39 million in what it describes as the first close of a pre-IPO funding round, with a U.S. listing planned for next year. Its core tech is built around a “curved graphene” structure, which the company likens to a crumpled sheet of paper with a high surface area. The graphene’s many exposed surfaces and edges allow it to hold more electric charge, which Skeleton says delivers a 72% improvement in energy density.
Like Nyobolt, Skeleton says its tech offers faster response times and longer cycle life. But supercapacitors are a fundamentally different technology than Nyobolt’s modified lithium-ion solution. Though they offer near-instantaneous response times, they store very little energy — just enough to smooth out microsecond power spikes in GPU workloads. Nyobolt’s batteries, by contrast, aim not only to smooth out data center power spikes, but also to deliver about 90 seconds of backup power in the case of an outage, before a generator or other backup source kicks in.
Skeleton is already mass-producing supercapacitors in Germany and delivering to unnamed “major U.S. hyperscalers for AI infrastructure.” It’s also making moves to expand its U.S. footprint ahead of its pending IPO, opening an engineering facility in Houston and aiming to begin domestic manufacturing of AI data center solutions in the first half of this year.
Last year brought a wave of new climate tech coalitions, with one of the most ambitious efforts known as the All Aboard Coalition. This group of venture firms is targeting the investment gap known as the missing middle, which falls between early-stage venture rounds and infrastructure funding. The model is relatively mechanical: When three or more member firms participate in a later-stage round for a company, the coalition automatically coinvests out of its own fund, matching the members’ combined contribution.
The group made its first investment in January, supporting the AI-powered geothermal exploration and development company Zanskar’s Series C round. This week, it announced its second: a $22 million commitment to low-carbon cement startup Terra CO2, bringing the company’s Series B total to $147 million. Cement production accounts for roughly 8% of global emissions, a figure Terra aims to shrink by making so-called “supplementary cementitious materials” — which can partially displace traditional cement in concrete mixes — from abundant silicate rocks. By grinding and thermally processing these rocks into a glassy powder, Terra’s product mimics the properties of conventional cement. The company says it can replace up to 50% of the cement in typical concrete mixes, lowering associated emissions by as much as 70%.
The new funding will help Terra build its first commercial-scale plant in Texas, exactly the type of first-of-a-kind project that the coalition was designed to support. But the scale of this challenge remains clear. As noted in ImpactAlpha’s coverage, the coalition has raised just $100 million toward its goal of a $300 million fund — already a relatively modest goal considering the capital intensity of novel infrastructure projects. Bloomberg previously reported that the group aimed to raise the full amount by the end of October 2025; the shortfall raises questions about the willingness of LPs to bet on projects at this crucial but capital-intensive juncture.
When I think about repurposing used electric vehicle batteries for stationary storage, I think of battery recycling giant Redwood Materials, which raised a $425 million Series E in January after moving aggressively into this promising market. But while Redwood’s well-established recycling business certainly provides it with the largest pipeline of used batteries, it’s far from the only company pursuing this business model. A smaller player with a largely similar approach underscored that this week, when it announced a $40 million Series B to scale its gigafactory in Texas and expand its facilities in British Columbia.
That’s Moment Energy, which focuses on using second-life EV batteries to power commercial and industrial sites such as data centers, hospitals, and factories. Like Redwood, it relies on proprietary software to aggregate battery packs with myriad chemistries and design specs into coordinated grid-scale systems. What the company sees as its critical differentiator, however, is its safety standards. Moment has achieved UL certification, a key safety benchmark that it says others in the industry have yet to meet.
In a shot at its competitors, the company described itself in a press release as the “only provider proven capable of deploying second-life battery storage systems in the built environment without special dispensations or regulatory loopholes.” While Moment never names names, Redwood’s first commercial-scale system sits on its own private land in an open-air setting, where certification is arguably unnecessary. “What most other second life [battery] companies are now trying to say is, let’s just lobby to make second life UL certification easier, because it is impossible to get UL certification, as it stands,” the company’s CEO, Edward Chiang, told TechCrunch. “But at Moment, we say that’s not true. We got it.”
As I wrote last September, it’s a good time to be a critical minerals startup, because as you may have heard, “critical minerals are the new oil.” These materials sit at the center of modern energy infrastructure — batteries, magnets, photovoltaic cells, and electrical wiring, to name just a few uses — plus their supply is concentrated in geopolitically tense regions and subject to extreme price volatility. It also certainly doesn’t hurt that the Trump administration loves them and wants to mine and refine way more of them in the U.S.
The latest beneficiary of this enthusiasm is Magrathea, which this week raised a $24 million Series A to build what it says will be the only new magnesium smelter in the U.S., in Arkansas. The company has now raised over $100 million in total, including a $28 million grant from the Department of Defense. Its approach relies on an electrolysis-based process that’s able to extract pure magnesium from seawater and brines, which it positions as a cleaner, cheaper alternative to the high-heat, emission-intensive method that China uses to produce most of the world’s magnesium today.
The U.S. military has taken note of this potential new domestic supply. Magrathea’s 2022 seed round coincided with Russia’s invasion of Ukraine, as the military looked to scale domestic defense tech supply chains. Magnesium alloys are often used to help reduce weight in EV components, a benefit equally applicable to military helicopters, drones, and next-generation fighter jets. So while these defense applications represent somewhat of a pivot from the startup’s initial focus, a greener fighter jet is still better than a dirty fighter jet.