With the ongoing disaster approaching its second week, here’s where things stand.

A week ago, forecasters in Southern California warned residents of Los Angeles that conditions would be dry, windy, and conducive to wildfires. How bad things have gotten, though, has taken everyone by surprise. As of Monday morning, almost 40,000 acres of Los Angeles County have burned in six separate fires, the biggest of which, Palisades and Eaton, have yet to be fully contained. The latest red flag warning, indicating fire weather, won’t expire until Wednesday.
Many have questions about how the second-biggest city in the country is facing such unbelievable devastation (some of these questions, perhaps, being more politically motivated than others). Below, we’ve tried to collect as many answers as possible — including a bit of good news about what lies ahead.
A second Santa Ana wind event is due to set in Monday afternoon. “We’re expecting moderate Santa Ana winds over the next few days, generally in the 20 to 30 [mile per hour] range, gusting to 50, across the mountains and through the canyons,” Eric Drewitz, a meteorologist with the Forest Service, told me on Sunday. Drewitz noted that the winds will be less severe than last week’s, when the fires flared up, but he also anticipates they’ll be “more easterly,” which could blow the fires into new areas. A new red flag warning has been issued through Wednesday, signaling increased fire potential due to low humidity and high winds for several days yet.
If firefighters can prevent new flare-ups and hold back the fires through that wind event, they might be in good shape. By Friday of this week, “it looks like we could have some moderate onshore flow,” Drewitz said, when wet ocean air blows inland, which would help “build back the marine layer” and increase the relative humidity in the region, decreasing the chances of more fires. The outlook for the Santa Anas at that point is still uncertain — the models have been changing, and the winds’ strength is hard to predict that far out — but an increase in humidity would at least offer some relief for battered Ventura and Orange counties.
The Palisades Fire, the biggest in L.A., ripped through the hilly and affluent area between Santa Monica and Malibu, including the Pacific Palisades neighborhood, the second-most expensive zip code in Los Angeles and home to many celebrities. Structures in Big Rock, a neighborhood in Malibu, have also burned. The fire has also encroached on the I-405 and the Getty Villa, and destroyed at least two houses in Mandeville Canyon, an enclave of multimillion-dollar homes. Students at nearby University of California, Los Angeles, were told on Friday to prepare for a possible evacuation.
The Eaton Fire, the second biggest blaze in the area, has killed 16 people in Altadena, a neighborhood near Pasadena, according to the Los Angeles Times, making it one of the deadliest fires in the modern history of California.
The 1,000-acre Kenneth Fire is 100% contained but still burning near Calabasas and the gated community of Hidden Hills. The Hurst Fire has burned nearly 800 acres near Sylmar, the northernmost neighborhood in L.A., and is 89% contained but still burning. Though there are no evacuation notices for either the Kenneth or the Hurst fires, residents in the L.A. area should monitor current conditions, as the situation remains fluid.
The 43-acre Sunset Fire, which triggered evacuations last week in Hollywood and Hollywood Hills, burned no homes and is 100% contained.
The Lidia Fire, which ignited in a remote area south of Acton, California, on Wednesday afternoon, burned 350 acres of brush and is 100% contained.
It can take years to determine the cause of a fire, and investigations typically don’t begin until after the fire is under control and the area is safe to reenter, Edward Nordskog, a retired fire investigator from the Los Angeles Sheriff’s Department, told Heatmap’s Emily Pontecorvo. He also noted, however, that urban fires are typically easier to pinpoint the cause of than wildland fires due to the availability of witnesses and surveillance footage.
The vast majority of wildfires, 85%, are caused by humans. So far, investigators have ruled out lightning — another common fire-starter — because there were no electrical storms in the area when the fires started. In the case of the Palisades Fire, there were no power lines in the area of the ignition, though investigators are now looking into an electrical transmission tower in Eaton Canyon as the possible cause of the deadly fire in Altadena. There have been rumors that arsonists started the fires, but investigators say that scenario is also pretty unlikely due to the spread of the fires and how remote the ignition areas are.
Officially, 24 people have died, but that tally is likely to rise. California Governor Gavin Newsom said Sunday that he expects “a lot more” deaths will be added to the total in the coming days as search efforts continue.
Incoming President Donald Trump slammed the response to the L.A. fires in a Truth Social post on Sunday morning: “This is one of the worst catastrophes in the history of our Country,” he wrote. “They just can’t put out the fires. What’s wrong with them?”
Though there is much blame going around — not all of it founded in reality — the challenges facing firefighters are immense. Last week, because of strong Santa Ana winds, fire crews could not drop suppressants like water or chemical retardant on the initial blazes. (In strong winds, water and retardant will blow away before they reach the flames on the ground.)
Fighting a fire in an urban or suburban area is also different from fighting one in a remote, wild area. In a true wildfire, crews don’t use much water; firefighters typically contain the blazes by creating breaks — areas cleared of vegetation that starve a fire of fuel and keep it from spreading. In an urban or suburban event, however, firefighters can’t simply hack through a neighborhood, and typically have to use water to fight structure fires. Their priority also shifts from stopping the fire to evacuating and saving people, which means putting out the fire itself has to wait.
What’s more, the L.A. area faced dangerous fire weather going into last week — with wind gusts up to 100 miles per hour and dry air — and the persistence of the Santa Ana winds during firefighting operations through the weekend made it extremely difficult for emergency managers to gain a foothold.
Trump and others have criticized Los Angeles for being unprepared for the fires, given reports that some fire hydrants ran dry or had low pressure during operations in Pacific Palisades. According to the Los Angeles Department of Water and Power, about 20% of hydrants were affected, mostly at higher elevations.
The problem isn’t a lack of preparation, however. It’s that the L.A. wildfires are so large and widespread, the county’s preparations were quickly overwhelmed. “We’re fighting a wildfire with urban water systems, and that is really challenging,” Los Angeles Department of Water and Power CEO Janisse Quiñones said in a news conference last week. When houses burn down, water mains can break open. Civilians also put a strain on the system when they use hoses or sprinkler systems to try to protect their homes.
On Sunday, Judy Chu, the Democratic lawmaker representing Altadena, confirmed that fire officials had told her there was enough water to continue the battle in the days ahead. “I believe that we’re in a good place right now,” she told reporters. Newsom, meanwhile, has responded to criticism over the water failure by ordering an investigation into the weak or dry hydrants.
So-called “Super Scooper” planes have had no problem with water access; they’re scooping directly from the ocean.
Yes. Although aerial support was grounded in the early stages of the wildfires due to severe Santa Ana winds, flights resumed during lulls in the storms last week.
There is a misconception, though, that water and retardant drops “put out” fires; they don’t. Instead, aerial support suppresses a fire so crews can get in close and use traditional methods, like cutting a fire break or spraying water. “All that up in the air, all that’s doing is allowing the firefighters [on the ground] a chance to get in,” Bobbie Scopa, a veteran firefighter and author of the memoir Both Sides of the Fire Line, told me last week.
With winds expected to pick up early this week, aerial firefighting operations may be grounded again. “If you have erratic, unpredictable winds to where you’ve got a gust spread of like 20 to 30 knots,” i.e. 23 to 35 miles per hour, “that becomes dangerous,” Dan Reese, a veteran firefighter and the founder and president of the International Wildfire Consulting Group, told me on Friday.
Because of the direction of the Santa Ana winds, wildfire smoke should mostly blow out to sea. But as winds shift, unhealthy air can blow into populated areas, affecting the health of residents.
Wildfire smoke is unhealthy, period, but urban and suburban smoke like that from the L.A. fires can be particularly detrimental. It’s not just trees and brush immolating in an urban fire, it’s also cars, and batteries, and gas tanks, and plastics, and insulation, and other nasty, chemical-filled things catching fire and sending fumes into the air. PM2.5, the inhalable particulates from wildfire smoke, contributes to thousands of excess deaths annually in the U.S.
You can read Heatmap’s guide to staying safe during extreme smoke events here.
“The bad news is, I’m not seeing any rain chances,” Drewitz, the Forest Service meteorologist, told me on Sunday. Though the marine layer will bring wetter air to the Los Angeles area on Friday, his models showed it’ll be unlikely to form precipitation.
Though some forecasters have signaled potential rain at the end of next week, the general consensus is that the odds for that are low, and that any rain there may be will be too light or short-lived to contribute meaningfully to extinguishing the fires.
The chaparral shrublands around Los Angeles are supposed to burn every 30 to 130 years. “There are high concentrations of terpenes — very flammable oils — in that vegetation; it’s made to burn,” Scopa, the veteran firefighter, told me.
What isn’t normal, though, is the amount of rain Los Angeles got leading up to this past spring — 52.46 inches over the preceding two years, the city’s wettest such stretch since record-keeping began in the late 1800s — which was followed by a blisteringly hot summer and a delayed start to this year’s rainy season. Since October, parts of Southern California have received just 10% of their normal rainfall.
This “weather whiplash” is caused by a warmer atmosphere, which means that plants will grow explosively due to the influx of rain and then dry out when the drought returns, leaving lots of dry fuels ready and waiting for a spark. “This is really, I would argue, a signature of climate change that is going to be experienced almost everywhere people actually live on Earth,” Daniel Swain, a climate scientist at the University of California, Los Angeles, who authored a new study on the pattern, told The Washington Post.
We know less about how climate change may affect the Santa Anas, though experts have some theories.
At least 12,000 structures have burned so far in the fires, which is already exacerbating the strain on the Los Angeles housing market — one of the country’s tightest even before the fires — as thousands of displaced people look for new places to live. “Dozens and dozens of people are going after the same properties,” one real estate agent told the Los Angeles Times. The city has reminded businesses that price gouging — including raising rental prices more than 10% — during an emergency is against the law.
Los Angeles had a shortage of about 370,000 homes before the fires, and between 2021 and 2023, the county added fewer than 30,000 new units per year. Recovery grants and federal aid can lag, and it often takes more than two years for even the first Housing and Urban Development Disaster Recovery Grants’ expenditures to go out.
My colleague Matthew Zeitlin wrote for Heatmap that the economic impact of the Los Angeles fire is already much higher than that of other fires, such as the 2018 Camp fire, partly because of the value of the Pacific Palisades real estate.
The wildfires may “deal a devastating blow to [California’s] fragile home insurance market,” Heatmap’s Matthew Zeitlin wrote last week. In recent years, home insurers have left California or declined to write new policies, at least partially due to the increased risk of wildfires in the state.
Depending on the extent of the damage from the fires, the coffers of California’s FAIR Plan — which insures homeowners who can’t get insurance otherwise, including many in Pacific Palisades and Altadena — could empty, causing it to seek money from insurers, according to the state’s regulations. As Zeitlin writes, “This would mean that Californians who were able to buy private insurance — because they don’t live in a region of the state that insurers have abandoned — could be on the hook for massive wildfire losses.”
First and foremost, sign up for all relevant emergency alerts. Make sure to turn on the sound on your phone and keep it near you in case of a change in conditions. Pack a “go bag” with essentials and consider filling your gas tank now so that you can evacuate at a moment’s notice if needed. Read our guide on what to do if you get a pre-evacuation or an evacuation notice ahead of time so that you’re not scrambling for information if you get an alert.
The free Watch Duty app has become a go-to resource for people affected by the fires, including friends and family of Angelenos who may themselves be thousands of miles away. The app provides information on fire perimeters, evacuation notices, and power outages. Its employees pull information directly from emergency responders’ radio broadcasts and sometimes beat official sources to disseminating it. If you need an endorsement: Emergency responders rely on the app, too.
There are many scams in the wake of disasters as crooks look to take advantage of desperate people — and those who want to help them. To play it safe, you can use a hub like the one established by GoFundMe, which is actively vetting campaigns related to the L.A. fires. If you’re looking to volunteer your time, make a donation of clothing or food, or if you’re able to foster animals the fire has displaced, you can use this handy database from the Mutual Aid Network L.A. There are also many national organizations, such as the Red Cross, that you can connect with if you want to help.
The City of Los Angeles and the Los Angeles Fire Department have asked that do-gooders not bring donations directly to fire stations or shelters; such drop-offs can interfere with emergency operations. The city’s website provides more information about how you can help — productively.
Giving up on hourly matching by 2030 doesn’t mean giving up on climate ambition — necessarily.
Microsoft celebrated a “milestone achievement” earlier this year, when it announced that it had successfully matched 100% of its 2025 electricity usage with renewable energy. This past week, however, Bloomberg reported that the company was considering delaying or abandoning its next clean energy target set for 2030.
What comes after achieving 100% renewable energy, you might ask? What Microsoft did in 2025 was tally its annual energy consumption and purchase an equal amount of solar and wind power. By 2030, the company aspired to match every kilowatt it consumes with carbon-free electricity hour by hour. That means finding clean power for all the hours when the sun isn’t shining and the wind isn’t blowing.
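The gap between those two accounting methods can be made concrete with a toy calculation (the numbers here are invented for illustration, not Microsoft’s actual figures): a buyer can cover 100% of its consumption on an annual basis while leaving many individual hours unmatched.

```python
# Toy illustration (invented numbers): annual matching can reach 100%
# even when many individual hours go unmatched.
load = [10, 10, 10, 10]   # MWh consumed across four sample hours
clean = [25, 15, 0, 0]    # MWh of clean generation purchased in those hours

# Annual-style matching: only the yearly totals are compared.
annual_pct = 100 * min(sum(clean), sum(load)) / sum(load)

# Hourly matching: clean energy counts only in the hour it was generated.
hourly_pct = 100 * sum(min(c, l) for c, l in zip(clean, load)) / sum(load)

print(annual_pct)  # 100.0 — the totals balance out
print(hourly_pct)  # 50.0 — only half the consumption was covered hour by hour
```

The surplus solar at midday can’t be carried over to the windless, sunless hours, which is why the hourly standard is so much harder to hit.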
The news that Microsoft is revisiting this goal could be read as the beginning of the end of corporate climate ambition. Microsoft has long been a pioneer on that front, setting increasingly difficult goals and then doing the groundwork to help others follow in its footsteps. Now it appears to be accepting defeat. The move comes just weeks after my colleague Robinson Meyer reported that the company is also pausing its industry-leading carbon removal purchasing program.
Delaying or abandoning the clean energy target — the two options presented in the Bloomberg story — represent quite different scenarios, however.
“There’s going to be a big difference between them saying, We’re going to keep trying as hard as we can to go as far as we can, but acknowledge we may not hit it, versus saying, Well, we can’t hit this extremely ambitious goal we set for ourselves, therefore we’re just giving up on the overall mission,” Wilson Ricks, a manager in Clean Air Task Force’s electricity program, told me.
The goal was always going to be difficult, if not impossible, for Microsoft to hit, Ricks said. Yes, it’s gotten tougher as Microsoft’s electricity usage has surged with the rise of artificial intelligence, and because Congress killed subsidies for clean energy as the Trump administration has done its best to stall wind and solar development. But some of the technologies likely needed to achieve the goal, such as advanced nuclear and geothermal power plants, have yet to achieve commercial deployment, let alone reach meaningful scale, and probably won’t by 2030 — especially not across all the regions that Microsoft operates in.
Nonetheless, some clean energy advocates (including Ricks) argue that keeping hourly matching as a north star is paramount because it helps put the world on the path to fully decarbonized electric grids.
Google was the first to introduce a 24/7 carbon-free energy strategy in 2020, and for a moment, it seemed that the rest of the corporate world would follow. A handful of companies joined a coalition to support the goal, but to date, I’m aware of just two — Microsoft and the data storage company Iron Mountain — that have followed Google in committing to achieving it.
Most companies approach their clean energy claims with considerably less precision. The norm is to purchase “unbundled” renewable energy certificates, tradeable vouchers that say a certain amount of renewable energy has been generated somewhere, at some point, and that the certificate owner can lay claim to it. Many simply buy enough of these RECs to cover their annual electricity usage and call themselves “powered by 100% renewable energy.”
There’s a spectrum of quality in the RECs available for purchase, but the market is flooded with cheap, relatively meaningless certificates. A company that operates in a coal-heavy region like Indiana can buy RECs from a wind farm in Texas that was built a decade ago, which won’t do anything to change the makeup of the grid in either place.
Today, the gold standard for companies with capital to throw around is instead to seek out long-term contracts directly with wind and solar developers, known as power purchase agreements. That doesn’t mean the wind and solar farms send power to the companies directly. But these types of contracts are more likely to bring new projects onto the grid by providing guaranteed future revenues, helping developers secure the financing they need to build.
Microsoft started buying unbundled RECs more than a decade ago, and in 2014, it reported it had matched all of its global electricity usage. In 2016, the company began setting goals for direct procurement of renewable energy. In 2020, it pledged to achieve 100% renewable energy this way by 2025 — but it wasn’t going to sign just any wind or solar agreements. It aimed to pursue contracts with projects that were in the same regions as the company’s operations and that wouldn’t have been built without the company’s support. “Where and how you buy matters,” it wrote in its 2020 sustainability report. “The closer the new wind or solar farm is to your data center, the more likely it is those zero carbon electrons are powering it.”
In 2021, Microsoft upped the ante again by establishing its 2030 hourly matching target, which it referred to as “100/100/0” — 100% of electrons, 100% of the time, zero-carbon energy.
Microsoft has never publicly reported its progress toward the 2030 goal. The company’s enthusiasm for the target has also appeared to wane. In 2020, before Microsoft even made the 100/100/0 commitment, it touted a solution it developed to track and match renewable energy generation and consumption on an hourly basis. In the years since, it has led its peers in investments in round-the-clock nuclear power, even signing a 20-year power purchase agreement with Constellation Energy to bring the shuttered Three Mile Island nuclear plant in Pennsylvania back online.
But Microsoft has stopped publicizing the goal in blog posts and press releases. It went unmentioned in the recent announcement about the 2025 renewable energy achievement, for instance. And a section in the company’s annual sustainability report listing its climate targets that had previously advertised the 2030 goal as “Replacing with 100/100/0 carbon-free energy” was re-written in 2025 as “Expanding carbon-free electricity,” fuzzier rhetoric that now reads as a harbinger of a softer approach.
Microsoft did not respond to questions about its progress toward the 2030 target. In an emailed statement, a spokesperson emphasized the company’s commitment to maintaining its annual matching goal — the one achieved in 2025. No doubt that will take a lot more investment in the years to come now that the company is gobbling up a lot more electricity for data centers — some of it directly from natural gas plants.
Microsoft also shared a statement from Melanie Nakagawa, Microsoft’s chief sustainability officer, emphasizing the company’s commitment to become carbon negative. “At times we may make adjustments to our approach toward our sustainability goals,” she said. “Any adjustments we make are part of our disciplined approach—not a change in our long-term ambition.”
Even if Microsoft axes its hourly matching target, the company might have to start reporting its clean electricity usage on an hourly basis anyway. The Greenhouse Gas Protocol, a nonprofit that sets standards for how companies should calculate their emissions, is currently considering adopting an hourly accounting requirement. While the protocol’s standards are voluntary, companies almost uniformly follow them, and they will soon become mandatory in much of the world, as governments in California and Europe plan to integrate them into corporate disclosure rules.
The accounting rule change is highly controversial, with many companies arguing that it will deter them from investing in clean energy altogether, since their purchases won’t look as good on paper. “I don’t think anybody is debating having rules and guidelines around how you do more narrow matching, we should have that,” Michael Leggett, the co-founder and chief product officer for Ever.Green, a company that sells high-impact RECs, told me. “I think the debate has largely been around, is that required?”
Leggett said he could see how Microsoft’s pullback could be twisted to support either side. Proponents of the hourly accounting method will say, “Aha! See? This is why we have to require it.” Opponents will say, “See, even Microsoft can’t do it, so how are you going to require all these other companies to do it?”
I spoke to Alex Piper, the head of U.S. policy and markets at EnergyTag, a nonprofit that advocates for reforms to enable 24/7 clean energy, who saw the news as vindicating.
“What we’re seeing right now is many of the hyperscale technology companies look to the fastest path to power, and whether it is or not, some of them are turning to gas as that solution,” he told me. Piper argued that companies are choosing natural gas in part because they can get away with clean energy claims under the protocol’s existing rules. “The proposed rules for the greenhouse gas protocol would require those companies to at least be transparent.”
But Microsoft walking back its hourly matching goal does not have to mean that it’s walking back its climate ambition. It’s possible for companies to achieve significant emissions reductions by focusing their clean energy purchases on the places where wind and solar will do the most to displace fossil fuels, rather than worrying about matching every hour. For a company that operates in California, for example, supporting the addition of solar power to a coal-heavy grid — even if it’s in a different part of the country or the world — will do more, faster, than helping to build solar locally or waiting for around-the-clock resources such as geothermal power to come online.
Critics of hourly accounting argue that it doesn’t give companies credit for this kind of approach. “What I would love to have happen is anything to incentivize, recognize, and reward companies signing 20-year contracts that enable new projects coming online,” Leggett said of the Greenhouse Gas Protocol’s forthcoming rule change.
Ricks, of Clean Air Task Force, rejects the idea that an hourly accounting requirement would deter these kinds of deals. “That doesn’t mean that they can’t report any other set of numbers they want to,” he said. “Many companies do report things that aren’t currently recognized in the Greenhouse Gas Protocol.”
Microsoft is a prime example. The company includes two measures of its renewable energy usage in its annual reports: “percentage of renewable electricity,” which includes the unbundled RECs Microsoft has continued to buy over the years, and “percentage of direct renewable electricity,” which tracks power purchase agreements and the renewable portion of the grid mix where its facilities are located. The former uses the Greenhouse Gas Protocol’s current accounting method, under which Microsoft says it has hit 100% every year since 2014. But the latter is the company’s own bespoke calculation.
The company’s 2025 feat was based on this made-up methodology, and it represents the first time Microsoft has announced to the world that it used 100% renewable energy. It never previously made such claims about its REC purchases, as far as I can tell. In other words, Microsoft’s standards for what it publicizes are far more rigorous than what the Greenhouse Gas Protocol requires.
Regardless of what the protocol decides, it will determine only what companies must report. It won’t prevent them from offering up their own, additional metrics of success.
PJM Interconnection has some ideas, as does the state of New Jersey.
We’ve already talked this week about Pennsylvania asking whether the modern “regulatory compact,” which grants utilities monopoly geographical franchises and regulated returns from their capital investments, is still suitable in this era of rising prices and data-center-driven load growth.
Now America’s biggest electricity market and another one of that market’s biggest states are considering far-reaching, fundamental reforms that could alter how electricity infrastructure is planned and paid for across a region serving more than 65 million Americans.
New Jersey Governor Mikie Sherrill anchored her 2025 campaign on electricity prices, and for good reason — in the past four years, electricity prices in the state have gone up 48%, according to Heatmap and MIT’s Electricity Price Hub, while average bills have risen from $83 per month to $130. On her first day in office, Sherrill issued two executive orders acting on that promise, directing the state to make funds available to freeze rates and declaring a state of emergency to ease the way to building more generation.
Included in that first order was a review of utility business models to be carried out by state regulators. What that review will entail is now coming into focus.
On Wednesday, the New Jersey Board of Public Utilities issued a statement announcing that it will look specifically at “whether New Jersey’s century-old utility business model — one that rewards electric distribution companies (EDCs) for capital spending even when cheaper alternatives exist — should be replaced with a framework tied to performance, affordability, and long-term cost stability.” In case anyone was still unsure as to what the outcome of said study might be, the board added that it is “expected to drive the most significant restructuring of utility regulation in New Jersey in decades.”
The current system, the board’s president Christine Guhl-Savoy said at a hearing Thursday, “creates a structural incentive to favor capital-intensive solutions, even when lower-cost, non-wires, or demand-side alternatives may be available.”
This structure, she said, could help explain why “over the past decade, electric delivery charges in New Jersey have risen steadily.” Within the service territory of PSEG, one of the four major New Jersey utilities, distribution charges alone have risen from $19.24 per month in January 2020 (as far back as the Heatmap-MIT data goes) to $21.84 as of April, while transmission charges have risen from around $20 to just over $29 per month. Many critics of the utility business model point to high levels of local grid spending on distribution as a way that utilities pad their earnings with returns harvested from ratepayers.
In the system regulators explored at the hearing, new projects would get a more skeptical look, and utilities’ payouts would be partly determined by whether they hit pre-defined service goals. NJBPU executive director Bob Brabston also indicated that the review process would take a close look at utilities’ regulated returns on equity — echoing his neighbor across the Delaware River, Pennsylvania Governor Josh Shapiro, who wrote in a letter to his state’s utilities earlier this week that these returns must be “transparent” and “justifiable,” and no longer be based on “educated guesses.”
“We want to make sure that the actual cost of equity and the returns on equity are close,” Brabston said Thursday. “We don’t want there to be a significant gap between the cost of equity that you all experience and the returns that the agency awards.”
Meanwhile, in Valley Forge, Pennsylvania, the framework within which New Jersey’s utilities exist is coming in for its own examination.
PJM Interconnection — the nation’s largest electricity market, which covers not just Pennsylvania and New Jersey but also part or all of 11 other states — released an almost 70-page paper Wednesday, in which the organization’s president David Mills wrote that “the current situation is not tenable.”
PJM has been the poster child for a host of issues plaguing the electricity markets across the country, including fast-rising prices, a failure to quickly bring on new generation, and an inability to assure the market’s preferred level of reserve reliability. This set of challenges, Mills said in the paper’s introduction, “reflects something more fundamental than a design that needs recalibration.” Instead, PJM must consider “whether the foundational assumptions of the market remain valid – and if not, what a valid set of assumptions would require.”
The problem with the electricity market, he argued, can be solved by more markets. Right now, when prices shoot up, governments intervene with price caps, suppressing the market signal necessary to bring on sufficient generation that would bring down prices.
To replace that system, the paper proposes three possible models. The first, which it calls “Stabilized Markets,” would allow capacity to be procured for several years at a time outside of the current auction system, so that utilities could make sure their basic needs were covered before they go into the annual auctions. This would provide long-term security for new investment.
The second path would be a more fundamental reform. This “Differential Reliability” approach would do away with the “shared reliability compact,” under which all loads must be served by the system at all times. Instead, PJM would “develop the operational and commercial framework to explicitly differentiate reliability,” incentivizing approaches like bring-your-own-generation or curtailing power for new large sources of demand.
The third path is an “Energy Market Transition,” which might also be called the “Texas option.” Following this path, the capacity market would shrink as a portion of revenues earned by generators, and more revenue would come from real-time or near-real-time electricity sales.
While this path isn’t “full Texas” (ERCOT doesn’t have a capacity market at all), it would mean allowing for higher real-time energy prices, a.k.a. “scarcity pricing,” which is arguably the defining feature of the ERCOT system (though even that was scaled back when prices got too high).
“The choices embedded in these paths involve genuine trade-offs, and those trade-offs affect different stakeholders uniquely,” the paper says. If PJM has learned anything in the past few years, it’s that it doesn’t get to make decisions on its own. Those stakeholders will get their say, one way or another.
Big fundraises for Nyobolt and Skeleton Technologies, plus more of the week’s biggest money moves.
Following a quiet week for new deals, the industry is back at it, with capital flowing into some of its most active areas. My colleague Alexander C. Kaufman already told you about one of the more buzzworthy announcements from data center-land in Wednesday’s AM newsletter: Wave energy startup Panthalassa raised $140 million in a round led by Peter Thiel to “perform AI inference computing at sea” using nodes powered by the ocean’s waves.
This week also saw fresh funding for more conventional data center infrastructure, as Nyobolt and Skeleton Technologies both announced later-stage rounds for data center backup power solutions. Meanwhile, it turns out Redwood Materials is not the only company bringing in significant capital for second-life EV battery systems — Moment Energy just raised $40 million to pursue a similar approach. Elsewhere, investors backed an effort to rebuild domestic magnesium production, and, in a glimmer of hope for a sector on the outs, gave a boost to green cement startup Terra CO2.
Cambridge-based startup Nyobolt has become the latest battery company to reach a $1 billion valuation, with its expansion into the data center market helping fuel excitement around its tech. Spun out of University of Cambridge research in 2019, the company develops ultra-fast-charging batteries based on a modified lithium-ion chemistry. Its core innovation is an anode made from niobium tungsten oxide, which Nyobolt says enables its batteries to charge to 80% in less than five minutes, with a cycle life that’s 10 times longer than conventional lithium-ion, all without the risk of fire.
The company has now raised a $60 million Series C, following what it describes as a period of “rapid commercial momentum,” with revenue increasing five-fold year-over-year as customers in the robotics and data center industries piled in. Symbotic, an autonomous robotics company and existing customer, led the latest round. While Symbotic previously relied on supercapacitors to power its robots, Nyobolt says its batteries provide six times more energy capacity in a lighter package, allowing its warehouse robots to work for retailers like Walgreens, Target, and Kroger around the clock.
Now the startup is targeting data center customers too, positioning its tech as a fast-acting fix for the sudden power surges common to large-scale artificial intelligence workloads, as well as a temporary backup power solution for outages. While it has no confirmed domestic data center customers to date, it does have a nonbinding agreement with the Indian state of Rajasthan to deploy over 100 megawatts of off-grid AI data center and power management infrastructure, part of a broader push to expand its presence across the country.
Notably, the press release made no mention of plans to sell its tech to electric vehicle automakers, though this appears to have been a central focus previously. As recently as last summer, executive vice president Ramesh Narasimhan told the BBC that he hoped Nyobolt’s batteries would “transform the experience of owning an EV.” But while its tech does enable extremely fast charging, its underlying chemistry is not optimized for long-range driving. A sports car built to test the company’s batteries had a range of just 155 miles. So like many of its climate tech peers, the company appears to be betting that data centers now represent a more reliable opportunity.
This week brought additional news from another European player aiming to smooth out data center power surges. Estonia-based supercapacitor startup Skeleton Technologies raised $39 million in what it describes as the first close of a pre-IPO funding round, with a U.S. listing planned for next year. Its core tech is built around a “curved graphene” structure, which the company likens to a crumpled sheet of paper with a high surface area. The graphene’s many exposed surfaces and edges allow it to hold more electric charge, which Skeleton says delivers a 72% improvement in energy density.
Like Nyobolt, Skeleton says its tech offers faster response times and longer cycle life. But supercapacitors are a fundamentally different technology than Nyobolt’s modified lithium-ion solution. Though they offer near-instantaneous response times, they store very little energy — just enough to smooth out microsecond power spikes in GPU workloads. Nyobolt’s batteries, by contrast, aim not only to smooth out data center power spikes, but also to deliver about 90 seconds of backup power in the case of an outage, before a generator or other backup source kicks in.
Skeleton is already mass-producing supercapacitors in Germany and delivering to unnamed “major U.S. hyperscalers for AI infrastructure.” It’s also making moves to expand its U.S. footprint ahead of its pending IPO, opening an engineering facility in Houston and aiming to begin domestic manufacturing of AI data center solutions in the first half of this year.
Last year brought a wave of new climate tech coalitions, with one of the most ambitious efforts known as the All Aboard Coalition. This group of venture firms is targeting the investment gap known as the missing middle, which falls between early-stage venture rounds and infrastructure funding. The model is relatively mechanical: When three or more member firms participate in a later-stage round for a company, the coalition automatically coinvests out of its own fund, matching the members’ combined contribution.
The group made its first investment in January, supporting the AI-powered geothermal exploration and development company Zanskar’s Series C round. This week, it announced its second: a $22 million commitment to low-carbon cement startup Terra CO2, bringing the company’s Series B total to $147 million. Cement production accounts for roughly 8% of global emissions, a figure Terra aims to shrink by making so-called “supplementary cementitious materials” — which can partially displace traditional cement in concrete mixes — from abundant silicate rocks. By grinding and thermally processing these rocks into a glassy powder, Terra’s product mimics the properties of conventional cement. The company says it can replace up to 50% of the cement in typical concrete mixes, lowering associated emissions by as much as 70%.
The new funding will help Terra build its first commercial-scale plant in Texas, exactly the type of first-of-a-kind project that the coalition was designed to support. But the scale of this challenge remains clear. As noted in ImpactAlpha’s coverage, the coalition has raised just $100 million toward its goal of a $300 million fund — already a relatively modest goal considering the capital intensity of novel infrastructure projects. Bloomberg previously reported that the group aimed to raise the full amount by the end of October 2025, raising questions about the willingness of LPs to bet on projects at this crucial but capital-intensive juncture.
When I think about repurposing used electric vehicle batteries for stationary storage, I think of battery recycling giant Redwood Materials, which raised a $425 million Series E in January after moving aggressively into this promising market. But while Redwood’s well-established recycling business certainly provides it with the largest pipeline of used batteries, it’s far from the only company pursuing this business model. A smaller player with a largely similar approach underscored that this week, when it announced a $40 million Series B to scale its gigafactory in Texas and expand its facilities in British Columbia.
That’s Moment Energy, which focuses on using second-life EV batteries to power commercial and industrial sites such as data centers, hospitals, and factories. Like Redwood, it relies on proprietary software to aggregate battery packs with myriad chemistries and design specs into coordinated grid-scale systems. What the company sees as its critical differentiator, however, is its safety standards. Moment has achieved UL certification, a key safety benchmark that it says others in the industry have yet to meet.
In a shot at its competitors, the company described itself in a press release as the “only provider proven capable of deploying second-life battery storage systems in the built environment without special dispensations or regulatory loopholes.” While Moment never names names, Redwood’s first commercial-scale system sits on its own private land in an open-air setting, where certification is arguably unnecessary. “What most other second life [battery] companies are now trying to say is, let’s just lobby to make second life UL certification easier, because it is impossible to get UL certification, as it stands,” the company’s CEO, Edward Chiang, told TechCrunch. “But at Moment, we say that’s not true. We got it.”
As I wrote last September, it’s a good time to be a critical minerals startup, because as you may have heard, “critical minerals are the new oil.” These materials sit at the center of modern energy infrastructure — batteries, magnets, photovoltaic cells, and electrical wiring, to name just a few uses — and their supply is concentrated in geopolitically tense regions and subject to extreme price volatility. It also certainly doesn’t hurt that the Trump administration loves them and wants to mine and refine way more of them in the U.S.
The latest beneficiary of this enthusiasm is Magrathea, which this week raised a $24 million Series A to build what it says will be the only new magnesium smelter in the U.S., in Arkansas. The company has now raised over $100 million in total, including a $28 million grant from the Department of Defense. Its approach relies on an electrolysis-based process that’s able to extract pure magnesium from seawater and brines, which it positions as a cleaner, cheaper alternative to the high-heat, emission-intensive method that China uses to produce most of the world’s magnesium today.
The U.S. military has taken note of this potential new domestic supply. Magrathea’s 2022 seed round coincided with Russia’s invasion of Ukraine, as the military looked to scale domestic defense tech supply chains. Magnesium alloys are often used to help reduce weight in EV components, a benefit equally applicable to military helicopters, drones, and next-generation fighter jets. So while these defense applications represent somewhat of a pivot from the startup’s initial focus, a greener fighter jet is still better than a dirty fighter jet.