The surge in electricity demand from data centers is making innovation a necessity.

Electric utilities aren’t exactly known as innovators. Until recently, that caution seemed perfectly logical — arguably even preferable. If the entity responsible for keeping the lights on and critical services running tries out some shiny new tech and it fails, heating, cooling, medical equipment, and emergency systems could all trip offline. People could die.
“It’s a very conservative culture for all the right reasons,” Pradeep Tagare, a vice president at the utility National Grid and the head of its corporate venture fund, National Grid Partners, told me. “You really can’t follow the Silicon Valley mantra of move fast, break things. You are not allowed to break things, period.”
But with artificial intelligence-driven load growth booming, customer bills climbing, and the interconnection queue stubbornly backlogged, utilities now face little choice but to do things differently. The West Coast’s Pacific Gas and Electric Company now has a dedicated grid-innovation team of about 60 people; North Carolina-based utility Duke Energy operates an emerging technologies office; and National Grid, which serves U.S. customers in the Northeast, has invested in about 50 startups to date. Some 64% of utilities have expanded their innovation budgets in the past year, according to research by National Grid Partners, while 42% reported working with startups in some capacity.
The innovators on these teams are well aware that their reputation precedes them when it comes to bringing novel tech to market — and not in a flattering way. “I think historically we’ve done a poor job partnering with too many companies and spreading ourselves thin,” Quinn Nakayama, the senior director of grid research, innovation, and development at PG&E, told me. That’s led to a pattern known as “death by pilot,” in which utilities trial many promising solutions but are too risk-averse, cost-conscious, and slow-moving to deploy them, leaving the startups with no natural customers.
It doesn’t help that regulators such as public utilities commissions understandably require new investments to meet a strict “prudency” standard: proof that they can achieve the desired result at the lowest reasonable cost, consistent with good practice. Yet this can be a high bar for tech that remains untested at scale. And because investor-owned utilities earn a guaranteed rate of return on approved infrastructure investments, they’re incentivized to pursue capital-intensive projects over smaller efficiency improvements. Freedom from the pressure of a competitive market has also traditionally meant freedom from the pressure to innovate.
But that’s changing.
To help bridge at least some of these divides, National Grid Partners set up a business development unit specifically for startups. “Their sole job is to work with our portfolio companies, work with our business units, and make sure that these things get deployed,” Tagare told me. Over 80% of the firm’s portfolio companies, he said, now have tie-ups of some sort with National Grid — be that a pilot or a long-term deployment — while “many” have secured multimillion-dollar contracts with the utility.
While Tagare said that National Grid Partners is already reaping the benefits of investments in AI to streamline internal operations and improve critical services, hardware is slower to get to market. The startups in this category run the gamut from immediately deployable technologies to those still five or more years from commercialization. LineVision, a startup operating across parts of National Grid’s service territories in upstate New York and the U.K., is a prime example of the former. Its systems monitor the capacity of transmission lines in real time via sensors and environmental data analytics, allowing utilities to safely push 20% to 30% more power through the wires as conditions permit.
There’s also TS Conductor, a materials science startup that’s developed a novel conductor wire with a lightweight carbon core and aluminum coating that can double or triple a line’s capacity without building new towers and poles. It’s a few years from achieving the technical and safety validation necessary to become an approved supplier for National Grid. Then, five or more years down the line, National Grid Partners hopes to deploy the startup Veir’s superconductors, which promise to boost transmission capacity five- to tenfold with materials that carry electricity with virtually zero resistance. But because this requires cooling the lines to cryogenic temperatures — and the bulky insulation and cooling systems needed to do so — it necessitates a major infrastructure overhaul.
PG&E, for its part, is pursuing similar efficiency goals as it trials tech from startups including Heimdall Power and Smart Wires, which aim to squeeze more power out of the utility’s existing assets. But because the utility operates in California — the U.S. leader in EV adoption, with strong incentives for all types of home electrification — it’s also focused on solutions at the grid edge, where the distribution network meets customer-side assets like smart meters and EV charging infrastructure.
For example, the utility has a partnership with smart electric panel maker Span that allows customers to adopt electric appliances such as heat pumps and EV chargers without the need for expensive electrical upgrades. Span’s device connects directly to a home’s existing electric panel, enabling PG&E to monitor and adjust electricity use in real time to prevent the panel from overloading while letting customers decide which devices to prioritize. Another partnership, with the smart infrastructure company Itron, has similar aims — allowing customers to get EV fast chargers without a panel upgrade, with the company’s smart meters automatically adjusting charging speed based on panel limits and local grid conditions.
Of course, it’s natural to question how motivated investor-owned utilities really are to deploy this type of efficiency tech — after all, the likes of PG&E and National Grid make money by undertaking large infrastructure projects, not by finding clever ways to avoid them. And while neither Nakayama nor Tagare can deny what appears to be a fundamental misalignment of incentives, both argue that there’s so much infrastructure investment needed — more than the utilities can handle — that the friction is a non-issue.
“We have capital coming out of our ears,” Nakayama told me. Given that, he said, PG&E’s job is to accelerate interconnection for all types of loads, which will bring in revenue to offset the cost of the upgrades and thus lower customer rates. Tagare agreed.
“At least for the next — pick a number, five, seven, 10 years — I don’t see any of this slowing down,” he said.
And yet despite all that capital flow, PG&E still carries billions of dollars in wildfire-related financial obligations after its faulty equipment was found to have sparked a number of blazes in Northern California in 2017 and 2018. The resulting legal claims drove the utility into bankruptcy in 2019, before it restructured and reemerged the following year. But the threat of wildfires in its service territory still looms large, which Nakayama said limits the company’s ability to allocate funds toward the basic poles-and-wires upgrades that are so crucial for easing the congested interconnection queue and bringing new load online.
Nakayama wants California’s legislature and courts to revise rules that make utilities strictly liable for wildfires caused by their equipment, even when all safety and mitigation procedures were followed. “In order for me to feel comfortable moving some of my investments out of wildfire into other areas of our business in a more accelerated fashion, I have to know that if I make the prudent investments for wildfire risk mitigation, I’m not going to be held liable for everything in my system,” he told me.
And while wildfire prevention itself is an area rich with technical innovation and a central focus of the utility’s startup ecosystem, Nakayama emphasizes that PG&E has a host of additional priorities to consider. “We need [virtual power plants]. We need new technologies. We need new investments. We need new capital. We need new wildfire-related liability,” he told me.
Utilities — especially his — rarely get seen as the good guys in this story. “I know that PG&E gets vilified a lot,” Nakayama acknowledged. But he and his colleagues are “almost desperate to try to figure out how to bring down rates,” he promised.
Fights over AI-related developments outnumber those over wind farms in the Heatmap Pro database.
Local data center conflicts in the U.S. now outnumber clashes over wind farms.
More than 270 data centers have faced opposition across the country, compared with 258 onshore and offshore wind projects, according to a review of data collected by Heatmap Pro. Data center battles only recently overtook wind turbines, driven by the sudden spike in backlash to data center development over the past year. It’s indicative of how the intensity of opposition to big tech infrastructure is surging past current and historical resistance to wind.
Battles over solar projects have still occurred far more often than fights over data centers — nearly twice as many times, per the data. But in terms of capacity, the sheer amount of data center demand that has faced opposition nearly equals that of solar: more than 51 gigawatts.
Taken together, these numbers describe the tremendous power involved in the data center wars, which is now comparable to the entire national fight over renewable energy. One side of the brawl is demand, the other supply. If this trend continues, it’s possible the scale of tension over data centers could one day surpass what we’ve been tracking for both solar and wind combined.
The enhanced geothermal darling is spending big on capex, but its shares will be structured more like a software company’s.
Fervo, the enhanced geothermal company that uses hydraulic fracturing techniques to drill thousands of feet into the Earth to find pockets of heat to tap for geothermal power, is going public.
The Houston-based company was founded in 2017 and has been a longtime favorite of investors, government officials, and the media (not to mention Heatmap’s hand-selected group of climate tech insiders) for its promise of producing 24/7 clean power using tools, techniques, and personnel borrowed from the oil and gas industry.
After much speculation as to when it would go public, Fervo filed the registration document for its initial public offering on Friday evening. Here’s what we were able to glean about the company, its business, and the geothermal industry from the filing.
The main theme of the document, known as an S-1, is the immense potential of enhanced geothermal — and, thus, of Fervo.
The company says that its Cape Station site in Utah, where it’s currently developing its flagship power plants, has “4.3 gigawatts of capacity potential” alone. That’s more than the 3.8 gigawatts of conventional geothermal capacity currently on the grid. Enhanced geothermal technology, otherwise known as EGS, “has the potential to make geothermal generation as ubiquitous as solar generation is in the U.S. today,” the company projects. (There’s about 280 gigawatts of installed solar capacity currently in the U.S., according to the Solar Energy Industries Association.) “A broader subset of our reviewed leases represents over 40 gigawatts” of capacity, the document goes on.
Like all investor pitches, the S-1 features some eye-popping “total addressable market” figures. Citing analysis by the consulting firm Rystad, the document says that if there’s a sufficient shortfall in capacity due to retiring power plants (98 gigawatts by 2035), the annual market for enhanced geothermal would be approximately $70 billion by 2035, and that this would represent some $2.1 trillion in revenue potential over 30 years.
The company is already producing 3 megawatts at its Project Red site in Nevada for the state’s grid as part of a deal with Google. It also expects to begin generating power from the Cape Station site “by late 2026,” according to the filing, and get up to 100 megawatts “by early 2027.” In total, Fervo has “658 megawatts of binding power purchase agreements,” which it says represents “approximately $7.2 billion in potential revenue backlog.”
Beyond that, Fervo says it has 2.6 gigawatts “in advanced development,” and “over 38 gigawatts” in “early-stage development,” where it’s still doing feasibility studies to “validate and confirm the path toward commercial development.”
Fervo says that capacity at its Cape Station facility will come in at around $7,000 per kilowatt. That’s already competitive with “traditional and small modular nuclear power,” which the Department of Energy has estimated costs $6,000 to $10,000 per kilowatt, the filing says. Fervo is aiming to get total project costs down to $3,000 per kilowatt, at which point it says it would outcompete natural gas without the price volatility that comes with fluctuating fuel costs.
But Fervo’s upfront spending is still immense. Fervo says that it expects some $1.2 billion in capital expenditure this year, of which only $125 million is going toward the first phase of its Cape Station project, which it has said would deliver 100 megawatts of power. (Meanwhile, the $940 million it expects to spend on the second phase, which is due to be 400 megawatts, is mostly unfunded.) The company says the public offering will fund “project-level capital expenditures,” as well as land holdings and general corporate expenditures.
Google comes up some 36 times in the document, mostly in reference to the “Geothermal Framework Agreement” Fervo signed with the hyperscaler this past March. The S-1 describes the deal as a “3-gigawatt framework agreement … to advance and structure potential power offtake opportunities for current and planned data centers in both grid-connected and alternative energy solutions.” This deal, the company says, “establishes a structured process for the development of geothermal projects across specified regions of the United States,” and could involve the offtake by Google of up to 3 gigawatts of Fervo-generated electricity by the end of 2033.
What the framework is not is a power purchase agreement. One of the risk factors Fervo lists in the IPO document says, “The GFA is a non-binding agreement, and does not obligate Google to purchase power from us.” Instead, it is “a binding framework under which we may propose geothermal development projects to Google, but it does not obligate Google to accept any project, execute any power purchase agreement or provide us with any project financing.”
The agreement also places limits on Fervo, including from whom it can accept investment or financing. (The deal outlines a “broad category of entities defined as competitors,” which are all no-nos.) Overall, the company says, the arrangement gives Google “significant priority over our near-term development pipeline and may limit our flexibility to pursue alternative commercial, strategic, or financing arrangements that would otherwise be available to us.”
Upon going public, the company will have two classes of stock: Class A shares available to the public, and Class B shares owned by its founders, chief executive officer Tim Latimer and chief technology officer Jack Norbeck. The Class B shares will have 40 times the voting rights of the Class A shares and will allow Latimer and Norbeck to “collectively continue to control a significant percentage of the combined voting power of our common stock and therefore are able to control all matters submitted to our stockholders for approval.”
These arrangements are familiar from venture-backed, founder-led software companies. Alphabet and Meta are the most prominent examples of large, publicly traded companies that are under the effective control of their founders thanks to dual-class share structures. Tesla, rather famously, does not have a dual-class share structure, which is why CEO Elon Musk convinced his board to award him more shares so that he would maintain a high degree of influence over the company.
While other technology companies such as Stripe pile up billions in revenue without any near-term prospects of going public, Fervo largely has spending to report on its income statement.
In 2025, the company reported just $138,000 in revenues with a $58 million net loss; that’s compared to a $41 million net loss in 2024. The revenues were “ancillary fees associated with rights to geothermal production at Project Red,” the company said. “This type of revenue is not expected to be significant to our long-term revenue generation, as we have not yet commenced large-scale commercial operations.”
And there’s more spending to come.
Fervo expects that the second phase of its Cape Station project will “require approximately $2.2 billion in capital expenditures through 2028,” which it hopes to pay for with project-level financing.
Fervo said it is “continuing to evaluate the effect of the OBBB” — that is, the One Big Beautiful Bill Act, which slashed or curtailed tax credits for clean energy companies — and that it wasn’t able to “reasonably” estimate the effect on its financial statements by the end of last year. The company does say, however, that it “may benefit from ITCs and PTCs (including the energy community and domestic content bonuses available under the ITC and PTC, in certain circumstances) with respect to qualifying renewable energy projects,” referring to the investment and production tax credits, which acquired a strict set of eligibility rules under OBBBA. It cautioned that the current guidance regarding tax credit eligibility is “subject to a number of uncertainties,” and that “there can be no assurance that the IRS will agree with our approach to determining eligibility for ITCs and PTCs in the event of an audit.”
The company also disclosed that earlier this month, it reached a deal with Liberty Mutual, the insurance company, “to sell and transfer tax credits generated at Cape Station Phase I,” taking advantage of a provision of the law that allows credits to be sold to other entities with tax liability, rather than just harvested by investors in the project.
The COVID-era political divide is still having ripple effects.
Six years ago this month, the Centers for Disease Control and Prevention began advising even healthy individuals to wear face coverings to protect themselves against the spread of what we were then still calling the “novel coronavirus.” Mask debates, mandates, bans, and confrontations followed. To this day, in the right parts of the country, covering your face will still earn you dirty looks, or worse.
If there were ever another year to have an N95 on hand, though, it’s this one. This winter was the warmest on record in nine U.S. states; Oregon, Colorado, Utah, and Montana have also recorded some of their lowest snowpacks since record-keeping began. That cues up the landscape in the West for “above normal significant fire potential,” in the words of the National Interagency Fire Center, which issues predictive outlooks for the season ahead. And it’s not just the West: the 642,000-acre Morrill grass fire, which ignited in early March, was the largest in Nebraska’s history, while exceptional drought conditions stretching from East Texas through Florida have set the stage for “well above normal fire activity” heading into the spring lightning season. As of the end of March, wildfires had already burned more than 1.6 million acres in the U.S., or 231% of the previous 10-year average.
“Air pollution is the most significant toxic environmental exposure that the average person is ever subjected to, and wildfire smoke in particular is probably the most toxic type of air pollution [they’re] ever exposed to,” Brian Moench, the president of Utah Physicians for a Healthy Environment, a nonprofit clean-air advocacy group, told me.
Our understanding of just how dangerous that smoke is grows by the year. After having their grant pulled by the Trump administration, researchers at the University of California, Davis Health and UCLA persisted in publishing a report this winter reviewing more than 8.6 million births in California and demonstrating a link between exposure to wood smoke during pregnancy and an increased likelihood of autism. Another report, also published this winter by researchers from UCLA, estimated that the particulate matter from wildfire smoke is responsible for nearly 25,000 deaths per year in the United States, with no safe threshold for exposure.
“If a person is in a circumstance where they really can’t avoid wildfire smoke,” Moench added, “they absolutely should be doing everything they can to protect themselves.”
As public health offices around the country will tell you, one of the best ways to do just that is by donning an effective mask. N95 respirators specifically are about 95% effective at protecting the wearer against the dangerous particulates in wildfire smoke (although not against gases or asbestos). Though not recommended by public health departments due to their comparative ineffectiveness, even surgical and cloth masks can offer limited particulate protection of about 68% and 33%, respectively.
But you have to actually wear them. After the Los Angeles fires in early 2025, health officials warned that exposure to toxic ash and dust remained a threat even after the air quality index returned to safe levels; one public health official who spoke to The New York Times recommended wearing a face mask for at least a month after the fires, a duration likely to feel interminable to all but the most cautious of people. “I think there’s a reluctance on the part of a lot of people to wear masks based not on anything other than they don’t want to make a political statement with their public outings,” Moench said. “I think there are a lot of people who just want to shy away from the controversy that they represent, irrespective of whether or not it’s a good idea.”
Moench has first-hand experience with the frustrations of promoting lung health in the polarized, post-COVID world of masking. Last year, Utah lawmakers floated a statewide mask ban with exceptions only for Halloween and masquerades — but not for legitimate health concerns such as poor air quality due to wildfire smoke. Though the ban was swiftly shot down, in part due to the outcry from disability advocates and environmental health groups, including Utah Physicians for a Healthy Environment, the fact that the legislature considered it at all underscores how divisive masks remain, even years after mandates ended.
Many in public health have approached post-COVID messaging around masking by promoting scientific facts. Bev Stewart, the regional director of health initiatives at the American Lung Association of the Mountain Pacific, told me that in her experience, “It’s rare that somebody would say, ‘I would never, under any circumstance, wear a mask.’” She called the process of trying to reach skeptics a “conversation,” noting that there tends to be “a large misunderstanding about how lungs work” — namely, that masks offer protections that extend beyond the associations with the pandemic.
“Many types of air quality concerns could be mitigated with masks,” Stewart told me. “Sometimes we’re just thinking too narrowly about one specific instance and forgetting the forest for the trees.”
Others I spoke to, though, were doubtful that the populations who are most resistant to mask-wearing could be reached through facts alone. A portion of the country has “lost all respect for empirical evidence, facts, and science — virtually everything that modern civilization was based upon,” Moench said.
Jonas Kaplan, an associate professor of psychology at the University of Southern California, has put numbers to Moench’s conjecture. During the COVID pandemic, Kaplan studied how messaging can reach anti-maskers, discovering that when “information about masks was framed in terms of pure science, there was no significant reduction in anti-mask beliefs or change in mask-wearing behavior.”
Kaplan told me that a lot of the resistance in the anti-masking community comes down to, “What will people in public think of me? What would my friends think of me?” The most effective messages, he’s found, are those that speak to in-group values rather than presenting straight facts. “It wasn’t like, ‘Studies show that this is safe …’” broke through with the skeptics, Kaplan said. “It was more about emphasizing, ‘This is important, and we should care about it.’”
Science, though, does still have a vital role to play. Though we already have a better understanding of the impacts of smoke exposure than we did even a few years ago, more research is needed into its long-term effects. That will also give us greater clarity into how best to protect the more than 25 million Americans who are exposed to wildfire smoke every year — both physically, via better masks and air filters, and through better public health messaging.
“Smoke by itself — we know what’s in it, and we know you don’t want to breathe it in,” Emily Fischer, a leading expert on air pollution and a researcher and professor at Colorado State University, told me. “We also know that there are protective actions that families can prepare for, and do their best to take.”
Unfortunately, under the Trump administration, the Environmental Protection Agency, the National Oceanic and Atmospheric Administration, and the National Science Foundation, which had previously led research in the area, have drastically reduced their funding. Just this week, The Hill reported that NOAA has cut off grant funding to the University of Colorado’s Cooperative Institute for Research in Environmental Sciences, which, in addition to research into greenhouse gases, has extensively studied wildfire-related air pollution.
Fischer has been affected, too. “My team has had grants terminated related to air quality and protecting public health, and that’s really sad because the smoke doesn’t care if you’re a kid, if you’re elderly, or if you live in a red or blue state,” she said. “Families really need to think right now about how to protect themselves and their loved ones” against the smoke ahead, she told me.