Maybe you’ve never heard of it. Maybe you know it too well. But to a certain type of clean energy wonk, it amounts to perhaps the three most dreaded words in climate policy: the interconnection queue.
The queue is the process by which utilities decide which wind and solar farms get to hook up to the power grid in the United States. Across much of the country, it has become so badly broken and clogged that a given project can take more than a decade to get through it.
On this week’s episode of Shift Key, Jesse and Rob speak with two experts about how to understand — and how to fix — what is perhaps the biggest obstacle to deploying more renewables on the U.S. power grid. Tyler Norris is a doctoral student at Duke University’s Nicholas School of the Environment. He was formerly vice president of development at Cypress Creek Renewables, and he served on North Carolina Governor Roy Cooper’s Carbon Policy Working Group. Claire Wayner is a senior associate at RMI’s carbon-free electricity program, where she works on the clean and competitive grids team. Shift Key is hosted by Robinson Meyer, the founding executive editor of Heatmap, and Jesse Jenkins, a professor of energy systems engineering at Princeton University.
Subscribe to “Shift Key” and find this episode on Apple Podcasts, Spotify, Amazon, or wherever you get your podcasts.
You can also add the show’s RSS feed to your podcast app to follow us directly.
Here is an excerpt from our conversation:
Robinson Meyer: Can I interject and just ask why the interconnection queue got so much longer over the past decade? Because over the past 10 to 15 years, the U.S. grid changed a lot in character and in fuel type, right? We went from burning a lot of coal to burning a lot of natural gas. And that transition is often cited as one of the few energy transitions anywhere in the world to happen at the speed with which we would need to decarbonize. Obviously, switching from coal to gas is not decarbonizing, but it happened fast enough to be a good model for what decarbonizing would look like in order to meet climate goals.

Evidently, that transition did not run into these same kinds of interconnection queue problems. Why is that? Is it because we were swapping fuels within individual power plants, just changing a coal furnace for a gas furnace? Is it because those were larger projects, so they didn’t back up the queue in the same way that a lot of smaller solar and wind farms do?
Claire Wayner: I would say all the reasons you just gave are valid, yeah. The coal-to-gas transition likely involved a lot of the same geographic locations. With wind and solar, developers want to build in a lot of new, rather remote locations that will require types of grid upgrades the coal-to-gas transition just didn’t.
Jesse Jenkins: Maybe it is, to use a metaphor here, a little bit like traffic congestion. If you add a generator to the grid, it’s trying to ship its power through the grid, and your decision to add power to the mix combines with everyone else’s decisions to generate and consume power, driving traffic jams, or congestion, in different parts of the grid. It’s just like your decision to hop in the car and drive to work, or to go into the city for the weekend to see a show, or whatever you’re doing: It’s not just your decision. It’s everyone’s combined decisions that affect travel times on the grid.

Now, the big difference between the grid and travel on roads, or most other networks we’re used to, is that you don’t get to choose which path to go down. If you’re sending electricity into the grid, it flows according to physics, down the path of least resistance, or impedance, which is the alternating-current equivalent of resistance. So it’s a lot more like rivers flowing downhill under gravity, right? You don’t get to choose which branch of the river you go down; gravity takes you. And so adding your power flows to the grid creates complicated flows, based on the physics of a mesh network that spans a continent and interacts with everyone else on the grid.
And so when you go from adding a few dozen large natural gas generators that operate very similarly to the plants they’re replacing, to adding hundreds of gigawatts across thousands of projects scattered all over the grid, with very complicated generation profiles because they’re weather-dependent renewables, it’s just a completely different challenge for the utilities.

So the process that the regional grid operators developed in the 2000s, when they were restructuring and taking on that role, is just not fit for purpose at all for what we face today. And I want to highlight another thing you mentioned, which is the software piece of it, too. These processes run on software and corporate workflows that were also developed 10 or 20 years ago. We all know that software and computing techniques have gotten quite a bit better over a decade or two, and utilities and grid operators have rarely kept pace with those capabilities.
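To make Jenkins’s path-of-least-impedance point concrete, here is a minimal sketch of the current-divider effect he describes, using a simple DC power-flow approximation; the line reactances and the 300-megawatt injection are invented numbers, not real grid data:

```python
# Minimal illustration: in a DC power-flow approximation, power injected
# across two parallel transmission lines splits in inverse proportion to
# each line's reactance (the AC analogue of resistance).
# All numbers here are invented for illustration.

def split_flow(total_mw: float, reactances: list[float]) -> list[float]:
    """Split an injection across parallel lines, inversely to reactance."""
    susceptances = [1.0 / x for x in reactances]  # b = 1/x
    total_b = sum(susceptances)
    return [total_mw * b / total_b for b in susceptances]

# 300 MW injected onto two parallel paths: a low-impedance line (x = 0.1)
# and a higher-impedance line (x = 0.3), in per-unit terms.
print(split_flow(300.0, [0.1, 0.3]))  # [225.0, 75.0]
# Most of the power crowds onto the low-impedance path, whether or not
# that line has room for it -- hence congestion.
```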
Wayner: Can I just say, I’ve heard that in some regions, interconnection still consists of sending Excel files back and forth. To Tyler’s earlier point that we’re only just now getting nationwide data on the interconnection queue and how it stands, one challenge developers face is a lack of data transparency and rapid processing from the transmission providers and grid operators.

And so, to use an analogy my colleague Sarah Toth uses a lot, which I really love: Imagine if we had a Domino’s pizza tracker for the interconnection queue, so that developers could just log on and see how their projects are doing. In many, if not most regions, they don’t even have that visibility. They don’t know when their pizza is going to get delivered, or whether it’s even in the oven.
Music for Shift Key is by Adam Kromelow.
Elemental Impact, Breakthrough Energy, Speed & Scale, Stanford, Energy Innovation, and McKinsey are all partnering to form the “Climate Tech Atlas.”
The federal government has become an increasingly unreliable partner to climate tech innovators. Now venture capitalists, nonprofits, and academics are embracing a new plan to survive.
On Thursday, an interdisciplinary coalition — including Breakthrough Energy, McKinsey, and Stanford University’s Doerr School of Sustainability — unveiled the Climate Tech Atlas, a new plan to map out opportunities in the sector and define innovation imperatives critical to the energy transition.
The goal is to serve as a resource for stakeholders across the industry, drawing their focus toward the technological frontiers the alliance sees as the most viable pathways to economy-wide decarbonization. The idea is not to eliminate potential solutions, but rather “to enable the next generation of innovators, entrepreneurs, researchers, policymakers, and investors to really focus on where we felt there was the largest opportunity for exploration and for innovation to impact our path to net zero through the lens of technology,” Cooper Rinzler, a key collaborator on the initiative and a partner at the venture capital firm Breakthrough Energy Ventures, told me.
Other core contributors include the nonprofit investor Elemental Impact, John Doerr’s climate initiative Speed & Scale, and the policy think tank Energy Innovation. The Atlas has been a year in the making, Ryan Panchadsaram of Speed & Scale told me. “We’ve had maybe close to 20 to 30 working sessions with 80 different contributors, all focused on the big question of what innovations are needed to decarbonize our economy.”
The website, which launched today, lays out 24 opportunity areas across buildings, manufacturing, transportation, food, agriculture and nature, electricity, and greenhouse gas removal. Diving into “buildings,” for example, one can then drill down into an opportunity area such as “sustainable construction and design,” which lists three innovation imperatives: creating new design tools to improve materials efficiency and carbon intensity, improving building insulation and self-cooling, and industrializing construction to make it faster and more modular.
Then there are the moonshots — 39 in total, and two for this opportunity in particular. The first is developing carbon-negative building coatings and surface materials, and the second is inventing low-carbon building materials that can outperform steel and cement. It’s these types of moonshots, Rinzler told me, where much of the “residual uncertainty” and thus “opportunity for surprise” lies.
Each core collaborator, Panchadsaram said, naturally came into this exercise with their own internal lists and ideas about what types of tech and basic research were needed most. The idea, he told me, was to share “an open source version of what we each had.”
As Dawn Lippert, founder and CEO of Elemental Impact, put it to me, the Atlas “can help accelerate any conversation.” Her firm meets with over 1,000 entrepreneurs per year, she explained, on top of numerous philanthropists trying to figure out where to direct their capital. The Atlas can serve as a one-stop shop to help them channel their efforts — and dollars — into the most investable and salient opportunities.
The same can be said for research priorities among university faculty, Charlotte Pera, the executive director of Stanford’s Sustainability Accelerator, told me. That then trickles down to help determine what classes, internships, and career paths students interested in the intersection of sustainability and technology ultimately choose.
The coalition members — and the project itself — speak to the prudence of this type of industry-wide level-setting amidst a chaotic political and economic environment. Referencing the accelerants Speed & Scale identifies as critical to achieving net-zero emissions — policy, grassroots and global movements, innovation, and investment — Panchadsaram told me that “when one is not performing in the way that you want, you have to lean in more into the others.”
These days, of course, it’s U.S. policy that’s falling short. “In this moment in time, at least domestically, innovation and investment is one that can start to fill in that gap,” he said.
This isn’t the first effort to meticulously map out where climate funding, innovation, and research efforts should be directed. Biden’s Department of Energy launched the Energy Earthshots Initiative, which laid out innovation goals and pathways to scale for emergent technologies such as clean hydrogen, long-duration energy storage, and floating offshore wind. But while it’s safe to say that Trump isn’t pursuing the coordinated funding and research that Earthshots intended to catalyze, the private sector has a long and enthusiastic history with strategic mapping.
Breakthrough Energy, for example, had already pinpointed what it calls the “Five Grand Challenges” in reaching net-zero emissions: electricity, transportation, manufacturing, buildings, and agriculture. It then measures the “green premium” of specific technologies — that is, the added cost of doing a thing cleanly — to pinpoint what to prioritize for near-term deployment and where more research and development funding should be directed. Breakthrough’s grand challenges closely mirror the sectors identified in the Atlas, which ultimately goes into far greater depth regarding specific subcategories.
Perhaps the pioneer of climate tech mapping is Kleiner Perkins, the storied venture capital firm, where Doerr was a longtime leader and currently serves as chairman; Panchadsaram is also an advisor there. During what investors often refer to as Clean Tech 1.0 — a boom-and-bust cycle that unfolded from roughly 2006 to 2012 — the firm created a “map of grand challenges.” While it appears to have no internet footprint today, in 2009, Bloomberg described it as a “chart of multicolored squares” tracking the firm’s investment across key climate technologies, with blank spots for tech with the potential to be viable — and investable — in the future.
Many of these opportunities failed to pay off, however. The 2008 financial crisis, the U.S. oil and natural gas boom, and slow development timelines for clean tech contributed to a number of high-profile failures, causing investors to sour on clean tech — a precedent the Atlas coalition would like to avoid.
These days, investors tend to tell me that Clean Tech 1.0 taught them to be realistic about long commercialization timelines for climate tech. Breakthrough Energy Ventures, for example, has funds with lengthy 20-year investment horizons. In a follow-up email, Rinzler also noted that even considering the current political landscape, “there’s a far more robust capital, corporate, and policy environment for climate tech than there was in the 2000s.” Now, he said, investors are more likely to consider the broader landscape across tech, finance, and policy when gauging whether a company can compete in the marketplace. And that often translates to a decreased reliance on government support.
“There are quite a few solutions that are embodied here that really don’t have an obligate dependence on policy in any way,” Rinzler told me. “You don’t have to care about climate to think that this is an amazing opportunity for an entrepreneur to come in and tackle a trillion-dollar industry with a pure profit incentive.”
The Atlas also seeks to offer a realistic perspective on its targets’ commercial maturity via a “Tech Category Index.” For example, the Atlas identifies seven technology categories relevant to the buildings sector: deconstruction, disposal and reuse, green materials, appliances, heating and cooling, smart buildings, and construction. While the first three are deemed “pilot” stage, the rest are “commercial.” More nascent technologies, such as fusion, as well as many carbon dioxide removal methods, are categorized as “lab” stage.
But the Atlas isn’t yet complete, its creators emphasized. Even now they’re contemplating ways to expand, based on what will provide the most value to the sector. “Is it more details on commercial status? Is it the companies that are working on it? Is it the researchers that are doing this in their lab?” Panchadsaram mused. “We are asking those questions right now.”
There’s even a form where citizen contributors can suggest new innovation imperatives and moonshots, or provide feedback on existing ones. “We do really hope that people, when they see this, collaborate on it, build on it, duplicate it, replicate it,” Panchadsaram told me. “This is truly a starting point.”
Zanskar’s second geothermal discovery is its first on untapped ground.
For the past five years or so, talk of geothermal energy has largely centered on “next-generation” or “enhanced” technologies, which make it possible to develop geothermal systems in areas without naturally occurring hot water reservoirs. But one geothermal exploration and development company, Zanskar, is betting that the scope and potential of conventional geothermal resources has been vastly underestimated — and that artificial intelligence holds the key to unlocking it.
Last year, Zanskar acquired an underperforming geothermal power plant in New Mexico. By combining exclusive data on the subsurface of the region with AI-driven analysis, the company identified a promising new drilling site, striking what has now become the most productive pumped geothermal well in the U.S. Today, the company is announcing its second reservoir discovery, this one at an undeveloped site in northern Nevada, which Zanskar is preparing to turn into a full-scale, 20-megawatt power plant by 2028.
“This is probably one of the biggest confirmed resources in geothermal in the last 10 years,” Zanskar’s cofounder and CEO Carl Hoiland told me. When we first connected back in August, he explained that since founding the company in 2019, he’s become increasingly convinced that conventional geothermal — which taps into naturally occurring reservoirs of hot water and steam — will be the linchpin of the industry’s growth. “We think the estimates of conventional potential that are now decades old just all need to be rewritten,” Hoiland told me. “This is a much larger opportunity than has been previously appreciated.”
The past decade has seen a lull in geothermal development in the U.S. as developers have found exploration costs prohibitively high, especially as solar and wind fall drastically in price. Most new projects have involved either the expansion of existing facilities or tapping areas with established resources, spurring geothermal startups such as Fervo Energy and Sage Geosystems to use next-generation technologies to unlock new areas for development.
But Hoiland told me that in many cases, conventional geothermal plants will prove to be the simplest, most cost-effective path to growth.
Zanskar’s new site, dubbed Pumpernickel, has long drawn interest from potential geothermal developers, given that it’s home to a cluster of hot springs. But while both oil and gas companies and the federal government have drilled exploratory wells there intermittently since the 1970s, none hit temperatures hot enough for the reservoirs to be deemed commercially viable.
Zanskar’s AI models, by contrast — trained on everything from decades-old geological and geophysical data sets to newer satellite and remote-sensing databases — indicated that Pumpernickel did indeed have adequately hot reservoirs, and showed where to drill for them. “We were able to take the prior data that was seen to be a failure, plug it into these models, and get not just the surface locations that we should drill from, but [the models] even helped us identify what angle and which direction to drill the well,” Hoiland told me.
That’s wildly different from the way geothermal exploration typically works, he explained. Traditionally, a geologist would arrive onsite with their own mental model of the subsurface and tell the team where to drill. “But there are millions of possible models, and there’s no way humans can model all of those fully and quantitatively,” Hoiland told me, hence the industry’s low success rate for exploratory wells. Zanskar can, though. By modeling all possible locations for geothermal reservoirs, the startup’s tools “create a probability distribution that allows you to make decisions with more confidence.”
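As a purely illustrative sketch of that idea — not Zanskar’s actual method — here is what scoring candidate drill sites against a distribution of subsurface models might look like; the sites, temperatures, and thresholds are all invented:

```python
# Illustrative Monte Carlo sketch: rather than trusting one "mental model,"
# sample many plausible subsurface models and score each candidate site by
# its probability of hitting a commercially hot reservoir.
# All sites, temperatures, and thresholds below are invented.
import random

COMMERCIAL_TEMP_C = 150.0  # hypothetical viability threshold
N_MODELS = 10_000          # number of sampled subsurface models

# Hypothetical candidates: name -> (mean predicted temp C, uncertainty std dev)
sites = {
    "site_A": (140.0, 25.0),
    "site_B": (155.0, 10.0),
    "site_C": (165.0, 40.0),
}

random.seed(0)
for name, (mean_t, std_t) in sites.items():
    hits = sum(
        random.gauss(mean_t, std_t) >= COMMERCIAL_TEMP_C
        for _ in range(N_MODELS)
    )
    print(f"{name}: P(hot enough) ~ {hits / N_MODELS:.2f}")
# Comparing means alone would favor site_C, but the full distribution shows
# site_B is the more confident bet because its uncertainty is smaller.
```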
To build these tools, Hoiland and his cofounder, Joel Edwards, both of whom have backgrounds in geology, tracked down and acquired long-forgotten analog data sets mapping the subsurface of regions that were never developed. They digitized these records and fed them into their AI model, which is also trained on fresh inputs from Zanskar’s own data collection team, a group the company launched three years ago. After adding all this information, the team realized that test wells had been drilled in only about 5% of the “geothermally prospective areas of the western U.S.,” leaving the startup with no shortage of additional sites to explore.
“It’s been nine years since a greenfield geothermal plant has been built in the U.S.,” Edwards told me, meaning one constructed on land with no prior geothermal development. “So the intent here is to restart that flywheel of developing greenfield geothermal again.” And while Zanskar would not confirm, Axios reported earlier this month that the company is now seeking to raise a $100 million Series C round to help accomplish this goal.
Next, Zanskar plans to test and develop sites where exploratory drilling has never even taken place, something the industry essentially stopped attempting decades ago. These hitherto unknown sites, Edwards said, are where he anticipates “most of the gigawatts” are going to come from in the future.
Hoiland credits all this to advances in AI, which he believes will allow geothermal “to become the cheapest form of energy on the planet.” Because “if you knew exactly where to drill today, it already would be.”
On EPA’s climate denial, virtual power plants, and Europe’s $50 billion climate reality
Current conditions: In the Atlantic, Tropical Storm Gabrielle is on track to intensify into a hurricane by the weekend, but it’s unlikely to affect the U.S. East Coast • Most of Vermont, New Hampshire, and Maine are under “severe” drought warning • Southeastern Nigeria is facing flooding.
The Federal Reserve announced Wednesday its first interest rate cut of the year, a quarter percentage point drop that aims to bring the federal funds rate down to between 4% and 4.25%. This may, Heatmap’s Matthew Zeitlin reported, “provide some relief to renewables developers and investors, who are especially sensitive to financing costs.” As Advait Arun, a climate and infrastructure analyst at the Center for Public Enterprise, told him: “high rates are never going to be exactly a good thing … it’s going to be good that we’re finally seeing cuts.”
Since solar and wind rely on basically free fuel, the bulk of developers’ costs to build panels or turbines comes upfront. That requires borrowing money, meaning interest rates have an outsize impact on the total cost of renewable projects, which carry more debt than fossil fuel plants. When interest rates rise by 2 percentage points, the levelized cost of electricity for renewables rises by 20%, compared with 11% for a gas-fired plant, according to a report last year by the energy consultancy Wood Mackenzie.
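A back-of-envelope sketch shows why the capital-heavy project is more rate-sensitive; the capex, fuel, and capacity-factor inputs below are invented for illustration, not Wood Mackenzie’s actual assumptions:

```python
# Toy LCOE comparison: a solar plant's cost is nearly all upfront capital,
# while a gas plant's includes ongoing fuel, so a rate hike inflates the
# solar plant's levelized cost by a larger fraction.
# All inputs are illustrative, not Wood Mackenzie's actual assumptions.

def crf(rate: float, years: int) -> float:
    """Capital recovery factor: annualizes an upfront cost at a given rate."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def lcoe(capex_per_kw: float, fuel_per_mwh: float,
         cap_factor: float, rate: float, years: int = 25) -> float:
    mwh_per_kw_yr = 8.76 * cap_factor  # 8,760 hours/yr, in MWh per kW
    capital = capex_per_kw * crf(rate, years) / mwh_per_kw_yr
    return capital + fuel_per_mwh      # $/MWh

for rate in (0.05, 0.07):  # a 2-percentage-point rate increase
    solar = lcoe(capex_per_kw=1100, fuel_per_mwh=0, cap_factor=0.25, rate=rate)
    gas = lcoe(capex_per_kw=900, fuel_per_mwh=30, cap_factor=0.55, rate=rate)
    print(f"rate={rate:.0%}: solar ~${solar:.0f}/MWh, gas ~${gas:.0f}/MWh")
# With these inputs, solar climbs from about $36 to $43/MWh (~21%), while
# gas rises from about $43 to $46/MWh (~6%): the same direction as the
# Wood Mackenzie figures, with the gap driven by capital intensity.
```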
The United States’ leading scientific advisory body issued what The New York Times called a “major report” on Wednesday detailing “the strongest evidence to date that carbon dioxide, methane, and other planet-warming greenhouse gases are threatening human health.” The study, published by the National Academies of Sciences, Engineering, and Medicine, stands athwart the Environmental Protection Agency’s proposal to revoke the endangerment finding. Established in 2009, the legal determination that planet-heating gases cause harm to human health means that the Clean Air Act can be used to underpin regulations on emissions. The Trump administration, however, has proposed rescinding the finding, insisting it could “cast significant doubt” on its accuracy.
“It’s more serious and more long term damage for them to try to rescind the underlying endangerment finding because depending on what the Supreme Court does with that, it could knock out a future administration from trying to bring it back,” Harvard Law School’s Jody Freeman told Heatmap’s Emily Pontecorvo in July. “Now that would be the nuclear option. That would be their best case scenario. I don’t think that’s likely, but it’s possible.”
It’s an unlikely scenario. But if all U.S. households built rooftop solar panels and batteries, and adopted efficient electric appliances, the country could offset all the growing demand from data centers. That’s according to a new report by the pro-electrification nonprofit Rewiring America. “Electrifying households is a direct path to meeting the growing power needs of hyperscale data centers while creating a more flexible, resilient, cost-effective grid for all,” Ari Matusiak, the chief executive of Rewiring America, said in a statement. “The household doesn’t have to be a passive energy consumer, at the whim of rising costs. Instead, it can be the hero and, with smart investment, the foundation of a more reliable and affordable energy future.”
With new gas plants, nuclear reactors, and geothermal stations in the works, the U.S. is nowhere close to following a maximalist vision of distributed resources. But the findings highlight how much additional power could be generated on residential rooftops across the U.S., power that, combined with virtual power plant software, could constitute a large new source of clean electricity.
A scorecard highlighting all the ways the virtual power plant industry has grown. Wood Mackenzie
That isn’t to say virtual power plants aren’t having something of a moment. New data from Wood Mackenzie found that virtual power plant capacity expanded 13.7% year over year, reaching 37.5 gigawatts. California, Texas, New York, and Massachusetts are the leading states, together representing 37% of all VPP deployments. The market last year “broadened more than it deepened,” the consultancy’s report found, with growth in deployments, offtakers, and policy support spurring more adoption. But the residential side remains modest: Residential deployments’ share of VPP wholesale market capacity rose to 10.2% from only about 8.8% last year, “still reflecting market barriers to small customers,” such as access to data and market rules.
“Utility program caps, capacity accreditation reforms, and market barriers have prevented capacity from growing as fast as market activity,” Ben Hertz-Shargel, global head of grid edge for Wood Mackenzie, said in a statement. He added that “while data centers are the source of new load, there’s an enormous opportunity to tap VPPs as the new source of grid flexibility.”
Record-breaking heat, droughts, fires, and floods cost the European economy at least 43 billion euros, or $50 billion, a new European Central Bank study found. The research, presented this week to European Union lawmakers, used a model based on weather data and estimates of historical impact of extreme weather on 1,160 different regions across the 27-nation bloc. “The true costs of extreme weather surface slowly because these events affect lives and livelihoods through a wide range of channels that extend beyond the initial impact,” Sehrish Usman, an assistant professor at the University of Mannheim who led the study with two economists from the European Central Bank, told The New York Times.
Secretary of Energy Chris Wright believes nuclear fusion plants will be pumping electricity onto grids no later than 2040. In an interview this week with the BBC while traveling in Europe, Wright said he expected the technology to be commercialized in as little as eight years. “With artificial intelligence and what’s going on at the national labs and private companies in the United States, we will have that approach about how to harness fusion energy multiple ways within the next five years,” Wright told the broadcaster. “The technology, it’ll be on the electric grid, you know, in eight to 15 years.” As Heatmap’s Katie Brigham put it recently, it’s “finally, possibly, almost time for fusion.”