It took the market about a week to catch up to the fact that the Chinese artificial intelligence firm DeepSeek had released an open-source AI model that rivaled those from prominent U.S. companies such as OpenAI and Anthropic — and that, most importantly, it had managed to do so far more cheaply and efficiently than its American competitors. The news cratered not only tech stocks such as Nvidia but energy stocks as well, suggesting that investors assumed more energy-efficient AI would reduce the sector’s overall energy demand.
But will it really? While some in the climate world assumed the same and celebrated the seemingly good news, many venture capitalists, AI proponents, and analysts quickly arrived at essentially the opposite conclusion — that cheaper AI will only lead to greater demand for AI. The resulting proliferation of the technology across a wide array of industries could thus negate the efficiency gains, ultimately leading to a substantial net increase in data center power demand.
“With cost destruction comes proliferation,” Susan Su, a climate investor at the venture capital firm Toba Capital, told me. “Plus the fact that it’s open source, I think, is a really, really big deal. It puts the power to expand and to deploy and to proliferate into billions of hands.”
If you’ve seen lots of chitchat about Jevons paradox of late, that’s basically what this line of thinking boils down to. After Microsoft’s CEO Satya Nadella responded to DeepSeek mania by posting the Wikipedia page for this 19th-century economic theory on X, many (myself included) got a quick crash course on its origins. The idea is that as Victorian-era technical advances made burning coal cheaper and more efficient, demand for — and thus consumption of — coal actually increased.
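The arithmetic behind the worry is simple: if demand grows faster than efficiency improves, total energy use rises. Here is a minimal sketch in Python, with every number invented purely for illustration:

```python
# A back-of-envelope sketch of the Jevons paradox applied to AI energy use.
# Every figure here is an invented assumption, purely for illustration.

def total_energy(wh_per_query, efficiency_gain, queries, demand_multiplier):
    """Total energy (Wh) after an efficiency gain spurs extra demand."""
    return (wh_per_query / efficiency_gain) * (queries * demand_multiplier)

before = total_energy(wh_per_query=1.0, efficiency_gain=1.0,
                      queries=1_000_000, demand_multiplier=1.0)
after = total_energy(wh_per_query=1.0, efficiency_gain=10.0,
                     queries=1_000_000, demand_multiplier=25.0)

print(f"before: {before:,.0f} Wh")  # 1,000,000 Wh
print(f"after:  {after:,.0f} Wh")   # 2,500,000 Wh -- per-query energy fell
                                    # 10x, yet total consumption still rose
```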
While this is a distinct possibility in the AI space, it’s by no means a guarantee. “This is very much, I think, an open question,” energy expert Nat Bullard told me of whether DeepSeek-type models will spur a reduction or an increase in energy demand. “I sort of lean in both directions at once.” Formerly the chief content officer at BloombergNEF and currently co-founder of the AI startup Halcyon, a search and information platform for energy professionals, Bullard is personally excited about the greater efficiency and optionality that new AI models can bring to his business.
But he warns that just because DeepSeek was cheap to train — the company claims it cost about $5.5 million, while U.S. models have cost hundreds of millions or even billions — doesn’t mean it’s cheap or energy-efficient to operate. “Training more efficiently does not necessarily mean that you can run it that much more efficiently,” Bullard told me. When a large language model answers a question or produces any other output, it’s said to be making an “inference.” As Bullard explained, “That may mean, as we move into an era of more and more inference and not just training, then the [energy] impacts could be rather muted.”
DeepSeek-R1, the model that caused the investor freakout, is also a newer type of LLM that uses more energy in general. Up until literally a few days ago, when OpenAI released o3-mini for free, most casual users were probably interacting with so-called “pretrained” AI models. Fed on gobs of internet text, these LLMs spit out answers based primarily on prediction and pattern recognition. DeepSeek released a model like this, called V3, in December. But last year, more advanced “reasoning” models, which can “think,” in some sense, started blowing up. These models — which include o3-mini, the latest version of Anthropic’s Claude, and the now infamous DeepSeek-R1 — can try out different strategies to arrive at the correct answer, recognize their mistakes, and improve their outputs, allowing for significant advances in areas such as math and coding.
But all that artificial reasoning eats up a lot of energy. As Sasha Luccioni, the AI and climate lead at Hugging Face, which makes an open-source platform for AI projects, wrote on LinkedIn, “To set things clear about DeepSeek + sustainability: (it seems that) training is much shorter/cheaper/more efficient than traditional LLMs, *but* inference is longer/more expensive/less efficient because of the chain of thought aspect.” Chain of thought refers to the reasoning process these newer models undertake. Luccioni wrote that she’s currently working to evaluate the energy efficiency of both the DeepSeek V3 and R1 models.
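To see why chain of thought changes the inference math, consider a rough sketch: a reasoning model generates thousands of hidden “thinking” tokens before its visible answer, and inference energy scales with tokens generated. The figures below are invented assumptions, not measurements of any actual model:

```python
# A rough illustration of Luccioni's point: inference energy scales with the
# number of tokens generated, and a chain-of-thought model emits many hidden
# "reasoning" tokens before its visible answer. All figures are invented.

WH_PER_TOKEN = 0.002  # assumed energy per generated token, in watt-hours

def response_energy(visible_tokens, reasoning_tokens=0):
    """Energy (Wh) to generate one response, hidden reasoning included."""
    return (visible_tokens + reasoning_tokens) * WH_PER_TOKEN

standard = response_energy(visible_tokens=300)
reasoning = response_energy(visible_tokens=300, reasoning_tokens=4_000)

print(f"standard LLM:  {standard:.1f} Wh per answer")
print(f"reasoning LLM: {reasoning:.1f} Wh per answer "
      f"({reasoning / standard:.0f}x more)")
```

Under these assumptions, a cheaper-to-train model can still consume an order of magnitude more energy per answer once it is reasoning at length.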
Another factor that could influence energy demand is how fast domestic companies respond to the DeepSeek breakthrough with their own new and improved models. Amy Francetic, co-founder at Buoyant Ventures, doesn’t think we’ll have to wait long. “One effect of DeepSeek is that it will highly motivate all of the large LLMs in the U.S. to go faster,” she told me. And because a lot of the big players are fundamentally constrained by energy availability, she’s crossing her fingers that this means they’ll work smarter, not harder. “Hopefully it causes them to find these similar efficiencies rather than just, you know, pouring more gasoline into a less fuel-efficient vehicle.”
In her recent Substack post, Su described three possible futures when it comes to AI’s role in the clean energy transition. The ideal is that AI demand scales slowly enough that nuclear and renewables scale with it. The least hopeful is that immediate, exponential growth in AI demand leads to a similar expansion of fossil fuels, locking in new dirty infrastructure for decades. “I think that's already been happening,” Su told me. And then there’s the techno-optimist scenario, linked to figures like Sam Altman, which Su doesn’t put much stock in — that AI “drives the energy revolution” by helping to create new energy technologies and efficiencies that more than offset the attendant increase in energy demand.
Which scenario predominates could also depend on whether greater efficiencies, combined with the adoption of AI by smaller, more shallow-pocketed companies, lead to a change in the scale of data centers. “There’s going to be a lot more people using AI. So maybe that means we don’t need these huge, gigawatt data centers. Maybe we need a lot more smaller, megawatt-size data centers,” Laura Katzman, a principal at Buoyant Ventures, told me. Katzman has conducted research for the firm on data center decarbonization.
Smaller data centers with correspondingly smaller energy footprints could pair well with renewable-powered microgrids, which are less practical and economically feasible for hyperscalers. That could be a big win for solar and wind plus battery storage, Katzman explained, but a blow to companies such as Microsoft, which has famously committed to reopening Pennsylvania’s Three Mile Island nuclear plant to power its data centers. “Because of DeepSeek, the expected price of compute probably doesn’t justify now turning back on some of these nuclear plants, or these other high-cost energy sources,” Katzman told me.
Lastly, it remains to be seen what nascent applications cheaper models will open up. “If somebody, say, in the Philippines or Vietnam has an interest in applying this to their own decarbonization challenge, what would they come up with?” Bullard pondered. “I don’t yet know what people would do with greater capability and lower costs and a different set of problems to solve for. And that’s really exciting to me.”
But even if the AI pessimists are right, and these newer models don’t make AI ubiquitously useful for applications from new drug discovery to easier regulatory filing, Su told me that in a certain sense, it doesn't matter much. “If there was a possibility that somebody had this type of power, and you could have it too, would you sit on the couch? Or would you arms race them? I think that is going to drive energy demand, irrespective of end utility.”
As Su told me, “I do not think there’s actually a saturation point for this.”
Elemental Impact, Breakthrough Energy, Speed & Scale, Stanford, Energy Innovation, and McKinsey are all partnering to form the “Climate Tech Atlas.”
The federal government has become an increasingly unreliable partner to climate tech innovators. Now venture capitalists, nonprofits, and academics are embracing a new plan to survive.
On Thursday, an interdisciplinary coalition — including Breakthrough Energy, McKinsey, and Stanford University’s Doerr School of Sustainability — unveiled the Climate Tech Atlas, a new plan to map out opportunities in the sector and define innovation imperatives critical to the energy transition.
The goal is to serve as a resource for stakeholders across the industry, drawing their focus toward the technological frontiers the alliance sees as the most viable pathways to economy-wide decarbonization. The idea is not to eliminate potential solutions, but rather “to enable the next generation of innovators, entrepreneurs, researchers, policymakers, and investors to really focus on where we felt there was the largest opportunity for exploration and for innovation to impact our path to net zero through the lens of technology,” Cooper Rinzler, a key collaborator on the initiative and a partner at the venture capital firm Breakthrough Energy Ventures, told me.
Other core contributors include the nonprofit investor Elemental Impact, John Doerr’s climate initiative Speed & Scale, and the policy think tank Energy Innovation. The Atlas has been a year in the making, Ryan Panchadsaram of Speed & Scale told me. “We’ve had maybe close to 20 to 30 working sessions with 80 different contributors, all focused on the big question of what innovations are needed to decarbonize our economy.”
The website, which launched today, lays out 24 opportunity areas across buildings, manufacturing, transportation, food, agriculture and nature, electricity, and greenhouse gas removal. Diving into “buildings,” for example, one can then drill down into an opportunity area such as “sustainable construction and design,” which lists three innovation imperatives: creating new design tools to improve materials efficiency and carbon intensity, improving building insulation and self-cooling, and industrializing construction to make it faster and more modular.
Then there are the moonshots — 39 in total, and two for this opportunity in particular. The first is developing carbon-negative building coatings and surface materials, and the second is inventing low-carbon building materials that can outperform steel and cement. It’s these types of moonshots, Rinzler told me, where much of the “residual uncertainty” and thus “opportunity for surprise” lies.
Each core collaborator, Panchadsaram said, naturally came into this exercise with their own internal lists and ideas about what types of tech and basic research were needed most. The idea, he told me, was to share “an open source version of what we each had.”
As Dawn Lippert, founder and CEO of Elemental Impact, put it to me, the Atlas “can help accelerate any conversation.” Her firm meets with over 1,000 entrepreneurs per year, she explained, on top of numerous philanthropists trying to figure out where to direct their capital. The Atlas can serve as a one-stop shop to help them channel their efforts — and dollars — into the most investable and salient opportunities.
The same can be said for research priorities among university faculty, Charlotte Pera, the executive director of Stanford’s Sustainability Accelerator, told me. That then trickles down to help determine what classes, internships, and career paths students interested in the intersection of sustainability and technology ultimately choose.
The coalition members — and the project itself — speak to the prudence of this type of industry-wide level-setting amidst a chaotic political and economic environment. Referencing the accelerants Speed & Scale identifies as critical to achieving net-zero emissions — policy, grassroots and global movements, innovation, and investment — Panchadsaram told me that “when one is not performing in the way that you want, you have to lean in more into the others.”
These days, of course, it’s U.S. policy that’s falling short. “In this moment in time, at least domestically, innovation and investment is one that can start to fill in that gap,” he said.
This isn’t the first effort to meticulously map out where climate funding, innovation, and research efforts should be directed. Biden’s Department of Energy launched the Earthshots Initiative, which laid out innovation goals and pathways to scale for emergent technologies such as clean hydrogen, long-duration energy storage, and floating offshore wind. But while it’s safe to say that Trump isn’t pursuing the coordinated funding and research that Earthshots intended to catalyze, the private sector has a long and enthusiastic history with strategic mapping.
Breakthrough Energy, for example, had already pinpointed what it calls the “Five Grand Challenges” in reaching net-zero emissions: electricity, transportation, manufacturing, buildings, and agriculture. It then measures the “green premium” of specific technologies — that is, the added cost of doing a thing cleanly — to pinpoint what to prioritize for near-term deployment and where more research and development funding should be directed. Breakthrough's grand challenges closely mirror the sectors identified in the Atlas, which ultimately goes into far greater depth regarding specific subcategories.
Perhaps the pioneer of climate tech mapping is Kleiner Perkins, the storied venture capital firm, where Doerr was a longtime leader and currently serves as chairman; Panchadsaram is also an advisor there. During what investors often refer to as Clean Tech 1.0 — a boom-and-bust cycle that unfolded from roughly 2006 to 2012 — the firm created a “map of grand challenges.” While it appears to have no internet footprint today, in 2009, Bloomberg described it as a “chart of multicolored squares” tracking the firm’s investment across key climate technologies, with blank spots for tech with the potential to be viable — and investable — in the future.
Many of these opportunities failed to pay off, however. The 2008 financial crisis, the U.S. oil and natural gas boom, and slow development timelines for clean tech contributed to a number of high-profile failures, causing investors to sour on clean tech — a precedent the Atlas coalition would like to avoid.
These days, investors tend to tell me that Clean Tech 1.0 taught them to be realistic about long commercialization timelines for climate tech. Breakthrough Energy Ventures, for example, has funds with lengthy 20-year investment horizons. In a follow-up email, Rinzler also noted that even considering the current political landscape, “there’s a far more robust capital, corporate, and policy environment for climate tech than there was in the 2000s.” Now, he said, investors are more likely to consider the broader landscape across tech, finance, and policy when gauging whether a company can compete in the marketplace. And that often translates to a decreased reliance on government support.
“There are quite a few solutions that are embodied here that really don’t have an obligate dependence on policy in any way,” Rinzler told me. “You don’t have to care about climate to think that this is an amazing opportunity for an entrepreneur to come in and tackle a trillion-dollar industry with a pure profit incentive.”
The Atlas also seeks to offer a realistic perspective on its targets’ commercial maturity via a “Tech Category Index.” For example, the Atlas identifies seven technology categories relevant to the buildings sector: deconstruction, disposal and reuse, green materials, appliances, heating and cooling, smart buildings, and construction. While the first three are deemed “pilot” stage, the rest are “commercial.” More nascent technologies, such as fusion and many carbon dioxide removal methods, are categorized as “lab” stage.
But the Atlas isn’t yet complete, its creators emphasized. Even now they’re contemplating ways to expand, based on what will provide the most value to the sector. “Is it more details on commercial status? Is it the companies that are working on it? Is it the researchers that are doing this in their lab?” Panchadsaram mused. “We are asking those questions right now.”
There’s even a form where citizen contributors can suggest new innovation imperatives and moonshots, or provide feedback on existing ones. “We do really hope that people, when they see this, collaborate on it, build on it, duplicate it, replicate it,” Panchadsaram told me. “This is truly a starting point.”
Zanskar’s second geothermal discovery is its first on untapped ground.
For the past five years or so, talk of geothermal energy has largely centered on “next-generation” or “enhanced” technologies, which make it possible to develop geothermal systems in areas without naturally occurring hot water reservoirs. But one geothermal exploration and development company, Zanskar, is betting that the scope and potential of conventional geothermal resources has been vastly underestimated — and that artificial intelligence holds the key to unlocking it.
Last year, Zanskar acquired an underperforming geothermal power plant in New Mexico. By combining exclusive data on the subsurface of the region with AI-driven analysis, the company identified a promising new drilling site, striking what has now become the most productive pumped geothermal well in the U.S. Today, the company is announcing its second reservoir discovery, this one at an undeveloped site in northern Nevada, which Zanskar is preparing to turn into a full-scale, 20-megawatt power plant by 2028.
“This is probably one of the biggest confirmed resources in geothermal in the last 10 years,” Zanskar’s cofounder and CEO Carl Hoiland told me. When we first connected back in August, he explained that since founding the company in 2019, he’s become increasingly convinced that conventional geothermal — which taps into naturally occurring reservoirs of hot water and steam — will be the linchpin of the industry’s growth. “We think the estimates of conventional potential that are now decades old just all need to be rewritten,” Hoiland told me. “This is a much larger opportunity than has been previously appreciated.”
The past decade has seen a lull in geothermal development in the U.S., as developers have found exploration costs prohibitively high, especially with solar and wind prices falling drastically. Most new projects have involved either expanding existing facilities or tapping areas with established resources, spurring geothermal startups such as Fervo Energy and Sage Geosystems to use next-generation technologies to unlock new areas for development.
But Hoiland told me that in many cases, conventional geothermal plants will prove to be the simplest, most cost-effective path to growth.
Zanskar’s new site, dubbed Pumpernickel, has long drawn interest from potential geothermal developers given that it’s home to a cluster of hot springs. But while both oil and gas companies and the federal government have drilled exploratory wells here intermittently since the 1970s, none hit hot enough temperatures for the reservoirs to be deemed commercially viable.
But Zanskar’s AI models — trained on everything from decades-old geological and geophysical data sets to newer satellite and remote sensing databases — indicated that Pumpernickel did indeed have adequately hot reservoirs, and showed where to drill for them. “We were able to take the prior data that was seen to be a failure, plug it into these models, and get not just the surface locations that we should drill from, but [the models] even helped us identify what angle and which direction to drill the well,” Hoiland told me.
That’s wildly different from the way geothermal exploration typically works, he explained. Traditionally, a geologist would arrive onsite with their own mental model of the subsurface and tell the team where to drill. “But there are millions of possible models, and there’s no way humans can model all of those fully and quantitatively,” Hoiland told me, hence the industry’s low success rate for exploratory wells. Zanskar can, though. By modeling all possible locations for geothermal reservoirs, the startup’s tools “create a probability distribution that allows you to make decisions with more confidence.”
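In spirit, the approach resembles ensemble modeling: score many candidate subsurface models against the legacy data, then rank drill targets by how often they come up hot. Here is a toy sketch of that idea — the sites and weights are invented, and this illustrates the concept rather than Zanskar’s actual method:

```python
# A toy version of the ensemble idea: sample many plausible subsurface models,
# weighted by how well each fits the legacy well data, then rank candidate
# drill sites by how often they host the hot reservoir. Sites and weights are
# invented; this is an illustration, not Zanskar's method.
import random

random.seed(0)
N_MODELS = 100_000
SITES = ["A", "B", "C", "D"]
# Relative plausibility of "site X hosts the hot reservoir" (None = no hit),
# standing in for a real fit-to-data score.
WEIGHTS = [5, 20, 60, 10, 5]

hits = {site: 0 for site in SITES}
for _ in range(N_MODELS):
    hot_site = random.choices(SITES + [None], weights=WEIGHTS)[0]
    if hot_site is not None:
        hits[hot_site] += 1

for site, count in sorted(hits.items(), key=lambda kv: -kv[1]):
    print(f"site {site}: P(hot) ~= {count / N_MODELS:.2f}")
# The output is a probability distribution over drill targets rather than a
# single geologist's best guess.
```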
To build these tools, Hoiland and his cofounder, Joel Edwards, both of whom have backgrounds in geology, tracked down and acquired long-forgotten analog data sets mapping the subsurface of regions that were never developed. They digitized these records and fed them into their AI model, which is also trained on fresh inputs from Zanskar’s own data collection team, a group the company launched three years ago. After adding all this information, the team realized that test wells had been drilled in only about 5% of the “geothermally prospective areas of the western U.S.,” leaving the startup with no shortage of additional sites to explore.
“It’s been nine years since a greenfield geothermal plant has been built in the U.S.,” Edwards told me, meaning one constructed on land with no prior geothermal development. “So the intent here is to restart that flywheel of developing greenfield geothermal again.” And while Zanskar would not confirm, Axios reported earlier this month that the company is now seeking to raise a $100 million Series C round to help accomplish this goal.
In the future, Zanskar plans to test and develop sites where exploratory drilling has never even taken place, something the industry essentially stopped attempting decades ago. But these hitherto unknown sites, Edwards said, are where he anticipates “most of the gigawatts” will come from.
Hoiland credits all this to advances in AI, which he believes will allow geothermal “to become the cheapest form of energy on the planet.” Because, as he put it, “if you knew exactly where to drill today, it already would be.”
On EPA’s climate denial, virtual power plants, and Europe’s $50 billion climate reality
Current conditions: In the Atlantic, Tropical Storm Gabrielle is on track to intensify into a hurricane by the weekend, but it’s unlikely to affect the U.S. East Coast • Most of Vermont, New Hampshire, and Maine are under “severe” drought warning • Southeastern Nigeria is facing flooding.
The Federal Reserve announced Wednesday its first interest rate cut of the year, a quarter percentage point drop that aims to bring the federal funds rate down to between 4% and 4.25%. This may, Heatmap’s Matthew Zeitlin reported, “provide some relief to renewables developers and investors, who are especially sensitive to financing costs.” As Advait Arun, a climate and infrastructure analyst at the Center for Public Enterprise, told him: “high rates are never going to be exactly a good thing … it’s going to be good that we’re finally seeing cuts.”
Since solar and wind rely on essentially free fuel, the bulk of developers’ costs to build panels or turbines come upfront. That requires borrowing money, meaning interest rates have an outsize impact on the total cost of renewable projects, and renewables carry more debt than fossil fuel plants. When interest rates rise by 2 percentage points, the levelized cost of electricity for renewables rises by 20%, compared to 11% for a gas-fired plant, according to a report last year by the energy consultancy Wood Mackenzie.
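The mechanism is easy to see with the standard capital recovery factor, which annualizes an upfront cost over a plant’s lifetime. A minimal sketch follows, using invented plant figures rather than Wood Mackenzie’s actual model inputs:

```python
# A minimal sketch of why rate hikes hit renewables harder, using the standard
# capital recovery factor (CRF). Plant figures below are invented assumptions,
# not Wood Mackenzie's model inputs.

def crf(rate, years):
    """Capital recovery factor: annualizes an upfront cost over a lifetime."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def lcoe(capex_per_kw, capacity_factor, fuel_per_mwh, rate, years=30):
    """Levelized cost of electricity, $/MWh (capex financing + fuel only)."""
    annual_mwh = 8.76 * capacity_factor  # MWh generated per kW per year
    return capex_per_kw * crf(rate, years) / annual_mwh + fuel_per_mwh

for rate in (0.05, 0.07):  # a 2-percentage-point rate increase
    solar = lcoe(capex_per_kw=1100, capacity_factor=0.25, fuel_per_mwh=0,
                 rate=rate)
    gas = lcoe(capex_per_kw=1200, capacity_factor=0.55, fuel_per_mwh=25,
               rate=rate)
    print(f"rate {rate:.0%}: solar ${solar:.0f}/MWh, gas ${gas:.0f}/MWh")
# Solar's LCOE is almost entirely financed capital, so it rises far more in
# percentage terms than the fuel-dominated gas plant's.
```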
The United States’ leading scientific advisory body issued what The New York Times called a “major report” on Wednesday detailing “the strongest evidence to date that carbon dioxide, methane, and other planet-warming greenhouse gases are threatening human health.” The study, published by the National Academies of Sciences, Engineering, and Medicine, stands athwart the Environmental Protection Agency’s proposal to revoke the endangerment finding. That legal determination, established in 2009, holds that planet-heating gases harm human health, and it allows the Clean Air Act to be used to underpin regulations on emissions. But the Trump administration has proposed rescinding the finding, insisting it could “cast significant doubt” on its accuracy.
“It’s more serious and more long term damage for them to try to rescind the underlying endangerment finding because depending on what the Supreme Court does with that, it could knock out a future administration from trying to bring it back,” Harvard Law School’s Jody Freeman told Heatmap’s Emily Pontecorvo in July. “Now that would be the nuclear option. That would be their best case scenario. I don’t think that’s likely, but it’s possible.”
It’s an unlikely scenario. But if all U.S. households built rooftop solar panels and batteries, and adopted efficient electric appliances, the country could offset all the growing demand from data centers. That’s according to a new report by the pro-electrification nonprofit Rewiring America. “Electrifying households is a direct path to meeting the growing power needs of hyperscale data centers while creating a more flexible, resilient, cost-effective grid for all,” Ari Matusiak, the chief executive of Rewiring America, said in a statement. “The household doesn’t have to be a passive energy consumer, at the whim of rising costs. Instead, it can be the hero and, with smart investment, the foundation of a more reliable and affordable energy future.”
With new gas plants, nuclear reactors, and geothermal stations in the works, the U.S. is nowhere close to following a maximalist vision of distributed resources. But the findings highlight how much additional power could be generated on residential rooftops across the U.S. that, if combined with virtual power plant software, could comprise a large new source of clean electricity.
A scorecard highlighting all the ways the virtual power plant industry has grown. Wood Mackenzie
That isn’t to say virtual power plants aren’t having something of a moment. New data from Wood Mackenzie found that virtual power plant capacity expanded 13.7% year over year to reach 37.5 gigawatts. California, Texas, New York, and Massachusetts are the leading states, together representing 37% of all VPP deployments. The market last year “broadened more than it deepened,” the consultancy’s report found, with growth in deployments, offtakers, and policy support spurring more adoption. But the residential side remains modest: residential resources’ share of the VPP wholesale market’s capacity increased to 10.2% from only about 8.8% last year, “still reflecting market barriers to small customers,” such as access to data and market rules.
“Utility program caps, capacity accreditation reforms, and market barriers have prevented capacity from growing as fast as market activity,” Ben Hertz-Shargel, global head of grid edge for Wood Mackenzie, said in a statement. He added that “while data centers are the source of new load, there’s an enormous opportunity to tap VPPs as the new source of grid flexibility.”
Record-breaking heat, droughts, fires, and floods cost the European economy at least 43 billion euros, or $50 billion, a new European Central Bank study found. The research, presented this week to European Union lawmakers, used a model based on weather data and estimates of the historical impact of extreme weather on 1,160 regions across the 27-nation bloc. “The true costs of extreme weather surface slowly because these events affect lives and livelihoods through a wide range of channels that extend beyond the initial impact,” Sehrish Usman, an assistant professor at the University of Mannheim who led the study with two economists from the European Central Bank, told The New York Times.
Secretary of Energy Chris Wright believes nuclear fusion plants will be pumping electricity onto grids no later than 2040. In an interview with the BBC this week while traveling in Europe, Wright said he expected the technology to be commercialized in as little as eight years. “With artificial intelligence and what’s going on at the national labs and private companies in the United States, we will have that approach about how to harness fusion energy multiple ways within the next five years,” Wright told the broadcaster. “The technology, it’ll be on the electric grid, you know, in eight to 15 years.” As Heatmap’s Katie Brigham put it recently, it’s “finally, possibly, almost time for fusion.”