Or at least the team at Emerald AI is going to try.
Everyone’s worried about the ravenous energy needs of AI data centers, which the International Energy Agency projects will help catalyze nearly 4% growth in global electricity demand this year and next, hitting the U.S. power sector particularly hard. On Monday, the Department of Energy released a report adding fuel to that fire, warning that blackouts in the U.S. could become 100 times more common by 2030 in large part due to data centers for AI.
The report stirred controversy among clean energy advocates, who cast doubt on that topline number and thus the paper’s justification for a significant fossil fuel buildout. But no matter how the AI revolution is powered, there’s widespread agreement that it’s going to require major infrastructure development of some form or another.
Not so fast, says Emerald AI, which emerged from stealth last week with $24.5 million in seed funding led by Radical Ventures along with a slew of other big name backers, including Nvidia’s venture arm as well as former Secretary of State John Kerry, Google’s chief scientist Jeff Dean, and Kleiner Perkins chair John Doerr. The startup, founded and led by Orsted’s former chief strategy and innovation officer Varun Sivaram, was built to turn data centers from “grid liabilities into flexible assets” by slowing, pausing, or redirecting AI workloads during times of peak energy demand.
Research shows this type of data center load flexibility could unleash nearly 100 gigawatts of grid capacity — the equivalent of four or five Project Stargates and enough to power about 83 million U.S. homes for a year. Such adjustments, Sivaram told me, would be necessary for only about 0.5% of a data center’s total operating time, a fraction so small that he says any resulting training or operating performance dips for AI models are essentially negligible.
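Those figures are easy to sanity-check with back-of-the-envelope arithmetic. Here is a quick sketch; the average-household consumption number is my own rough assumption (roughly the EIA's figure of about 10,700 kilowatt-hours per year), not something from the research Sivaram cites.

```python
# Back-of-envelope check of the figures above (illustrative assumptions,
# not the researchers' actual methodology).

HOURS_PER_YEAR = 8_760            # hours in a non-leap year
FLEX_FRACTION = 0.005             # ~0.5% of operating time, per Sivaram
AVG_HOME_KWH_PER_YEAR = 10_700    # assumed average U.S. household use (roughly the EIA figure)

flex_hours = HOURS_PER_YEAR * FLEX_FRACTION
print(f"0.5% of a year ≈ {flex_hours:.0f} hours of slowed or shifted workloads")

freed_capacity_gw = 100           # ~100 GW of flexible data center load
avg_home_demand_kw = AVG_HOME_KWH_PER_YEAR / HOURS_PER_YEAR
homes_equivalent = freed_capacity_gw * 1e6 / avg_home_demand_kw   # GW -> kW
print(f"{freed_capacity_gw} GW ≈ {homes_equivalent / 1e6:.0f} million average homes")
```

With these assumptions, 0.5% of a year works out to roughly 44 hours of adjustments, and 100 gigawatts covers the average draw of about 82 million homes, in line with the figures cited above.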
As impressive as that hypothetical potential is, whether a software product can actually reduce the pressures facing the grid is a high stakes question. The U.S. urgently needs enough energy to serve that data center growth, both to ensure its economic competitiveness and to keep electricity bills affordable for Americans. If an algorithm could help alleviate even some of the urgency of an unprecedented buildout of power plants and transmission infrastructure, well, that’d be a big deal.
While Emerald AI will by no means negate the need to expand and upgrade our energy system, Sivaram told me, the software alone “materially changes the build out needs to meet massive demand expansion.” “It unleashes energy abundance using our existing system,” he said.
Grand as that sounds, the fundamental idea is nothing new. It’s the same concept as a virtual power plant, which coordinates distributed energy resources such as rooftop solar panels, smart thermostats, and electric vehicles to ramp energy supply either up or down in accordance with the grid’s needs.
Adoption of VPPs has lagged far behind their technical potential, however. That’s due to a whole host of policy, regulatory, and market barriers such as a lack of state and utility-level rules around payment structures, insufficient participation incentives for customers and utilities, and limited access to wholesale electricity markets. These programs also depend on widespread customer opt-in to make a real impact on the grid.
“It’s really hard to aggregate enough Nest thermostats to make any kind of dent,” Sivaram told me. Data centers are different, he said, simply because “they’re enormous, they’re a small city.” They’re also, by nature, virtually controllable and often already interconnected if they’re owned by the same company. Sivaram thinks the potential of flexible data center loads is so promising and the assets themselves so valuable that governments and utilities will opt to organize “bespoke arrangements for data centers to provide their services.”
Sivaram told me he’s also optimistic that utilities will offer data center operators with flexible loads the option to skip the ever-growing interconnection queue, helping hyperscalers get online and turn a profit more quickly.
The potential to jump the queue is not something that utilities have formally advertised as an option, though there appears to be growing interest in the idea. An incentive like this will be core to making Emerald AI’s business case work, transmission advocate and president of Grid Strategies Rob Gramlich told me.
Data center developers are spending billions every year on the semiconductor chips powering their AI models, so the typical demand response value proposition — earn a small sum by turning off appliances when the grid is strained — doesn’t apply here. “There’s just not anywhere near enough money in that for a hyperscaler to say, Oh yeah, I’m gonna not run my Nvidia chips for a while to make $200 a megawatt hour. That’s peanuts compared to the bazillions [they] just spent,” Gramlich explained.
For Emerald AI to make a real dent in energy supply and blunt the need for an immediate and enormous grid buildout, a significant number of data center operators will have to adopt the platform. That’s where the partnership with Nvidia comes in handy, Sivaram told me, as the startup is “working with them on the reference architecture” for future AI data centers. “The goal is for all [data centers] to be potentially flexible in the future because there will be a standard reference design,” Sivaram said.
Whether or not data centers will go all in on Nvidia’s design remains to be seen, of course. Hyperscalers have not typically thought of data centers as a flexible asset. Right now, Gramlich said, most are still in the mindset that they need to be operating all 8,760 hours of the year to reach their performance targets.
“Two or three years ago, when we first noticed the surge in AI-driven demand, I talked to every hyperscaler about how flexible they thought they could be, because it seemed intuitive that machine learning might be more flexible than search and streaming,” Gramlich told me. By and large, the response was that while these companies might be interested in exploring flexibility “potentially, maybe, someday,” they were mostly focused on their mandate to bring huge numbers of gigawatts online, with little time to explore new data center models.
“Even the ones that are talking about flexibility now, in terms of what they’re actually doing in the market today, they all are demanding 8,760 [hours of operation per year],” Gramlich told me.
Emerald AI is well aware that its business depends on proving to hyperscalers that a degree of flexibility won’t materially impact their operations. Last week, the startup released the results of a pilot demonstration that it ran at an Oracle data center in Phoenix, which proved it was able to reduce power consumption by 25% for three hours during a period of grid stress while still “assuring acceptable customer performance for AI workloads.”
It achieved this by categorizing specific AI tasks — think everything from model training and fine-tuning to conversations with chatbots — from high to low priority, indicating the degree to which operations could be slowed while still meeting Oracle’s performance targets. Now, Emerald AI is planning additional, larger-scale demonstrations to showcase its capacity to handle more complex scenarios, such as responding to unexpected grid emergencies.
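To make the mechanics concrete, here is a minimal, hypothetical sketch of what priority-tiered throttling during a grid event could look like. The tiers, scaling factors, and workload names are my own illustrative assumptions, not Emerald AI's actual software.

```python
# Hypothetical priority-tiered throttling during a grid event. The tiers,
# factors, and power figures are illustrative assumptions only.
from dataclasses import dataclass

# Fraction of normal power each priority tier keeps during a grid event.
THROTTLE_FACTORS = {"high": 1.0, "medium": 0.7, "low": 0.3}

@dataclass
class Workload:
    name: str
    priority: str     # "high", "medium", or "low"
    power_mw: float   # power draw at full speed

def curtailed_power(workloads: list[Workload]) -> float:
    """Total draw once lower-priority jobs are slowed or paused."""
    return sum(w.power_mw * THROTTLE_FACTORS[w.priority] for w in workloads)

fleet = [
    Workload("chatbot inference", "high", 50.0),    # latency-sensitive: untouched
    Workload("model fine-tuning", "medium", 25.0),  # slowed
    Workload("batch training run", "low", 25.0),    # paused or deferred
]

baseline = sum(w.power_mw for w in fleet)
during_event = curtailed_power(fleet)
print(f"{baseline:.0f} MW -> {during_event:.0f} MW "
      f"({100 * (1 - during_event / baseline):.0f}% reduction)")
```

With these made-up numbers, slowing the medium tier and pausing the low tier cuts the site’s draw by a quarter, comparable to the 25% reduction the Phoenix pilot reported.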
As transmission planners and hyperscalers alike wait to see more proof validating Emerald AI’s vision of the future, Sivaram is careful to note that his company is not advocating for a halt to energy system expansion. In an increasingly electrified economy, expanding and upgrading the grid will be essential — even if every data center in the world has a flexible load profile.
“We should be building a nationwide transmission system. We should be building out generation. We should be doing grid modernization with grid enhancing technologies,” Sivaram told me. “We just don’t need to overdo it. We don’t need the particularly massive projections that you’re seeing that are going to cause your grandmother’s electricity rates to spike. We can avoid that.”
Elemental Impact, Breakthrough Energy, Speed & Scale, Stanford, Energy Innovation, and McKinsey are all partnering to form the “Climate Tech Atlas.”
The federal government has become an increasingly unreliable partner to climate tech innovators. Now venture capitalists, nonprofits, and academics are embracing a new plan to survive.
On Thursday, an interdisciplinary coalition — including Breakthrough Energy, McKinsey, and Stanford University’s Doerr School of Sustainability — unveiled the Climate Tech Atlas, a new plan to map out opportunities in the sector and define innovation imperatives critical to the energy transition.
The goal is to serve as a resource for stakeholders across the industry, drawing their focus toward the technological frontiers the alliance sees as the most viable pathways to economy-wide decarbonization. The idea is not to eliminate potential solutions, but rather “to enable the next generation of innovators, entrepreneurs, researchers, policymakers, and investors to really focus on where we felt there was the largest opportunity for exploration and for innovation to impact our path to net zero through the lens of technology,” Cooper Rinzler, a key collaborator on the initiative and a partner at the venture capital firm Breakthrough Energy Ventures, told me.
Other core contributors include the nonprofit investor Elemental Impact, John Doerr’s climate initiative Speed & Scale, and the policy think tank Energy Innovation. The Atlas has been a year in the making, Ryan Panchadsaram of Speed & Scale told me. “We’ve had maybe close to 20 to 30 working sessions with 80 different contributors, all focused on the big question of what innovations are needed to decarbonize our economy.”
The website, which launched today, lays out 24 opportunity areas across buildings, manufacturing, transportation, food, agriculture and nature, electricity, and greenhouse gas removal. Diving into “buildings,” for example, one can then drill down into an opportunity area such as “sustainable construction and design,” which lists three innovation imperatives: creating new design tools to improve materials efficiency and carbon intensity, improving building insulation and self-cooling, and industrializing construction to make it faster and more modular.
Then there are the moonshots — 39 in total, and two for this opportunity in particular. The first is developing carbon-negative building coatings and surface materials, and the second is inventing low-carbon building materials that can outperform steel and cement. It’s these types of moonshots, Rinzler told me, where much of the “residual uncertainty” and thus “opportunity for surprise” lies.
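For readers who think in data structures, the hierarchy described above (sector, opportunity area, innovation imperatives, moonshots) can be sketched roughly as nested lists; the layout below is my own illustration built from the example above, not the Atlas’s actual data model.

```python
# Rough sketch of the Atlas hierarchy using the "sustainable construction and
# design" example above; the nesting is an illustration, not the Atlas's
# actual data model.
atlas_excerpt = {
    "buildings": {
        "sustainable construction and design": {
            "innovation_imperatives": [
                "new design tools to improve materials efficiency and carbon intensity",
                "better building insulation and self-cooling",
                "industrialized construction that is faster and more modular",
            ],
            "moonshots": [
                "carbon-negative building coatings and surface materials",
                "low-carbon building materials that outperform steel and cement",
            ],
        },
        # ...the remaining opportunity areas (24 in total) sit here and under
        # the other sectors, with 39 moonshots spread across them
    },
}
```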
Each core collaborator, Panchadsaram said, naturally came into this exercise with their own internal lists and ideas about what types of tech and basic research were needed most. The idea, he told me, was to share “an open source version of what we each had.”
As Dawn Lippert, founder and CEO of Elemental Impact, put it to me, the Atlas “can help accelerate any conversation.” Her firm meets with over 1,000 entrepreneurs per year, she explained, on top of numerous philanthropists trying to figure out where to direct their capital. The Atlas can serve as a one-stop-shop to help them channel their efforts — and dollars — into the most investable and salient opportunities.
The same can be said for research priorities among university faculty, Charlotte Pera, the executive director of Stanford’s Sustainability Accelerator, told me. That then trickles down to help determine what classes, internships, and career paths students interested in the intersection of sustainability and technology ultimately choose.
The coalition members — and the project itself — speak to the prudence of this type of industry-wide level-setting amidst a chaotic political and economic environment. Referencing the accelerants Speed & Scale identifies as critical to achieving net-zero emissions — policy, grassroots and global movements, innovation, and investment — Panchadsaram told me that “when one is not performing in the way that you want, you have to lean in more into the others.”
These days, of course, it’s U.S. policy that’s falling short. “In this moment in time, at least domestically, innovation and investment is one that can start to fill in that gap,” he said.
This isn’t the first effort to meticulously map out where climate funding, innovation, and research efforts should be directed. Biden’s Department of Energy launched the Earthshots Initiative, which laid out innovation goals and pathways to scale for emergent technologies such as clean hydrogen, long-duration energy storage, and floating offshore wind. But while it’s safe to say that Trump isn’t pursuing the coordinated funding and research that Earthshots intended to catalyze, the private sector has a long and enthusiastic history with strategic mapping.
Breakthrough Energy, for example, had already pinpointed what it calls the “Five Grand Challenges” in reaching net-zero emissions: electricity, transportation, manufacturing, buildings, and agriculture. It then measures the “green premium” of specific technologies — that is, the added cost of doing a thing cleanly — to pinpoint what to prioritize for near-term deployment and where more research and development funding should be directed. Breakthrough's grand challenges closely mirror the sectors identified in the Atlas, which ultimately goes into far greater depth regarding specific subcategories.
Perhaps the pioneer of climate tech mapping is Kleiner Perkins, the storied venture capital firm, where Doerr was a longtime leader and currently serves as chairman; Panchadsaram is also an advisor there. During what investors often refer to as Clean Tech 1.0 — a boom-and-bust cycle that unfolded from roughly 2006 to 2012 — the firm created a “map of grand challenges.” While it appears to have no internet footprint today, in 2009, Bloomberg described it as a “chart of multicolored squares” tracking the firm’s investment across key climate technologies, with blank spots for tech with the potential to be viable — and investable — in the future.
Many of these opportunities failed to pay off, however. The 2008 financial crisis, the U.S. oil and natural gas boom, and slow development timelines for clean tech contributed to a number of high-profile failures, causing investors to sour on clean tech — a precedent the Atlas coalition would like to avoid.
These days, investors tend to tell me that Clean Tech 1.0 taught them to be realistic about long commercialization timelines for climate tech. Breakthrough Energy Ventures, for example, has funds with lengthy 20-year investment horizons. In a follow-up email, Rinzler also noted that even considering the current political landscape, “there’s a far more robust capital, corporate, and policy environment for climate tech than there was in the 2000s.” Now, he said, investors are more likely to consider the broader landscape across tech, finance, and policy when gauging whether a company can compete in the marketplace. And that often translates to a decreased reliance on government support.
“There are quite a few solutions that are embodied here that really don’t have an obligate dependence on policy in any way,” Rinzler told me. “You don’t have to care about climate to think that this is an amazing opportunity for an entrepreneur to come in and tackle a trillion-dollar industry with a pure profit incentive.”
The Atlas also seeks to offer a realistic perspective on its targets’ commercial maturity via a “Tech Category Index.” For example, the Atlas identifies seven technology categories relevant to the buildings sector: deconstruction, disposal and reuse, green materials, appliances, heating and cooling, smart buildings, and construction. While the first three are deemed “pilot” stage, the rest are “commercial.” More nascent technologies, such as fusion and many carbon dioxide removal methods, are categorized as “lab” stage.
But the Atlas isn’t yet complete, its creators emphasized. Even now they’re contemplating ways to expand, based on what will provide the most value to the sector. “Is it more details on commercial status? Is it the companies that are working on it? Is it the researchers that are doing this in their lab?” Panchadsaram mused. “We are asking those questions right now.”
There’s even a form where citizen contributors can suggest new innovation imperatives and moonshots, or provide feedback on existing ones. “We do really hope that people, when they see this, collaborate on it, build on it, duplicate it, replicate it,” Panchadsaram told me. “This is truly a starting point.”
Zanskar’s second geothermal discovery is its first on untapped ground.
For the past five years or so, talk of geothermal energy has largely centered on “next-generation” or “enhanced” technologies, which make it possible to develop geothermal systems in areas without naturally occurring hot water reservoirs. But one geothermal exploration and development company, Zanskar, is betting that the scope and potential of conventional geothermal resources has been vastly underestimated — and that artificial intelligence holds the key to unlocking it.
Last year, Zanskar acquired an underperforming geothermal power plant in New Mexico. By combining exclusive data on the subsurface of the region with AI-driven analysis, the company identified a promising new drilling site, striking what has now become the most productive pumped geothermal well in the U.S. Today, the company is announcing its second reservoir discovery, this one at an undeveloped site in northern Nevada, which Zanskar is preparing to turn into a full-scale, 20-megawatt power plant by 2028.
“This is probably one of the biggest confirmed resources in geothermal in the last 10 years,” Zanskar’s cofounder and CEO Carl Hoiland told me. When we first connected back in August, he explained that since founding the company in 2019, he’s become increasingly convinced that conventional geothermal — which taps into naturally occurring reservoirs of hot water and steam — will be the linchpin of the industry’s growth. “We think the estimates of conventional potential that are now decades old just all need to be rewritten,” Hoiland told me. “This is a much larger opportunity than has been previously appreciated.”
The past decade has seen a lull in geothermal development in the U.S. as developers have found exploration costs prohibitively high, especially as solar and wind prices have fallen drastically. Most new projects have involved either the expansion of existing facilities or tapping areas with established resources, spurring geothermal startups such as Fervo Energy and Sage Geosystems to use next-generation technologies to unlock new areas for development.
But Hoiland told me that in many cases, conventional geothermal plants will prove to be the simplest, most cost-effective path to growth.
Zanskar’s new site, dubbed Pumpernickel, has long drawn interest from potential geothermal developers given that it’s home to a cluster of hot springs. But while both oil and gas companies and the federal government have drilled exploratory wells here intermittently since the 1970s, none hit temperatures hot enough for the reservoirs to be deemed commercially viable.
But Zanskar’s AI models — trained on everything from decades-old geological and geophysical data sets to newer satellite and remote sensing databases — indicated that Pumpernickel did indeed have adequately hot reservoirs, and showed where to drill for them. “We were able to take the prior data that was seen to be a failure, plug it into these models, and get not just the surface locations that we should drill from, but [the models] even helped us identify what angle and which direction to drill the well,” Hoiland told me.
That’s wildly different from the way geothermal exploration typically works, he explained. Traditionally, a geologist would arrive onsite with their own mental model of the subsurface and tell the team where to drill. “But there are millions of possible models, and there’s no way humans can model all of those fully and quantitatively,” Hoiland told me, hence the industry’s low success rate for exploratory wells. Zanskar can, though. By modeling all possible locations for geothermal reservoirs, the startup’s tools “create a probability distribution that allows you to make decisions with more confidence.”
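Stripped of the AI, the underlying idea is ensemble decision-making: score each candidate drill site by how often it looks viable across many plausible subsurface models. The toy sketch below is my own illustration of that logic, with made-up temperatures, thresholds, and site names rather than anything from Zanskar’s models.

```python
# Toy sketch of decision-making over an ensemble of subsurface models:
# score each candidate drill site by the fraction of plausible models in
# which it reaches a commercially viable temperature. All numbers are
# illustrative assumptions, not Zanskar's actual modeling.
import random

random.seed(42)
VIABLE_TEMP_C = 150          # assumed commercial-viability threshold

candidate_sites = ["site_A", "site_B", "site_C"]

def sample_subsurface_model():
    """Stand-in for one plausible geologic model: predicted temperature per site."""
    return {site: random.gauss(mu, 25)
            for site, mu in zip(candidate_sites, (120, 155, 140))}

ensemble = [sample_subsurface_model() for _ in range(10_000)]

for site in candidate_sites:
    p_viable = sum(m[site] >= VIABLE_TEMP_C for m in ensemble) / len(ensemble)
    print(f"{site}: P(temperature >= {VIABLE_TEMP_C}°C) ≈ {p_viable:.2f}")
```

Rather than betting on one geologist’s mental model, the probabilities summarize how a site fares across the whole ensemble, which is the kind of confidence measure Hoiland describes.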
To build these tools, Hoiland and his cofounder, Joel Edwards, both of whom have backgrounds in geology, tracked down and acquired long-forgotten analog data sets mapping the subsurface of regions that were never developed. They digitized these records and fed them into their AI model, which is also trained on fresh inputs from Zanskar’s own data collection team, a group the company launched three years ago. After adding all this information, the team realized that test wells had been drilled in only about 5% of the “geothermally prospective areas of the western U.S.,” leaving the startup with no shortage of additional sites to explore.
“It’s been nine years since a greenfield geothermal plant has been built in the U.S.,” Edwards told me, meaning one constructed on land with no prior geothermal development. “So the intent here is to restart that flywheel of developing greenfield geothermal again.” And while Zanskar would not confirm, Axios reported earlier this month that the company is now seeking to raise a $100 million Series C round to help accomplish this goal.
In the future, Zanskar plans to test and develop sites where exploratory drilling has never even taken place, something the industry essentially stopped attempting decades ago. But these hitherto unknown sites, Edwards said, are where he anticipates “most of the gigawatts” are going to come from in the future.
Hoiland credits all this to advances in AI, which he believes will allow geothermal “to become the cheapest form of energy on the planet,” he told me. Because “if you knew exactly where to drill today, it already would be.”
On EPA’s climate denial, virtual power plants, and Europe’s $50 billion climate reality
Current conditions: In the Atlantic, Tropical Storm Gabrielle is on track to intensify into a hurricane by the weekend, but it’s unlikely to affect the U.S. East Coast • Most of Vermont, New Hampshire, and Maine are under “severe” drought warning • Southeastern Nigeria is facing flooding.
The Federal Reserve announced Wednesday its first interest rate cut of the year, a quarter percentage point drop that aims to bring the federal funds rate down to between 4% and 4.25%. This may, Heatmap’s Matthew Zeitlin reported, “provide some relief to renewables developers and investors, who are especially sensitive to financing costs.” As Advait Arun, a climate and infrastructure analyst at the Center for Public Enterprise, told him: “high rates are never going to be exactly a good thing … it’s going to be good that we’re finally seeing cuts.”
Since solar and wind rely on basically free fuel, the bulk of developers’ costs to build panels or turbines is upfront. That requires borrowing money, meaning interest rates have an outsize impact on the total cost of renewable projects. Renewables carry more debt than fossil fuel plants. When interest rates rise by 2 percentage points, the levelized cost of electricity for renewables rises by 20%, compared to 11% for a gas-fired plant, according to a report last year by the energy consultancy Wood Mackenzie.
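The mechanism is straightforward to illustrate with a toy levelized-cost calculation. The capital costs, capacity factors, fuel price, and lifetimes below are rough assumptions of mine, not Wood Mackenzie’s inputs, so the exact percentages differ, but the asymmetry is the point.

```python
# Toy illustration of why interest rates hit capital-heavy renewables harder
# than fuel-heavy gas plants. All inputs are rough assumptions for
# illustration, not Wood Mackenzie's.

def crf(rate: float, years: int) -> float:
    """Capital recovery factor: annual payment per dollar of upfront capital."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def lcoe(capex_per_kw, capacity_factor, fuel_per_mwh, rate, years=30):
    """Simplified levelized cost of electricity in $/MWh (capital + fuel only)."""
    annual_mwh_per_kw = 8.76 * capacity_factor
    capital = capex_per_kw * crf(rate, years) / annual_mwh_per_kw
    return capital + fuel_per_mwh

for name, capex, cf, fuel in [("solar", 1_100, 0.25, 0.0),
                              ("gas",   1_000, 0.55, 30.0)]:
    low, high = lcoe(capex, cf, fuel, 0.05), lcoe(capex, cf, fuel, 0.07)
    print(f"{name}: ${low:.0f} -> ${high:.0f}/MWh "
          f"(+{100 * (high / low - 1):.0f}%) when rates rise 2 points")
```

Because solar’s cost is almost entirely borrowed capital while a gas plant spreads its cost between capital and fuel, the same rate increase inflates the solar figure by roughly three times as much in this sketch.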
The United States’ leading scientific advisory body issued what The New York Times called a “major report” on Wednesday detailing “the strongest evidence to date that carbon dioxide, methane, and other planet-warming greenhouse gases are threatening human health.” The study, published by the National Academies of Sciences, Engineering, and Medicine, stands athwart the Environmental Protection Agency’s proposal to revoke the endangerment finding. Established in 2009, the legal determination that planet-heating gases cause harm to human health means that the Clean Air Act can be used to underpin regulations on emissions. But the Trump administration has proposed rescinding the finding, insisting it could “cast significant doubt” on its accuracy.
“It’s more serious and more long term damage for them to try to rescind the underlying endangerment finding because depending on what the Supreme Court does with that, it could knock out a future administration from trying to bring it back,” Harvard Law School’s Jody Freeman told Heatmap’s Emily Pontecorvo in July. “Now that would be the nuclear option. That would be their best case scenario. I don’t think that’s likely, but it’s possible.”
It’s an unlikely scenario. But if all U.S. households built rooftop solar panels and batteries, and adopted efficient electric appliances, the country could offset all the growing demand from data centers. That’s according to a new report by the pro-electrification nonprofit Rewiring America. “Electrifying households is a direct path to meeting the growing power needs of hyperscale data centers while creating a more flexible, resilient, cost-effective grid for all,” Ari Matusiak, the chief executive of Rewiring America, said in a statement. “The household doesn’t have to be a passive energy consumer, at the whim of rising costs. Instead, it can be the hero and, with smart investment, the foundation of a more reliable and affordable energy future.”
With new gas plants, nuclear reactors, and geothermal stations in the works, the U.S. is nowhere close to following a maximalist vision of distributed resources. But the findings highlight how much additional power could be generated on residential rooftops across the U.S., power that, if combined with virtual power plant software, could constitute a large new source of clean electricity.
A scorecard highlighting all the ways the virtual power plant industry has grown. (Wood Mackenzie)
That isn’t to say virtual power plants aren’t having something of a moment. New data from Wood Mackenzie found that virtual power plant capacity expanded 13.7% year over year to reach 37.5 gigawatts. California, Texas, New York, and Massachusetts are the leading states, representing 37% of all VPP deployments. The market last year “broadened more than it deepened,” the consultancy’s report found, with growth in deployments, offtakers, and policy support spurring more adoption. But the residential side remains modest: its share of the VPP wholesale market’s capacity increased to 10.2% from only about 8.8% last year, “still reflecting market barriers to small customers,” such as access to data and market rules.
“Utility program caps, capacity accreditation reforms, and market barriers have prevented capacity from growing as fast as market activity,” Ben Hertz-Shargel, global head of grid edge for Wood Mackenzie, said in a statement. He added that “while data centers are the source of new load, there’s an enormous opportunity to tap VPPs as the new source of grid flexibility.”
Record-breaking heat, droughts, fires, and floods cost the European economy at least 43 billion euros, or $50 billion, a new European Central Bank study found. The research, presented this week to European Union lawmakers, used a model based on weather data and estimates of the historical impact of extreme weather on 1,160 different regions across the 27-nation bloc. “The true costs of extreme weather surface slowly because these events affect lives and livelihoods through a wide range of channels that extend beyond the initial impact,” Sehrish Usman, an assistant professor at the University of Mannheim who led the study with two economists from the European Central Bank, told The New York Times.
Secretary of Energy Chris Wright believes nuclear fusion plants will be pumping electricity onto grids no later than 2040. In an interview this week with the BBC while traveling in Europe, Wright said he expected the technology to be commercialized in as little as eight years. “With artificial intelligence and what’s going on at the national labs and private companies in the United States, we will have that approach about how to harness fusion energy multiple ways within the next five years,” Wright told the broadcaster. “The technology, it’ll be on the electric grid, you know, in eight to 15 years.” As Heatmap’s Katie Brigham put it recently, it’s “finally, possibly, almost time for fusion.”