Two former Microsoft employees have turned their frustration into an awareness campaign to hold tech companies accountable.
When the clean energy world considers the consequences of the artificial intelligence boom, rising data center electricity demand and the strain it’s putting on the grid are typically top of mind — even if that’s weighed against the litany of potential positive impacts, which includes improved weather forecasting, grid optimization, wildfire risk mitigation, critical minerals discovery, and geothermal development.
I’ve written about a bunch of it. But the not-so-secret flip side is that, naturally, any AI-fueled improvements in efficiency, data analytics, and predictive capabilities will benefit well-capitalized fossil fuel giants just as much as — if not significantly more than — plucky climate tech startups or cash-strapped utilities.
“The narrative is a net impact equation that only includes the positive use cases of AI as compared to the operational impacts, which we believe is apples to oranges,” Holly Alpine, co-founder of the Enabled Emissions Campaign, told me. “We need to expand that conversation and include the negative applications in that scoreboard.”
Alpine founded the campaign alongside her partner, Will Alpine, in February of last year, with the goal of holding tech giants accountable for the ways users leverage their products to accelerate fossil fuel production. Both formerly worked for Microsoft on sustainability initiatives related to data centers and AI, but quit after what they told me amounted to a string of unfulfilled promises by the company and a realization that internal pressure alone couldn’t move the needle as far as they’d hoped.
While at Microsoft, they were dismayed to learn that the company had contracts for its cloud services and suite of AI tools with some of the largest fossil fuel corporations in the world — including ExxonMobil, Chevron, and Shell — and that the partnerships were formed with the explicit intent to expand oil and gas production. Other hyperscalers such as Google and Amazon have also formed similar cloud and AI service partnerships with oil and gas giants, though Google burnished its sustainability bona fides in 2020 by announcing that it would no longer build custom AI tools for the fossil fuel industry. (In response to my request for comment, Microsoft directed me to its energy principles, which were written in 2022, while the Alpines were still with the company, and to its 2025 sustainability report. Neither addresses the Alpines’ concerns directly, which is perhaps telling in its own right.)
AI can help fossil fuel companies accelerate and expand fossil fuel production throughout all stages of the process, from exploration and reservoir modeling to predictive maintenance, transport and logistics optimization, demand forecasting, and revenue modeling. And while partnerships with AI hyperscalers can be extremely beneficial, oil and gas companies are also building out their own AI-focused teams and capabilities in-house.
“As a lot of the low-hanging fruit in the oil reserve space has been plucked, companies have been increasingly relying on things like fracking and offshore drilling to stay competitive,” Will told me. “So using AI is now allowing those operations to continue in a way that they previously could not.”
Exxon, for example, boasts on its website that it’s “the first in our industry to leverage autonomous drilling in deep water,” thanks to its AI-powered systems that can determine drilling parameters and control the whole process sans human intervention. Likewise, BP notes that its “Optimization Genie” AI tool has helped it increase production by about 2,000 oil-equivalent barrels per day in the Gulf of Mexico, and that between 2022 and 2024, AI and advanced analytics allowed the company to increase production by 4% overall.
In general, however, the degree to which AI-enabled systems help expand production is not something companies speak about publicly. For instance, when Microsoft inked a contract with Exxon six years ago, it predicted that its suite of digital products would enable the oil giant to grow production in the Permian Basin by up to 50,000 barrels by 2025. And while output in the Permian has boomed, it’s unclear how much Microsoft is to thank for that, as neither company has released any figures.
Either way, many of the climate impacts of using AI for oil and gas production are likely to go unquantified. That’s because the so-called “enabled emissions” from the tech sector are not captured by the standard emissions accounting framework, which categorizes direct emissions from a company’s operations as scope 1, indirect emissions from the generation of purchased energy as scope 2, and all other emissions across the value chain as scope 3. So while tailpipe emissions, for example, would fall into Exxon’s scope 3 bucket — thus requiring disclosure — they’re outside Microsoft’s reporting boundaries.
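The scope logic that framework applies can be sketched in a few lines. This is an illustrative sketch of the GHG Protocol’s three categories as described above, not any company’s actual accounting; the example sources in the comments are assumptions for illustration.

```python
# The GHG Protocol's three scope categories, as summarized in the text.
SCOPES = {
    1: "Direct emissions from sources the company owns or controls",
    2: "Indirect emissions from purchased electricity, steam, heat, or cooling",
    3: "All other indirect emissions across the value chain",
}

def classify_emission(owned_source: bool, purchased_energy: bool) -> int:
    """Return the GHG Protocol scope for an emission source."""
    if owned_source:
        return 1  # e.g., company-owned boilers, fleet vehicles, flaring
    if purchased_energy:
        return 2  # e.g., electricity bought to run offices or data centers
    return 3      # everything else upstream and downstream in the value chain

# Tailpipe emissions from fuel an oil major sold: not owned by the company,
# not purchased energy -> scope 3 for the oil major.
assert classify_emission(owned_source=False, purchased_energy=False) == 3
```

The gap the Alpines point to is visible in this structure: emissions a tech vendor’s tools help *enable* at a customer don’t map onto any of the three branches of the vendor’s own ledger.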
According to the Alpines’ calculations, though, Microsoft’s deal with Exxon plus another contract with Chevron totaled “over 300% of Microsoft’s entire carbon footprint, including data centers.” So it’s really no surprise that hyperscalers have largely fallen silent when it comes to citing specific numbers, given the history of employee blowback and media furor over the friction between tech companies’ sustainability targets and their fossil fuel contracts.
As such, the tech industry often ends up wrapping these deals in broad language highlighting operational efficiency, digital transformation, and even sustainability benefits — think waste reduction and decreasing methane leakage rates — while glossing over the fact that at their core, these partnerships are primarily designed to increase oil and gas output.
While none of the fossil fuel companies I contacted — Chevron, Exxon, Shell, and BP — replied to my inquiries about the ways they’re leveraging AI, earnings calls and published corporate materials make it clear that the industry is ready to utilize the technology to its fullest extent.
“We’re looking to leverage knowledge in a different way than we have in the past,” Shell CEO Wael Sawan said on the company’s Q2 earnings call last year, citing AI as one of the tools that he sees as integral to “transform the culture of the company to one that is able to outcompete in the coming years.”
Shell has partnered since 2018 with the enterprise software company C3.ai on AI applications such as predictive maintenance, equipment monitoring, and asset optimization, the latter of which has helped the company increase liquefied natural gas production by 1% to 2%. C3.ai CEO Tom Siebel was vague on the company’s 2025 Q1 earnings call, but said that Shell estimates that the partnership has “generated annual benefit to Shell of $2 billion.”
In terms of AI’s ability to get more oil and gas out of the ground, “it’s like getting a Kuwait online,” Rakesh Jaggi, who leads the digital efforts at the oil-services giant SLB, told Barron’s magazine. Kuwait is the third-largest crude oil producer in OPEC, producing about 2.9 million barrels per day.
Some oil and gas giants were initially reluctant to get fully aboard the AI hype train — even Exxon CEO Darren Woods noted on the company’s 2024 Q3 earnings call that the oil giant doesn’t “like jumping on bandwagons.” Yet he still sees “good potential” for AI to be a “part of the equation” when it comes to the company’s ambition to slash $15 billion in costs by 2027.
Chevron is similarly looking to AI to cut costs. As the company’s Chief Financial Officer Eimear Bonner explained during its 2024 Q4 earnings call, AI could help Chevron save $2 billion to $3 billion over the next few years as the company looks toward “using technology to do work completely differently.” Meanwhile, Saudi Aramco’s CEO Amin Nasser told Bloomberg that AI is a core reason it’s been able to keep production costs at $3 per barrel for the past 20 years, despite inflation and other headwinds in the sector.
Of course, it should come as no surprise that fossil fuel companies are taking advantage of the vast opportunities that AI provides. After all, the investors and shareholders these companies are ultimately beholden to would likely revolt if they thought their fiduciaries had failed to capitalize on such an enormous technological breakthrough.
The Alpines are well aware that this is the world we live in, and that we’re not going to overthrow capitalism anytime soon. Right now, they told me they’re primarily running a two-person “awareness campaign,” as the general public and sometimes even former colleagues are largely in the dark when it comes to how AI is being used to boost oil and gas production. While Will said they’re “staying small and lean” for now while they fundraise, the campaign has support from a number of allies including the consumer rights group Public Citizen, the tech worker group Amazon Employees for Climate Justice, and the NGO Friends of the Earth.
In the medium term, they’re looking toward policy shifts that would require more disclosure and regulation around AI’s potential for harm in the energy sector. “The only way we believe to really achieve deep change is to raise the floor at an international or national policy level,” Will told me. As an example, he pointed to the EU’s comprehensive regulations that categorize AI use cases by risk level, which then determines the rules these systems are subject to. Police use of facial recognition is considered high risk, for example, while AI spam filters are low risk. Right now, energy sector applications are not categorized as risky at all.
“What we would advocate for would be that AI use in the energy sector falls under a high risk classification system due to its risk for human harm. And then it would go through a governance process, ideally that would align with climate science targets,” Will told me. “So you could use that to uplift positive applications like AI for methane leak detection, but AI for upstream scenarios should be subject to additional scrutiny.”
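The tiered approach Will describes can be sketched as a simple lookup. The tier assignments below are illustrative only: the first two mirror the EU AI Act examples cited above, while the energy-sector entries reflect the Alpines’ *proposal*, which is not current law.

```python
# Illustrative risk-tier lookup in the spirit of the EU AI Act's categories.
# Energy-sector entries are the Alpines' proposed treatment, not existing law.
RISK_TIERS = {
    "facial_recognition_policing": "high",     # high risk under the EU AI Act
    "spam_filter": "minimal",                  # minimal risk under the EU AI Act
    "upstream_oil_and_gas": "high",            # proposed: extra scrutiny
    "methane_leak_detection": "minimal",       # proposed: positive use, light touch
}

def required_scrutiny(use_case: str) -> str:
    """Map a use case to the oversight its risk tier would trigger."""
    tier = RISK_TIERS.get(use_case, "unclassified")
    return {
        "high": "conformity assessment and governance review",
        "minimal": "no additional obligations",
    }.get(tier, "no rules apply")
```

As the article notes, energy-sector applications currently fall through to the “no rules apply” branch — the change the campaign advocates is moving upstream uses into the high-risk tier.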
Realistically, there’s no chance of something like this being implemented in the U.S. under Trump, let alone somewhere like Saudi Arabia. And even if such regulations were eventually enacted in some countries, energy markets are global, meaning governments around the world would ultimately need to align on risk mitigation strategies for reining in AI’s potential for climate harm.
As Will told me, “that would be a massive uphill battle, but we think it’s one that’s worth fighting.”
Elemental Impact, Breakthrough Energy, Speed & Scale, Stanford, Energy Innovation, and McKinsey are all partnering to form the “Climate Tech Atlas.”
The federal government has become an increasingly unreliable partner to climate tech innovators. Now venture capitalists, nonprofits, and academics are embracing a new plan to survive.
On Thursday, an interdisciplinary coalition — including Breakthrough Energy, McKinsey, and Stanford University’s Doerr School of Sustainability — unveiled the Climate Tech Atlas, a new plan to map out opportunities in the sector and define innovation imperatives critical to the energy transition.
The goal is to serve as a resource for stakeholders across the industry, drawing their focus toward the technological frontiers the alliance sees as the most viable pathways to economy-wide decarbonization. The idea is not to eliminate potential solutions, but rather “to enable the next generation of innovators, entrepreneurs, researchers, policymakers, and investors to really focus on where we felt there was the largest opportunity for exploration and for innovation to impact our path to net zero through the lens of technology,” Cooper Rinzler, a key collaborator on the initiative and a partner at the venture capital firm Breakthrough Energy Ventures, told me.
Other core contributors include the nonprofit investor Elemental Impact, John Doerr’s climate initiative Speed & Scale, and the policy think tank Energy Innovation. The Atlas has been a year in the making, Ryan Panchadsaram of Speed & Scale told me. “We’ve had maybe close to 20 to 30 working sessions with 80 different contributors, all focused on the big question of what innovations are needed to decarbonize our economy.”
The website, which launched today, lays out 24 opportunity areas across buildings, manufacturing, transportation, food, agriculture and nature, electricity, and greenhouse gas removal. Diving into “buildings,” for example, one can then drill down into an opportunity area such as “sustainable construction and design,” which lists three innovation imperatives: creating new design tools to improve materials efficiency and carbon intensity, improving building insulation and self-cooling, and industrializing construction to make it faster and more modular.
Then there are the moonshots — 39 in total, and two for this opportunity in particular. The first is developing carbon-negative building coatings and surface materials, and the second is inventing low-carbon building materials that can outperform steel and cement. It’s these types of moonshots, Rinzler told me, where much of the “residual uncertainty” and thus “opportunity for surprise” lies.
Each core collaborator, Panchadsaram said, naturally came into this exercise with their own internal lists and ideas about what types of tech and basic research were needed most. The idea, he told me, was to share “an open source version of what we each had.”
As Dawn Lippert, founder and CEO of Elemental Impact, put it to me, the Atlas “can help accelerate any conversation.” Her firm meets with over 1,000 entrepreneurs per year, she explained, on top of numerous philanthropists trying to figure out where to direct their capital. The Atlas can serve as a one-stop-shop to help them channel their efforts — and dollars — into the most investable and salient opportunities.
The same can be said for research priorities among university faculty, Charlotte Pera, the executive director of Stanford’s Sustainability Accelerator, told me. That then trickles down to help determine what classes, internships, and career paths students interested in the intersection of sustainability and technology ultimately choose.
The coalition members — and the project itself — speak to the prudence of this type of industry-wide level-setting amidst a chaotic political and economic environment. Referencing the accelerants Speed & Scale identifies as critical to achieving net-zero emissions — policy, grassroots and global movements, innovation, and investment — Panchadsaram told me that “when one is not performing in the way that you want, you have to lean in more into the others.”
These days, of course, it’s U.S. policy that’s falling short. “In this moment in time, at least domestically, innovation and investment is one that can start to fill in that gap,” he said.
This isn’t the first effort to meticulously map out where climate funding, innovation, and research efforts should be directed. Biden’s Department of Energy launched the Earthshots Initiative, which laid out innovation goals and pathways to scale for emergent technologies such as clean hydrogen, long-duration energy storage, and floating offshore wind. But while it’s safe to say that Trump isn’t pursuing the coordinated funding and research that Earthshots intended to catalyze, the private sector has a long and enthusiastic history with strategic mapping.
Breakthrough Energy, for example, had already pinpointed what it calls the “Five Grand Challenges” in reaching net-zero emissions: electricity, transportation, manufacturing, buildings, and agriculture. It then measures the “green premium” of specific technologies — that is, the added cost of doing a thing cleanly — to pinpoint what to prioritize for near-term deployment and where more research and development funding should be directed. Breakthrough's grand challenges closely mirror the sectors identified in the Atlas, which ultimately goes into far greater depth regarding specific subcategories.
Perhaps the pioneer of climate tech mapping is Kleiner Perkins, the storied venture capital firm, where Doerr was a longtime leader and currently serves as chairman; Panchadsaram is also an advisor there. During what investors often refer to as Clean Tech 1.0 — a boom-and-bust cycle that unfolded from roughly 2006 to 2012 — the firm created a “map of grand challenges.” While it appears to have no internet footprint today, in 2009, Bloomberg described it as a “chart of multicolored squares” tracking the firm’s investment across key climate technologies, with blank spots for tech with the potential to be viable — and investable — in the future.
Many of these opportunities failed to pay off, however. The 2008 financial crisis, the U.S. oil and natural gas boom, and slow development timelines for clean tech contributed to a number of high-profile failures, causing investors to sour on clean tech — a precedent the Atlas coalition would like to avoid.
These days, investors tend to tell me that Clean Tech 1.0 taught them to be realistic about long commercialization timelines for climate tech. Breakthrough Energy Ventures, for example, has funds with lengthy 20-year investment horizons. In a follow-up email, Rinzler also noted that even considering the current political landscape, “there’s a far more robust capital, corporate, and policy environment for climate tech than there was in the 2000s.” Now, he said, investors are more likely to consider the broader landscape across tech, finance, and policy when gauging whether a company can compete in the marketplace. And that often translates to a decreased reliance on government support.
“There are quite a few solutions that are embodied here that really don’t have an obligate dependence on policy in any way,” Rinzler told me. “You don’t have to care about climate to think that this is an amazing opportunity for an entrepreneur to come in and tackle a trillion-dollar industry with a pure profit incentive.”
The Atlas also seeks to offer a realistic perspective on its targets’ commercial maturity via a “Tech Category Index.” For example, the Atlas identifies seven technology categories relevant to the buildings sector: deconstruction, disposal and reuse, green materials, appliances, heating and cooling, smart buildings, and construction. While the first three are deemed “pilot” stage, the rest are “commercial.” More nascent technologies such as fusion, as well as many carbon dioxide removal methods are categorized as “lab” stage.
But the Atlas isn’t yet complete, its creators emphasized. Even now they’re contemplating ways to expand, based on what will provide the most value to the sector. “Is it more details on commercial status? Is it the companies that are working on it? Is it the researchers that are doing this in their lab?” Panchadsaram mused. “We are asking those questions right now.”
There’s even a form where citizen contributors can suggest new innovation imperatives and moonshots, or provide feedback on existing ones. “We do really hope that people, when they see this, collaborate on it, build on it, duplicate it, replicate it,” Panchadsaram told me. “This is truly a starting point.”
Zanskar’s second geothermal discovery is its first on untapped ground.
For the past five years or so, talk of geothermal energy has largely centered on “next-generation” or “enhanced” technologies, which make it possible to develop geothermal systems in areas without naturally occurring hot water reservoirs. But one geothermal exploration and development company, Zanskar, is betting that the scope and potential of conventional geothermal resources has been vastly underestimated — and that artificial intelligence holds the key to unlocking it.
Last year, Zanskar acquired an underperforming geothermal power plant in New Mexico. By combining exclusive data on the subsurface of the region with AI-driven analysis, the company identified a promising new drilling site, striking what has now become the most productive pumped geothermal well in the U.S. Today, the company is announcing its second reservoir discovery, this one at an undeveloped site in northern Nevada, which Zanskar is preparing to turn into a full-scale, 20-megawatt power plant by 2028.
“This is probably one of the biggest confirmed resources in geothermal in the last 10 years,” Zanskar’s cofounder and CEO Carl Hoiland told me. When we first connected back in August, he explained that since founding the company in 2019, he’s become increasingly convinced that conventional geothermal — which taps into naturally occurring reservoirs of hot water and steam — will be the linchpin of the industry’s growth. “We think the estimates of conventional potential that are now decades old just all need to be rewritten,” Hoiland told me. “This is a much larger opportunity than has been previously appreciated.”
The past decade has seen a lull in geothermal development in the U.S. as developers have found exploration costs prohibitively high, especially as solar and wind fall drastically in price. Most new projects have involved either the expansion of existing facilities or tapping areas with established resources, spurring geothermal startups such as Fervo Energy and Sage Geosystems to use next-generation technologies to unlock new areas for development.
But Hoiland told me that in many cases, conventional geothermal plants will prove to be the simplest, most cost-effective path to growth.
Zanskar’s new site, dubbed Pumpernickel, has long drawn interest from potential geothermal developers given that it’s home to a cluster of hot springs. But while both oil and gas companies and the federal government have drilled exploratory wells here intermittently since the 1970s, none hit hot enough temperatures for the reservoirs to be deemed commercially viable.
Yet Zanskar’s AI models — trained on everything from decades-old geological and geophysical data sets to newer satellite and remote sensing databases — indicated that Pumpernickel did indeed have adequately hot reservoirs, and showed where to drill for them. “We were able to take the prior data that was seen to be a failure, plug it into these models, and get not just the surface locations that we should drill from, but [the models] even helped us identify what angle and which direction to drill the well,” Hoiland told me.
That’s wildly different from the way geothermal exploration typically works, he explained. Traditionally, a geologist would arrive onsite with their own mental model of the subsurface and tell the team where to drill. “But there are millions of possible models, and there’s no way humans can model all of those fully and quantitatively,” Hoiland told me, hence the industry’s low success rate for exploratory wells. Zanskar can, though. By modeling all possible locations for geothermal reservoirs, the startup’s tools “create a probability distribution that allows you to make decisions with more confidence.”
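As a toy illustration of that idea — emphatically not Zanskar’s actual models — one can score each candidate site by the fraction of sampled subsurface models in which the reservoir clears a commercial temperature threshold, then drill where that probability is highest. Every number below (temperatures, uncertainties, the 150°C threshold) is invented for the sketch.

```python
import random

random.seed(0)  # reproducible sketch

def sample_temperature(site: dict) -> float:
    """Draw a reservoir temperature (deg C) under one randomly sampled
    subsurface model, here idealized as a Gaussian around the site's estimate."""
    return random.gauss(site["mean_temp"], site["uncertainty"])

def success_probability(site: dict, threshold: float = 150.0,
                        trials: int = 10_000) -> float:
    """Estimate P(reservoir is hot enough) by averaging over many models,
    rather than committing to a single geologist's mental model."""
    hits = sum(sample_temperature(site) >= threshold for _ in range(trials))
    return hits / trials

# Two hypothetical prospects: A looks hotter on average in some models but is
# highly uncertain; B is slightly cooler on paper but much better constrained.
sites = [
    {"name": "A", "mean_temp": 140.0, "uncertainty": 25.0},
    {"name": "B", "mean_temp": 155.0, "uncertainty": 10.0},
]
best = max(sites, key=success_probability)
```

The point of the distributional view is exactly the one Hoiland makes: a single model can be wrong cheaply on paper or expensively in the field, while a probability over millions of models lets you rank wells by confidence before spending drilling dollars.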
To build these tools, Hoiland and his cofounder, Joel Edwards, both of whom have backgrounds in geology, tracked down and acquired long-forgotten analog data sets mapping the subsurface of regions that were never developed. They digitized these records and fed them into their AI model, which is also trained on fresh inputs from Zanskar’s own data collection team, a group the company launched three years ago. After adding all this information, the team realized that test wells had been drilled in only about 5% of the “geothermally prospective areas of the western U.S.,” leaving the startup with no shortage of additional sites to explore.
“It’s been nine years since a greenfield geothermal plant has been built in the U.S.,” Edwards told me, meaning one constructed on land with no prior geothermal development. “So the intent here is to restart that flywheel of developing greenfield geothermal again.” And while Zanskar would not confirm, Axios reported earlier this month that the company is now seeking to raise a $100 million Series C round to help accomplish this goal.
In the future, Zanskar plans to test and develop sites where exploratory drilling has never even taken place, something the industry essentially stopped attempting decades ago. But these hitherto unknown sites, Edwards said, are where he anticipates “most of the gigawatts” are going to come from.
Hoiland credits all this to advances in AI, which he believes will allow geothermal “to become the cheapest form of energy on the planet,” he told me. Because “if you knew exactly where to drill today, it already would be.”
On EPA’s climate denial, virtual power plants, and Europe’s $50 billion climate reality
Current conditions: In the Atlantic, Tropical Storm Gabrielle is on track to intensify into a hurricane by the weekend, but it’s unlikely to affect the U.S. East Coast • Most of Vermont, New Hampshire, and Maine are under “severe” drought warning • Southeastern Nigeria is facing flooding.
The Federal Reserve announced Wednesday its first interest rate cut of the year, a quarter percentage point drop that aims to bring the federal funds rate down to between 4% and 4.25%. This may, Heatmap’s Matthew Zeitlin reported, “provide some relief to renewables developers and investors, who are especially sensitive to financing costs.” As Advait Arun, a climate and infrastructure analyst at the Center for Public Enterprise, told him: “high rates are never going to be exactly a good thing … it’s going to be good that we’re finally seeing cuts.”
Since solar and wind rely on basically free fuel, the bulk of developers’ costs to build panels or turbines are upfront. That requires borrowing money, meaning interest rates have an outsize impact on the total cost of renewable projects. Renewables carry more debt than fossil fuel plants. When interest rates rise by 2 percentage points, the levelized cost of electricity for renewables rises by 20%, compared to 11% for a gas-fired plant, according to a report last year by the energy consultancy Wood Mackenzie.
The United States’ leading scientific advisory body issued what The New York Times called a “major report” on Wednesday detailing “the strongest evidence to date that carbon dioxide, methane, and other planet-warming greenhouse gases are threatening human health.” The study, published by the National Academies of Sciences, Engineering, and Medicine, stands athwart the Environmental Protection Agency’s proposal to revoke the endangerment finding. Established in 2009, the legal determination that planet-heating gases cause harm to human health means that the Clean Air Act can be used to underpin regulations on emissions. But the Trump administration proposed rescinding the finding, insisting it could “cast significant doubt” on its accuracy.
“It’s more serious and more long term damage for them to try to rescind the underlying endangerment finding because depending on what the Supreme Court does with that, it could knock out a future administration from trying to bring it back,” Harvard Law School’s Jody Freeman told Heatmap’s Emily Pontecorvo in July. “Now that would be the nuclear option. That would be their best case scenario. I don’t think that’s likely, but it’s possible.”
It’s an unlikely scenario. But if all U.S. households built rooftop solar panels and batteries, and adopted efficient electric appliances, the country could offset all the growing demand from data centers. That’s according to a new report by the pro-electrification nonprofit Rewiring America. “Electrifying households is a direct path to meeting the growing power needs of hyperscale data centers while creating a more flexible, resilient, cost-effective grid for all,” Ari Matusiak, the chief executive of Rewiring America, said in a statement. “The household doesn’t have to be a passive energy consumer, at the whim of rising costs. Instead, it can be the hero and, with smart investment, the foundation of a more reliable and affordable energy future.”
With new gas plants, nuclear reactors, and geothermal stations in the works, the U.S. is nowhere close to following a maximalist vision of distributed resources. But the findings highlight how much additional power could be generated on residential rooftops across the U.S. that, if combined with virtual power plant software, could constitute a large new source of clean electricity.
A scorecard highlighting all the ways the virtual power plant industry has grown. Wood Mackenzie
That isn’t to say virtual power plants aren’t having something of a moment. New data from Wood Mackenzie found that virtual power plant capacity expanded 13.7% year over year to reach 37.5 gigawatts. California, Texas, New York, and Massachusetts are the leading states, representing 37% of all VPP deployments. The market last year “broadened more than it deepened,” the consultancy’s report found, with the number of deployments, offtakers, and policy support spurring more adoption. But the residential side remains modest. Its share of the VPP wholesale market’s capacity increased to 10.2% from only about 8.8% last year, “still reflecting market barriers to small customers,” such as access to data and market rules.
“Utility program caps, capacity accreditation reforms, and market barriers have prevented capacity from growing as fast as market activity,” Ben Hertz-Shargel, global head of grid edge for Wood Mackenzie, said in a statement. He added that, “while data centers are the source of new load, there’s an enormous opportunity to tap VPPs as the new source of grid flexibility.”
Record-breaking heat, droughts, fires, and floods cost the European economy at least 43 billion euros, or $50 billion, a new European Central Bank study found. The research, presented this week to European Union lawmakers, used a model based on weather data and estimates of historical impact of extreme weather on 1,160 different regions across the 27-nation bloc. “The true costs of extreme weather surface slowly because these events affect lives and livelihoods through a wide range of channels that extend beyond the initial impact,” Sehrish Usman, an assistant professor at the University of Mannheim who led the study with two economists from the European Central Bank, told The New York Times.
Secretary of Energy Chris Wright believes nuclear fusion plants will be pumping electricity onto grids no later than 2040. In an interview this week with the BBC while traveling in Europe, Wright said he expected the technology to be commercialized in as little as eight years. “With artificial intelligence and what’s going on at the national labs and private companies in the United States, we will have that approach about how to harness fusion energy multiple ways within the next five years,” Wright told the broadcaster. “The technology, it’ll be on the electric grid, you know, in eight to 15 years.” As Heatmap’s Katie Brigham put it recently, it’s “finally, possibly, almost time for fusion.”