Life cycle analysis has some problems.

About six months ago, a climate scientist from Arizona State University, Stephanie Arcusa, emailed me a provocative new paper she had published that warned against our growing reliance on life cycle analysis. This practice of measuring all of the emissions related to a given product or service throughout every phase of its life — from the time raw materials are extracted to eventual disposal — was going to hinder our ability to achieve net-zero emissions, she wrote. It was a busy time, and I let the message drift to the bottom of my inbox. But I couldn’t stop thinking about it.
Life cycle analysis permeates the climate economy. Businesses rely on it to understand their emissions so they can work toward reducing them. The Securities and Exchange Commission’s climate risk disclosure rule, which requires companies to report their emissions to investors, hinges on it. The clean hydrogen tax credit requires hydrogen producers to do a version of life cycle analysis to prove their eligibility. It is central to carbon markets, and carbon removal companies are now developing standards based on life cycle analysis to “certify” their services as carbon offset developers did before them.
At the same time, many of the fiercest debates in climate change are really debates about life cycle analysis. Should companies be held responsible for the emissions that are indirectly related to their businesses, and if so then which ones? Are carbon offsets a sham? Does using corn ethanol as a gasoline substitute reduce emissions or increase them? Scientists have repeatedly reached opposite conclusions on that one depending on how they accounted for the land required to grow corn and what it might have been used for had ethanol not been an option. Though the debate plays out in calculations, it’s really a philosophical brawl.
Everybody, for the most part, knows that life cycle analysis is difficult and thorny and imprecise. But over and over, experts and critics alike assert that it can be improved. Arcusa disagrees. Life cycle analysis, she says, is fundamentally broken. “It’s a problematic and uncomfortable conclusion to arrive at,” Arcusa wrote in her email. “On the one hand, it has been the only tool we have had to make any progress on climate. On the other, carbon accounting is captured by academia and vested interests and will jeopardize global climate goals.”
When I recently revisited the paper, I learned that Arcusa and her co-authors didn’t just critique life cycle analysis, they proposed a bold alternative. Their idea is not economically or politically easy, but it also doesn’t suffer from the problems of trying to track carbon throughout the supply chain. I recently called her up to talk through it. Our conversation has been edited for clarity.
Can you walk me through what the biggest issues with life cycle analysis are?
So, life cycle analysis is a qualitative tool —
It seems kind of counterintuitive or even controversial to call it a qualitative tool because it’s specifically trying to quantify something.
I think the best analogy for LCA is that it’s a back-of-the-envelope tool. If you really could measure everything, then sure, LCA is this wonderful idea. The problem is in the practicality of being able to collect all of that data. We can’t, and that leads us to use emissions factors and average numbers, and we model this and we model that, and we get so far away from reality that we actually can’t tell if something is positive or negative in the end.
The other problem is that it’s almost entirely subjective, which makes one LCA incomparable to another LCA depending on the context, depending on the technology. And yes, there are some standardization efforts that have been going on for decades. But if you have a ruler, no matter how much you try, it’s not going to become a screwdriver. We’re trying to use this tool to quantify things and make them the same for comparison, and we can’t because of that subjectivity.
In this space where there is a lot of money to be made, it’s very easy to manipulate things one way or another to make it look a little bit better because the method is not robust. That’s really the gist of the problems here.
One of the things you talk about in the paper is the way life cycle analysis is subject to different worldviews. Can you explain that?
It’s mostly seen in what to include or exclude in the LCA — it can have enormous impacts on the results. I think corn ethanol is the perfect example of how tedious this can be because we still don’t have an answer, precisely for that reason. The uncertainty range of the results has shrunk and gotten bigger and shrunk and gotten bigger, and it’s like, well, we still don’t know. And now, this exact same worldview debate is playing into what should be included and not included in certification for things [like carbon removal] that are going to be sold under the guise of climate action, and that just can’t be. We’ll be forever debating whether something is true.
Is this one of those things that scientists have been debating forever, or is this argument that we should stop using life cycle analysis more of a fringe idea?
I guess I would call it a fringe idea today. There’s been plenty of criticism throughout the years, even from the very beginning when it was first created. What I have seen is that there is criticism, and then there is, “But here’s how we can solve it and continue using LCA!” I’ve only come across one other publication that specifically said, “This is not working. This is not the right tool,” and that’s from Michael Gillenwater. He’s at the Greenhouse Gas Management Institute. He was like, “What are we doing?” There might be other folks, I just haven’t come across them.
Okay, so what is the alternative to LCA that you’ve proposed in this paper?
LCA targets the middle of the supply chain, and tries to attribute responsibility there. But if you think about where on the supply chain the carbon is the most well-known, it is actually at the source, at the point of origin, before it becomes an emission. At the point where it is extracted from the ground is where we know how much carbon there is. If we focus on that source through a policy that requires mandatory sequestration — for every ton of carbon that is now produced, there is a ton of carbon that’s been put away through carbon removal, and the accounting happens there, before it is sold to anybody — anybody who’s now downstream of that supply chain is already carbon neutral. There is no need to track carbon all the way down to the consumer.
We know this is accurate because that is where governments already collect royalties and taxes — they want to know exactly how much is being sold. So we already do this. The big difference is that the policy would be required there instead of taxing everybody downstream.
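The point-of-origin accounting Arcusa describes can be sketched as a toy ledger. This is an illustrative sketch only — the function name and numbers are hypothetical, not from the paper:

```python
# Toy sketch of point-of-origin carbon accounting (illustrative only).
# Each ton of fossil carbon extracted creates an immediate removal
# liability, so everything downstream of the wellhead or mine is
# already carbon neutral and needs no further tracking.

def removal_liability(tons_extracted: float, tons_removed: float) -> float:
    """Outstanding tons that must still be paid for as carbon removal."""
    return max(tons_extracted - tons_removed, 0.0)

# A producer extracts 1,000 tons and has paid for 400 tons of removal:
print(removal_liability(1_000, 400))  # 600.0 tons still owed
```

The appeal of this design is that the only quantity being measured — tons of carbon extracted — is the same one governments already verify for royalty and tax purposes.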
You’re saying that fossil fuel producers should be required to remove a ton of carbon from the atmosphere for every ton of carbon in the fuels they sell?
Yeah, and maybe I should be more specific. They should pay for an equal amount of carbon to be removed from the atmosphere. In no way are we implying that a fossil carbon producer needs to also be doing the sequestration themselves.
What would be the biggest challenges of implementing something like this?
The ultimate challenge is convincing people that we need to be managing carbon and that this is a waste management type of system. Nobody really wants to pay for waste management, and so it needs to be regulated and demanded by some authority.
What about the fact that we don’t really have the ability to remove carbon or store carbon at scale today, and may not for some time?
Yes, we need to build capacity so that eventually we can match the carbon production to the carbon removal, which is why we also proposed that the liability needs to start today, not in the future. That liability is as good as a credit card debt — you actually have to pay it. It can be paid little by little every year, but the liability is here now, and not in the future.
The risk in the system that I’m describing, or even the system that is currently being deployed, is that you have counterproductive technologies that are being developed. And by counterproductive, I mean [carbon removal] technologies that are producing more emissions than they are storing, and so they’re net-positive. You can create a technology that has no intention of removing more carbon than it emits. The intention is just to earn money.
Do you mean, like, the things that are supposed to be removing carbon from the atmosphere and sequestering it, they are using fossil fuels to do that, and end up releasing more carbon in the process?
Yeah, so basically, what we show in the paper is that when we get to full carbon neutrality, the market forces alone will eliminate those kinds of technologies that are counterproductive. The problem is during the transition, these technologies can be economically viable because they are cheaper than they would be if 100% of the fossil fuel they used was carbon neutral through carbon removal. And so in order to prevent those technologies from gaming the system, we need a way to artificially make the price of fossil carbon as expensive as it would be if 100% of that fossil carbon was covered by carbon removal.
That’s where the idea of permits comes in. For every amount that I produce, I now have an instant liability, which is a permit. Each of those permits has to be matched by carbon removal. And since we don’t have enough carbon removal, we have futures and these futures represent the promise of actually doing carbon removal.
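The permit-and-futures mechanism described above can be sketched as a simple ledger. This is a hypothetical illustration of the idea, not an implementation from the paper — the class and method names are my own:

```python
# Illustrative sketch of the permit-and-futures idea (hypothetical).
# Each ton produced issues a permit, an instant liability. Permits are
# covered either by delivered carbon removal or, while removal capacity
# is still scarce, by futures that promise removal later.

from dataclasses import dataclass

@dataclass
class ProducerLedger:
    open_permits: float = 0.0  # tons produced, not yet matched
    futures: float = 0.0       # promised but undelivered removal

    def produce(self, tons: float) -> None:
        self.open_permits += tons  # liability attaches at the point of origin

    def buy_futures(self, tons: float) -> None:
        # A future covers a permit now but must still be settled by
        # real carbon removal later.
        covered = min(tons, self.open_permits)
        self.open_permits -= covered
        self.futures += covered

    def deliver_removal(self, tons: float) -> None:
        # Delivered removal settles outstanding futures first, then
        # retires any remaining open permits.
        settled = min(tons, self.futures)
        self.futures -= settled
        self.open_permits = max(self.open_permits - (tons - settled), 0.0)

ledger = ProducerLedger()
ledger.produce(100)         # 100 permits outstanding
ledger.buy_futures(60)      # 40 open permits, 60 tons promised
ledger.deliver_removal(60)  # futures settled; 40 permits still open
print(ledger.open_permits, ledger.futures)  # 40.0 0.0
```

The key property is that the liability exists from the moment of production: futures defer delivery, not responsibility.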
What if we burn through the remaining carbon budget and we still don’t have the capacity to sequester enough carbon?
Well, then we’re going into very uncharted territory. Right now we’re just mindlessly going through this thinking that if we just reduce emissions it will be good. It won’t be good.
In the paper, you also argue against mitigating greenhouse gases other than carbon, and that seems pretty controversial to me. Why is that?
We’re not arguing against mitigating, per se. We’re arguing against lumping everything under the same carbon accounting framework because lumping hides the difficulty in actually doing something about it. It’s not that we shouldn’t mitigate other greenhouse gases — we must. It’s just that if we separate the problem of carbon away from the problem of methane, away from the problem of nitrous oxide, or CFCs, we can tackle them more effectively. Because right now, we’re trying to do everything under the same umbrella, and that doesn’t work. We don’t tackle drinking and driving by sponsoring better tires. That’s just silly, right? We wouldn’t do that. We would tackle drinking and driving on its own, and then we would tackle better tires in a different policy.
So the argument is: Most of climate change is caused by carbon; let’s tackle that separately from the others and leave tackling methane and nitrous oxide to purposefully created programs to tackle those things. Let’s not lump the calculations all together, hiding all the differences and hiding meaningful action.
Is there still a role for life cycle analysis?
You don’t want to be regulating carbon using life cycle analysis. So you can use the life cycle analysis for qualitative purposes, but we’re pretending that it is a tool that can deliver accurate results, and it just doesn’t.
What has the response been like to this paper? What kind of feedback have you gotten?
Stunned silence!
Nobody has said anything?
In private, they have. Not in public. In private, it’s been a little bit like, “I’ve always thought this, but it seemed like there was no other way.” But then in public, think about it. Everything is built on LCA. It’s now in every single climate bill out there. Every single standard. Every single consulting company is doing LCA and doing carbon footprinting for companies. It’s a huge industry, so I guess I shouldn’t have been surprised to hear nothing publicly.
Yeah, I was gonna ask — I’ve been writing about the SEC rules and this idea that companies should start reporting their emissions to their investors, and that would all be based on LCA. There’s a lot of buy-in for that idea across the climate movement.
Yeah, but there’s definitely a fine line with make-believe. I think in many instances, we kid ourselves thinking that we’re going to have numbers that we can hang our hats on. In many instances we will not, and they will be challenged. And so at that point, what’s the point?
One thing I hear when I talk to people about this is, well, having an estimate is better than not having anything, or, don’t let the perfect be the enemy of the good, or, we can just keep working to make them better and better. Why not?
I mean, I wouldn’t say don’t try. But when it comes to actually enforcing anything, it’s going to be extremely hard to prove a number. You could just be stuck in litigation for a long time and still not have an answer.
I don’t know, to me it just seems like an endless debate while time is ticking and we will just feel good because we’ll have thought we measured everything. But we’re still not doing anything.
The proportion of voters who strongly oppose development grew by nearly 50%.
During his State of the Union address Tuesday night, President Donald Trump attempted to stanch the public’s bleeding support for building the data centers his administration says are necessary to beat China in the artificial intelligence race. With “many Americans” now “concerned that energy demand from AI data centers could unfairly drive up their electricity bills,” Trump said, he pledged to make major tech companies pay for new power plants to supply electricity to data centers.
New polling from energy intelligence platform Heatmap Pro shows just how dramatically and swiftly American voters are turning against data centers.
Earlier this month, the survey, conducted by Embold Research, reached out to 2,091 registered voters across the country, explaining that “data centers are facilities that house the servers that power the internet, apps, and artificial intelligence” and asking them, “Would you support or oppose a data center being built near where you live?” Just 28% said they would support or strongly support such a facility in their neighborhood, while 52% said they would oppose or strongly oppose it. That’s a net support of -24%.
When Heatmap Pro asked a national sample of voters the same question last fall, net support came out to +2%, with 44% in support and 42% opposed.
The steep drop highlights a phenomenon Heatmap’s Jael Holzman described last fall — that data centers are “swallowing American politics,” as she put it, uniting conservation-minded factions of the left with anti-renewables activists on the right in opposing a common enemy.
The results of this latest Heatmap Pro poll aren’t an outlier, either. Poll after poll shows surging public antipathy toward data centers as populists at both ends of the political spectrum stoke outrage over rising electricity prices and tech giants struggle to coalesce around a single explanation of their impacts on the grid.
“The hyperscalers have fumbled the comms game here,” Emmet Penney, an energy researcher and senior fellow at the right-leaning Foundation for American Innovation, told me.
A historian of the nuclear power sector, Penney sees parallels between the grassroots pushback to data centers and the 20th century movement to stymie construction of atomic power stations across the Western world. In both cases, opponents fixated on and popularized environmental criticisms that were ultimately deemed minor relative to the benefits of the technology — production of radioactive waste in the case of nuclear plants, and as seems increasingly clear, water usage in the case of data centers.
Likewise, opponents to nuclear power saw urgent efforts to build out the technology in the face of Cold War competition with the Soviet Union as more reason for skepticism about safety. Ditto the current rhetoric on China.
Penney said that both data centers and nuclear power stoke a “fear of bigness.”
“Data centers represent a loss of control over everyday life because artificial intelligence means change,” he said. “The same is true about nuclear,” which reached its peak of expansion right as electric appliances such as dishwashers and washing machines were revolutionizing domestic life in American households.
One of the more fascinating findings of the Heatmap Pro poll is a stark urban-rural divide within the Republican Party. Net support for data centers among GOP voters who live in suburbs or cities came out to -8%. Opposition among rural Republicans was twice as deep, at -20%. While rural Democrats and independents showed more skepticism of data centers than their urbanite fellow partisans, the gap was far smaller.
That could represent a challenge for the Trump administration.
“People in the city are used to a certain level of dynamism baked into their lives just by sheer population density,” Penney said. “If you’re in a rural place, any change stands out.”
Senator Bernie Sanders, the democratic socialist from Vermont, has championed legislation to place a temporary ban on new data centers. Such a move would not be without precedent; Ireland, transformed by tax-haven policies over the past two decades into a hub for Silicon Valley’s giants, only just ended its de facto three-year moratorium on hooking up data centers to the grid.
Senator Josh Hawley, the Missouri Republican firebrand, proposed his own bill that would force data centers off the grid by requiring the complexes to build their own power plants, much as Trump is now promoting.
On the opposite end of the spectrum, you have Republicans such as Mississippi Governor Tate Reeves, who on Tuesday compared halting construction of data centers to “civilizational suicide.”
“I am tempted to sit back and let other states fritter away the generational chance to build. To laugh at their short-sightedness,” he wrote in a post on X. “But the best path for all of us would be to see America dominate, because our foes are not like us. They don’t believe in order, except brutal order under their heels. They don’t believe in prosperity, except for that gained through fraud and plunder. They don’t think or act in a way I can respect as an American.”
Then you have the actual hyperscalers taking opposite tacks. Amazon Web Services, for example, is playing offense, promoting research that shows its data centers are not increasing electricity rates. Claude-maker Anthropic, meanwhile, issued a de facto mea culpa, pledging earlier this month to offset all its electricity use.
Amid that scattershot messaging, the critical rhetoric appears to be striking its targets. Whether Trump’s efforts to curb data centers’ impact on the grid or Reeves’ stirring call to patriotic sacrifice can reverse cratering support for the buildout remains to be seen. The clock is ticking. There are just 36 weeks until the midterm Election Day.
The public-private project aims to help realize the president’s goal of building 10 new reactors by 2030.
The Department of Energy and the Westinghouse Electric Company have begun meeting with utilities and nuclear developers as part of a new project aimed at spurring the country’s largest buildout of new nuclear power plants in more than 30 years, according to two people who have been briefed on the plans.
The discussions suggest that the Trump administration’s ambitious plans to build a fleet of new nuclear reactors are moving forward at least in part through the Energy Department. President Trump set a goal last year of placing 10 new reactors under construction nationwide by 2030.
The project aims to purchase the parts for 8 gigawatts to 10 gigawatts of new nuclear reactors, the people said. The reactors would almost certainly be AP1000s, a third-generation reactor produced by Westinghouse capable of producing up to 1.1 gigawatts of electricity per unit.
The AP1000 is the only third-generation reactor successfully deployed in the United States. Two AP1000 reactors were completed — and powered on — at Plant Vogtle in eastern Georgia earlier this decade. Fifteen other units are operating or under construction worldwide.
Representatives from Westinghouse and the Energy Department did not respond to requests for comment.
The project would use government and private financing to buy advanced reactor equipment that requires particularly long lead times, the people said. It would seek to lower the cost of the reactors by placing what would essentially be a single bulk order for some of their parts, allowing Westinghouse to invest in and scale its production efforts. It could also speed up construction timelines for the plants themselves.
The department is in talks with four to five potential partners, including utilities, independent power producers, and nuclear development companies, about joining the project. Under the plan, these utilities or developers would agree to purchase parts for two new reactors each. The program would be handled in part by the department’s in-house bank, the Loan Programs Office, which the Trump administration has dubbed the Office of Energy Dominance Financing.
This fleet-based approach to nuclear construction has succeeded in the past. After the oil crisis struck France in the 1970s, the national government responded by planning more than three dozen reactors in roughly a decade, allowing the country to build them quickly and at low cost. France still has some of the world’s lowest-carbon electricity.
By comparison, the United States has built three new nuclear reactors, totaling roughly 3.5 gigawatts of capacity, since the year 2000, and it has not significantly expanded its nuclear fleet since 1990. The Trump administration set a goal in May to quadruple total nuclear energy production — which stands at roughly 100 gigawatts today — to more than 400 gigawatts by the middle of the century.
The Trump administration and congressional Republicans have periodically announced plans to expand the nuclear fleet over the past year, although details on its projects have been scant.
Senator Dave McCormick, a Republican of Pennsylvania, announced at an energy summit last July that Westinghouse was moving forward with plans to build 10 new reactors nationwide by 2030.
In October, Commerce Secretary Howard Lutnick announced a new deal between the U.S. government, the private equity firm Brookfield Asset Management, and the uranium company Cameco to deploy $80 billion in new Westinghouse reactors across the United States. (A Brookfield subsidiary and Cameco have jointly owned Westinghouse since it went bankrupt in 2017 due to construction cost overruns.) Reuters reported last month that this deal aimed to satisfy the Trump administration’s 2030 goal.
While there have been other Republican attempts to expand the nuclear fleet over the years, rising electricity demand and the boom in artificial intelligence data centers have brought new focus to the issue. This time, Democratic politicians have announced their own plans to boost nuclear power in their states.
In January, New York Governor Kathy Hochul set a goal of building 4 gigawatts of new nuclear power plants in the Empire State.
In his State of the State address, Governor JB Pritzker of Illinois told lawmakers last week that he hopes to see at least 2 gigawatts of new nuclear power capacity operating in his state by 2033.
Meeting Trump’s nuclear ambitions has been a source of contention between federal agencies. Politico reported on Thursday that the Energy Department had spent months negotiating a nuclear strategy with Westinghouse last year when Lutnick inserted himself directly into negotiations with the company. Soon after, the Commerce Department issued an announcement for the $80 billion megadeal, which was big on hype but short on details.
The announcement threw a wrench in the Energy Department’s plans, but the agency now seems to have returned to the table. According to Politico, it is now also “engaging” with GE Hitachi, another provider of advanced nuclear reactors.
On nuclear tax credits, BLM controversy, and a fusion maverick’s fundraise
Current conditions: A third storm could dust New York City and the surrounding area with more snow • Floods and landslides have killed at least 25 people in Brazil’s southeastern state of Minas Gerais • A heat dome in Western Europe is pushing up temperatures in parts of Portugal, Spain, and France as high as 15 degrees Celsius above average.

The Department of Energy’s in-house lender, the Loan Programs Office — dubbed the Office of Energy Dominance Financing by the Trump administration — just gave out the largest loan in its history to Southern Company. The nearly $27 billion loan will “build or upgrade over 16 gigawatts of firm reliable power,” including 5 gigawatts of new gas generation, 6 gigawatts of uprates and license renewals for six different reactors, and more than 1,300 miles of transmission and grid enhancement projects. In total, the package will “deliver $7 billion in electricity cost savings” to millions of ratepayers in Georgia and Alabama by reducing the utility giant’s interest expenses by over $300 million per year. “These loans will not only lower energy costs but also create thousands of jobs and increase grid reliability for the people of Georgia and Alabama,” Secretary of Energy Chris Wright said in a statement.
Over in Utah, meanwhile, the state government is seeking the authority to speed up its own deployment of nuclear reactors as electricity demand surges in the desert state. In a letter to the Nuclear Regulatory Commission dated November 10 — but which E&E News published this week — Tim Davis, the executive director of Utah’s Department of Environmental Quality, requested that the federal agency consider granting the state the power to oversee uranium enrichment, microreactor licensing, fuel storage, and reprocessing on its own. All of those sectors fall under the NRC’s exclusive purview. At least one program at the NRC grants states limited regulatory primacy for some low-level radiological material. While there’s no precedent for a transfer of power as significant as what Utah is requesting, the current administration is upending norms at the NRC more than any other government since the agency’s founding in 1975.
Building a new nuclear plant on a previously undeveloped site is already a steep challenge in electricity markets such as New York, California, or the Midwest, which broke up monopoly utilities in the 1990s and created competitive auctions that make decade-long, multibillion-dollar reactors all but impossible to finance. A growing chorus argues, as Heatmap’s Matthew Zeitlin wrote, that these markets “are no longer working.” Even in markets with vertically integrated power companies, financing a greenfield plant is just as difficult, despite federal tax credits meant to spur construction of new reactors. That’s the conclusion of a new analysis by a trio of government finance researchers at the Center for Public Enterprise. The investment tax credit, “large as it is, cannot easily provide them with upfront construction-period support,” the report found. “The ITC is essential to nuclear project economics, but monetizing it during construction poses distinct challenges for nuclear developers that do not arise for renewable energy projects. Absent a public agency’s ability to leverage access to the elective payment of tax credits, it is challenging to see a path forward for attracting sufficient risk capital for a new nuclear project under the current circumstances.”
Steve Pearce, Trump’s pick to lead the Department of the Interior’s Bureau of Land Management, wavered when asked about his record of pushing to sell off federal lands during his nomination hearing Wednesday. A former Republican lawmaker from New Mexico, Pearce has faced what the public lands news site Public Domain called “broad backlash from environmental, conservation, and hunting groups for his record of working to undermine public land protections and push land sales as a way to reduce the federal deficit.” Faced with questions from Democratic senators, Pearce said, “I’m not so sure that I’ve changed,” but insisted he didn’t “believe that we’re going to go out and wholesale land from the federal government.” That has, however, been the plan since the start of the administration. As Heatmap’s Jeva Lange wrote last year, Republicans looked poised to use their trifecta to sell off some of the approximately 640 million acres of land the federal government owns.
At Tuesday’s State of the Union address, as I told you yesterday, Trump vowed to force major data center companies to build, bring, or buy their own power plants to keep the artificial intelligence boom from driving up electricity prices. On Wednesday, Fox News reported that Amazon, Google, Meta, Microsoft, xAI, Oracle, and OpenAI planned to come to the White House to sign onto the deal. The meeting is set to take place sometime next month. Data centers are facing growing backlash: developers abandoned at least 25 data centers last year amid mounting pushback from local opponents, Heatmap’s Robinson Meyer recently reported.
Shine Technologies is a rare fusion company that’s actually making money today. That’s because the Wisconsin-based firm uses its plasma beam fusion technology to produce isotopes for testing and medical therapies. Next, the company plans to start recycling nuclear waste for fresh reactor fuel. To get there, Shine Technologies has raised $240 million to fund its efforts for the next few years, as I reported this morning in an exclusive for Heatmap. Nearly 63% of the funding came from biotech billionaire Patrick Soon-Shiong, who will join the board. The capital will carry the company through the launch of the world’s largest medical isotope producer and lay the foundations of a new business recycling nuclear waste in the early 2030s that essentially just reorders its existing assembly line.
Vineyard Wind is nearly complete. As of Wednesday, 60 of the project’s 62 turbines have been installed off the coast of Massachusetts. Of those, E&E News reported, 52 have been cleared to start producing power. The developer Iberdrola said the final two turbines may be installed in the next few days. “For me, as an engineer, the farm is already completed,” Iberdrola’s executive chair, Ignacio Sánchez Galán, told analysts on an earnings call. “I think these numbers mean the level of availability is similar for other offshore wind farms we have in operation. So for me, that is completed.”