Life cycle analysis has some problems.

About six months ago, a climate scientist from Arizona State University, Stephanie Arcusa, emailed me a provocative new paper she had published that warned against our growing reliance on life cycle analysis. This practice of measuring all of the emissions related to a given product or service throughout every phase of its life — from the time raw materials are extracted to eventual disposal — was going to hinder our ability to achieve net-zero emissions, she wrote. It was a busy time, and I let the message drift to the bottom of my inbox. But I couldn’t stop thinking about it.
Life cycle analysis permeates the climate economy. Businesses rely on it to understand their emissions so they can work toward reducing them. The Securities and Exchange Commission’s climate risk disclosure rule, which requires companies to report their emissions to investors, hinges on it. The clean hydrogen tax credit requires hydrogen producers to do a version of life cycle analysis to prove their eligibility. It is central to carbon markets, and carbon removal companies are now developing standards based on life cycle analysis to “certify” their services as carbon offset developers did before them.
At the same time, many of the fiercest debates in climate policy are really debates about life cycle analysis. Should companies be held responsible for emissions that are only indirectly related to their businesses, and if so, which ones? Are carbon offsets a sham? Does using corn ethanol as a gasoline substitute reduce emissions or increase them? Scientists have repeatedly reached opposite conclusions on that last one, depending on how they accounted for the land required to grow corn and what it might have been used for had ethanol not been an option. Though the debate plays out in calculations, it’s really a philosophical brawl.
Everybody, for the most part, knows that life cycle analysis is difficult and thorny and imprecise. But over and over, experts and critics alike assert that it can be improved. Arcusa disagrees. Life cycle analysis, she says, is fundamentally broken. “It’s a problematic and uncomfortable conclusion to arrive at,” Arcusa wrote in her email. “On the one hand, it has been the only tool we have had to make any progress on climate. On the other, carbon accounting is captured by academia and vested interests and will jeopardize global climate goals.”
When I recently revisited the paper, I learned that Arcusa and her co-authors didn’t just critique life cycle analysis; they proposed a bold alternative. Their idea is not economically or politically easy, but it also doesn’t suffer from the problems of trying to track carbon throughout the supply chain. I called her up to talk through it. Our conversation has been edited for clarity.
Can you walk me through what the biggest issues with life cycle analysis are?
So, life cycle analysis is a qualitative tool —
It seems kind of counterintuitive or even controversial to call it a qualitative tool because it’s specifically trying to quantify something.
I think the best analogy for LCA is that it’s a back-of-the-envelope tool. If you really could measure everything, then sure, LCA is this wonderful idea. The problem is in the practicality of being able to collect all of that data. We can’t, and that leads us to use emissions factors and average numbers, and we model this and we model that, and we get so far away from reality that we actually can’t tell if something is positive or negative in the end.
The other problem is that it’s almost entirely subjective, which makes one LCA incomparable to another LCA depending on the context, depending on the technology. And yes, there are some standardization efforts that have been going on for decades. But if you have a ruler, no matter how much you try, it’s not going to become a screwdriver. We’re trying to use this tool to quantify things and make them the same for comparison, and we can’t because of that subjectivity.
In this space where there is a lot of money to be made, it’s very easy to manipulate things one way or another to make it look a little bit better because the method is not robust. That’s really the gist of the problems here.
One of the things you talk about in the paper is the way life cycle analysis is subject to different worldviews. Can you explain that?
It’s mostly seen in what to include or exclude in the LCA — it can have enormous impacts on the results. I think corn ethanol is the perfect example of how tedious this can be because we still don’t have an answer, precisely for that reason. The uncertainty range of the results has shrunk and gotten bigger and shrunk and gotten bigger, and it’s like, well, we still don’t know. And now, this exact same worldview debate is playing into what should be included and not included in certification for things [like carbon removal] that are going to be sold under the guise of climate action, and that just can’t be. We’ll be forever debating whether something is true.
Is this one of those things that scientists have been debating forever, or is this argument that we should stop using life cycle analysis more of a fringe idea?
I guess I would call it a fringe idea today. There’s been plenty of criticism throughout the years, even from the very beginning when it was first created. What I have seen is that there is criticism, and then there is, “But here’s how we can solve it and continue using LCA!” I’ve only come across one other publication that specifically said, “This is not working. This is not the right tool,” and that’s from Michael Gillenwater. He’s at the Greenhouse Gas Management Institute. He was like, “What are we doing?” There might be other folks, I just haven’t come across them.
Okay, so what is the alternative to LCA that you’ve proposed in this paper?
LCA targets the middle of the supply chain and tries to attribute responsibility there. But if you think about where on the supply chain the carbon is best known, it is actually at the source, at the point of origin, before it becomes an emission. The point where it is extracted from the ground is where we know how much carbon there is. If we focus on that source through a policy that requires mandatory sequestration — for every ton of carbon produced, a ton of carbon is put away through carbon removal, and the accounting happens there, before it is sold to anybody — then anybody downstream of that supply chain is already carbon neutral. There is no need to track carbon all the way down to the consumer.
We know this is accurate because that is where governments already collect royalties and taxes — they want to know exactly how much is being sold. So we already do this. The big difference is that the policy would be required there instead of taxing everybody downstream.
You’re saying that fossil fuel producers should be required to remove a ton of carbon from the atmosphere for every ton of carbon in the fuels they sell?
Yeah, and maybe I should be more specific. They should pay for an equal amount of carbon to be removed from the atmosphere. In no way are we implying that a fossil carbon producer needs to also be doing the sequestration themselves.
What would be the biggest challenges of implementing something like this?
The ultimate challenge is convincing people that we need to be managing carbon and that this is a waste management type of system. Nobody really wants to pay for waste management, and so it needs to be regulated and demanded by some authority.
What about the fact that we don’t really have the ability to remove carbon or store carbon at scale today, and may not for some time?
Yes, we need to build capacity so that eventually we can match the carbon production to the carbon removal, which is why we also proposed that the liability needs to start today, not in the future. That liability is as good as a credit card debt — you actually have to pay it. It can be paid little by little every year, but the liability is here now, and not in the future.
The risk in the system that I’m describing, or even the system that is currently being deployed, is that you have counterproductive technologies being developed. And by counterproductive, I mean [carbon removal] technologies that are producing more emissions than they are storing, so they’re net-positive. You can create a technology that has no intention of removing more carbon than it emits. The intention is just to earn money.
Do you mean, like, the things that are supposed to be removing carbon from the atmosphere and sequestering it, they are using fossil fuels to do that, and end up releasing more carbon in the process?
Yeah, so basically, what we show in the paper is that when we get to full carbon neutrality, market forces alone will eliminate those kinds of counterproductive technologies. The problem is that during the transition, these technologies can be economically viable, because they are cheaper than they would be if 100% of the fossil fuel they used were made carbon neutral through carbon removal. And so in order to prevent those technologies from gaming the system, we need a way to artificially make the price of fossil carbon as expensive as it would be if 100% of that fossil carbon were covered by carbon removal.
That’s where the idea of permits comes in. For every amount that I produce, I now have an instant liability, which is a permit. Each of those permits has to be matched by carbon removal. And since we don’t have enough carbon removal, we have futures, and these futures represent the promise of actually doing carbon removal.
What if we burn through the remaining carbon budget and we still don’t have the capacity to sequester enough carbon?
Well, then we’re going into very uncharted territory. Right now we’re just mindlessly going through this, thinking that if we just reduce emissions it will be good. It won’t be good.
In the paper, you also argue against mitigating greenhouse gases other than carbon, and that seems pretty controversial to me. Why is that?
We’re not arguing against mitigating, per se. We’re arguing against lumping everything under the same carbon accounting framework because lumping hides the difficulty in actually doing something about it. It’s not that we shouldn’t mitigate other greenhouse gases — we must. It’s just that if we separate the problem of carbon away from the problem of methane, away from the problem of nitrous oxide, or CFCs, we can tackle them more effectively. Because right now, we’re trying to do everything under the same umbrella, and that doesn’t work. We don’t tackle drinking and driving by sponsoring better tires. That’s just silly, right? We wouldn’t do that. We would tackle drinking and driving on its own, and then we would tackle better tires in a different policy.
So the argument is: Most of climate change is caused by carbon; let’s tackle that separately from the other gases, and leave methane and nitrous oxide to programs purpose-built to address them. Let’s not lump the calculations all together, hiding all the differences and hiding meaningful action.
Is there still a role for life cycle analysis?
You don’t want to be regulating carbon using life cycle analysis. You can still use life cycle analysis for qualitative purposes, but we’re pretending that it is a tool that can deliver accurate results, and it just doesn’t.
What has the response been like to this paper? What kind of feedback have you gotten?
Stunned silence!
Nobody has said anything?
In private, they have. Not in public. In private, it’s been a little bit like, “I’ve always thought this, but it seemed like there was no other way.” But then in public, think about it. Everything is built on LCA. It’s now in every single climate bill out there. Every single standard. Every single consulting company is doing LCA and doing carbon footprinting for companies. It’s a huge industry, so I guess I shouldn’t have been surprised to hear nothing publicly.
Yeah, I was gonna ask — I’ve been writing about the SEC rules and this idea that companies should start reporting their emissions to their investors, and that would all be based on LCA. There’s a lot of buy-in for that idea across the climate movement.
Yeah, but there’s definitely a fine line with make-believe. I think in many instances, we kid ourselves thinking that we’re going to have numbers that we can hang our hats on. In many instances we will not, and they will be challenged. And so at that point, what’s the point?
One thing I hear when I talk to people about this is, well, having an estimate is better than not having anything, or, don’t let the perfect be the enemy of the good, or, we can just keep working to make them better and better. Why not?
I mean, I wouldn’t say don’t try. But when it comes to actually enforcing anything, it’s going to be extremely hard to prove a number. You could just be stuck in litigation for a long time and still not have an answer.
I don’t know, to me it just seems like an endless debate while time is ticking and we will just feel good because we’ll have thought we measured everything. But we’re still not doing anything.
The fourth-generation gas-cooled reactor company ZettaJoule is setting up shop at an as-yet-unnamed Texas university.
The appeal of next-generation nuclear technology is simple. Unlike the vast majority of existing reactors, which use water as a coolant, so-called fourth-generation units use coolants that can withstand intense heat, such as molten salt, liquid metal, or gases like helium. That allows the machines to reach and maintain the high temperatures necessary to decarbonize industrial processes, temperatures that currently only fossil fuels can deliver.
But the execution requirements of these advanced reactors are complex, making skepticism easy to understand. While the U.S., Germany, and other countries experimented with fourth-generation reactors in earlier decades, there is only one commercial unit in operation today. That’s in China, arguably the leader in advanced nuclear, which hooked up a demonstration model of a high-temperature gas-cooled reactor to its grid two years ago, and just approved building another project in September.
Then there’s Japan, which has been operating its own high-temperature gas-cooled reactor for 27 years at a government research site in Ibaraki Prefecture, about 90 minutes north of Tokyo by train. Unlike China’s design, it’s not a commercial power reactor. Also unlike China’s design, it’s coming to America.
Heatmap has learned that ZettaJoule, an American-Japanese startup led by engineers who worked on that reactor, is now coming out of stealth and laying plans to build its first plant in Texas.
For months, the company has quietly staffed up its team of American and Japanese executives, including a former U.S. Nuclear Regulatory Commission official and a high-ranking ex-administrator from the industrial giant Mitsubishi. It’s now preparing to decamp from its initial home base in Rockville, Maryland, to the Lone Star State, where it will announce its debut project at an as-yet-unnamed university in Texas.
“We haven’t built a nuclear reactor in many, many decades, so you have only a handful of people who experienced the full cycle from design to operations,” Mitsuo Shimofuji, ZettaJoule’s chief executive, told me. “We need to complete this before they retire.”
That’s where the company sees its advantage over rivals in the race to build the West’s first commercial high-temperature gas reactor, such as Amazon-backed X-energy or Canada’s StarCore Nuclear. ZettaJoule’s chief nuclear officer, Kazuhiko Kunitomi, oversaw the construction of Japan’s research reactor in the 1990s. He’s considered Japan’s leading expert in high-temperature gas reactors.
“Our chief nuclear officer and some of our engineers are the only people in the Western world who have experience of the whole cycle from design to construction to operation of a high temperature gas reactor,” Shimofuji said.
Like X-energy’s reactor, ZettaJoule’s design is a small modular reactor. With a capacity of 30 megawatts of thermal output and 12 megawatts of electricity, the ZettaJoule reactor qualifies as a microreactor, a subcategory of SMR that includes anything producing 20 megawatts of electricity or less. Both companies’ reactors will also run on TRISO, a specialized form of enriched uranium fuel in which each particle is sealed in protective coatings, making the fuel safer and more efficient at higher temperatures.
While X-energy’s debut project that Amazon is financing in Washington State is a nearly 1-gigawatt power station made up of at least a dozen of the American startup’s 80-megawatt reactors, ZettaJoule isn’t looking to generate electricity.
ZettaJoule’s first reactor in Texas will be a research reactor, but the company’s focus is on producing heat. The reactor already operating in Japan demonstrates that the design can reach 950 degrees Celsius, roughly 25% higher than the operating temperature of China’s reactor.
The potential for use in industrial applications has begun to attract corporate partners. In a letter sent Monday to Ted Garrish, the U.S. assistant secretary of energy in charge of nuclear power — a copy of which I obtained — the U.S. subsidiary of the Saudi Arabian oil goliath Aramco urged the Trump administration to support ZettaJoule, and said that it would “consider their application to our operations” as the technology matures. ZettaJoule is in talks with at least two other multinational corporations.
The first new reactor ZettaJoule builds won’t be identical to the unit in Japan, Shimofuji said.
“We are going to modernize this reactor together with the Japanese and U.S. engineering partners,” he said. “The research reactor is robust and solid, but it’s over-engineered. What we want to do is use the safety basis but to make it more economic and competitive.”
Once ZettaJoule proves its ability to build and operate a new unit in Texas, the company will start exporting the technology back to Japan. The microreactor will be its first product line.
“But in the future, we can scale up to 20 times bigger,” Shimofuji said. “We can do 600 megawatts thermal and 300 megawatts electric.”
Another benefit ZettaJoule can tap into is the sweeping deal President Donald Trump brokered with Japanese Prime Minister Sanae Takaichi in October, which included hundreds of billions of dollars for new reactors of varying sizes, including the large-scale Westinghouse AP1000. That included financing to build GE Vernova Hitachi Nuclear Energy’s 300-megawatt BWRX-300, one of the West’s leading third-generation SMRs, which uses a traditional water-cooled design.
Unlike that unit, however, ZettaJoule’s microreactor is not a first-of-a-kind technology, said Chris Gadomski, the lead nuclear analyst at the consultancy BloombergNEF.
“It’s operated in Japan for a long, long time,” he told me. “So that second-of-a-kind is an attractive feature. Some of these companies have never operated a reactor. This one has done that.”
A similar dynamic almost played out with large-scale reactors more than two decades ago. Beginning in the late 1990s, Japanese developers built four of GE and Hitachi’s ABWR reactors, a large-scale design with some of the key safety features that make the AP1000 stand out compared to its first- and second-generation predecessors. In the mid-2000s, the U.S. certified the design and planned to build a pair in South Texas. But the project never materialized, and America instead put its resources into Westinghouse’s design.
But the market is different today. Electricity demand is surging in the near term from data centers and in the long term from electrification of cars and industry. The need to curb fossil fuel consumption in the face of worsening climate change is more widely accepted than ever. And China’s growing dominance over nuclear energy has rattled officials from Tokyo to Washington.
“We need to deploy this as soon as possible to not lose the experienced people in Japan and the U.S.,” Shimofuji said. “In two or three years time, we will get a construction permit ideally. We are targeting the early 2030s.”
If every company publicly holding itself to that timeline is successful, the nuclear industry will be a crowded field. But as history shows, those with the experience to actually take a reactor from paper to concrete may have an advantage.
It’s now clear that 2026 will be big for American energy, but it’s going to be incredibly tense.
Over the past 365 days, we at The Fight have closely monitored numerous conflicts over siting and permitting for renewable energy and battery storage projects. As we’ve done so, the data center boom has come into full view, igniting a tinderbox of resentment over land use, local governance and, well, lots more. The future of the U.S. economy and the energy grid may well ride on the outcomes of the very same city council and board of commissioners meetings I’ve been reporting on every day. It’s a scary yet exciting prospect.
To bring us into the new year, I wanted to try something a little different. Readers ask me all the time for advice, posing questions like, What should I be thinking about right now? And, How do I get this community to support my project? Or my favorite: When will people finally just shut up and let us build things? To try to answer these questions and more, I wanted to give you the top five trends in energy development (and data centers) I’ll be watching next year.
The best thing going for American renewable energy right now is the AI data center boom. But the backlash against developing these projects is spreading incredibly fast.
Do you remember last week when I told you about a national environmental group calling for data center moratoria across the country? On Wednesday, Senator Bernie Sanders called for a nationwide halt to data center construction until regulations are put in place. The next day, the Working Families Party – a progressive third party that fields candidates all over the country for all levels of government – called for its candidates to run in opposition to new data center construction.
On the other end of the political spectrum, major figures in the American right wing have become AI skeptics critical of the nascent data center buildout, including Florida Governor Ron DeSantis, Missouri Senator Josh Hawley, and former Trump adviser Steve Bannon. These figures are clearly following the signals amidst the noise; I have watched in recent months as anti-data center fervor has spread across Facebook, with local community pages and groups once focused on solar and wind projects pivoting instead to focus on data centers in development near them.
In other words, as I predicted just one month ago, an anti-data center political movement is forming across the country and quickly gaining steam (ironically aided by the internet and algorithms powered by server farms).
I often hear from the clean energy sector that the data center boom will be a boon for new projects. Renewable energy is the fastest to scale and construct, the thinking goes, and therefore will be the quickest, easiest, and most cost effective way to meet the projected spike in energy demand.
I’m not convinced yet that this line of thinking is correct. But I’m definitely sure that no matter the fuel type, we can expect a lot more transmission development, and nothing sparks a land use fight more easily than new wires.
Past is prologue here. One need look no further than the years-long fight over the Piedmont Reliability Project, a proposed line that would connect a nuclear power plant in Pennsylvania to data centers in Virginia by crossing a large swath of Maryland agricultural land. I’ve been covering it closely since we put the project on our inaugural list of the most at-risk projects, and the conflict is now a clear blueprint.
In Wisconsin, a billion-dollar transmission project is proving this thesis true. I highly recommend readers pay close attention to Port Washington, where the release of fresh transmission line routes for a massive new data center this week has aided an effort to recall the city’s mayor for supporting the project. And this isn’t even an interstate project like Piedmont.
While I may not be sure of the renewable energy sector’s longer-term benefits from data center development, I’m far more confident that this Big Tech land use backlash is hitting projects right now.
The short-term issue for renewables developers is that opponents of data centers use arguments and tactics similar to those deployed by anti-solar and anti-wind advocates. Everyone fighting data centers is talking about ending development on farmland, avoiding changes to property values, stopping excess noise and water use, and halting irreparable changes to their ways of life.
Only one factor distinguishes data center fights from renewable energy fights: building the former potentially raises energy bills, while the latter will lower energy costs.
I do fear that as data center fights intensify nationwide, communities will not ban or hyper-regulate the server farms in particular, but rather will pass general bans that also block the energy projects that could potentially power them. Rural counties are already enacting moratoria on solar and wind in tandem with data centers – this is not new. But the problem will worsen as conflicts spread, and it will be incumbent upon the myriad environmentalists boosting data center opponents to not accidentally aid those fighting zero-carbon energy.
This week, the Bureau of Land Management approved its first solar project in months: the Libra facility in Nevada. When this happened, I received a flood of enthusiastic and optimistic emails and texts from sources.
We do not yet know whether the Libra approval is a signal of a thaw inside the Trump administration. The Interior Department’s freeze on renewables permitting decisions continues mostly unabated, and I have seen nothing to indicate that more decisions like this are coming down the pike. What we do know is that ahead of a difficult midterm election, the Trump administration faces outsized pressure to do more to address “affordability,” Democrats plan to go after Republicans for effectively repealing the Inflation Reduction Act and halting permits for solar and wind projects, and there’s a grand bargain to be made in Congress over permitting reform that rides on an end to the permitting freeze.
I anticipate that ahead of the election and further permitting talks in Congress, the Trump administration will mildly ease its chokehold on solar and wind permits because that is the most logical option in front of them. I do not think this will change the circumstances for more than a small handful of projects sited on federal lands that were already deep in the permitting process when Trump took power.
It’s impossible to conclude a conversation about next year’s project fights without ending on the theme that defined 2025: battery fire fears are ablaze, and they’ll only intensify as data centers demand excess energy storage capacity.
The Moss Landing fire in January was a defining moment for an energy sector struggling to grapple with the effects of the Internet age. Despite bearing little resemblance to the litany of battery energy storage system (BESS) proposals across the country, that one hunk of burning battery wreckage in California inspired countless communities nationwide to ban new battery storage outright.
There is no sign this trend will end any time soon. I expect data centers to only accelerate these concerns, as these facilities can also catch fire in ways that are challenging to address.
Plus a resolution for Vineyard Wind and more of the week’s big renewables fights.
1. Hopkins County, Texas – A Dallas-area data center fight pitting developer Vistra against Texas Attorney General Ken Paxton has exploded into a full-blown political controversy, as the power company now argues the project’s developer had an improper romance with a city official in the host community.
2. La Plata County, Colorado – This county has just voted to extend its moratorium on battery energy storage facilities over fire fears.
3. Dane County, Wisconsin – The city of Madison appears poised to ban data centers for at least a year.
4. Goodhue County, Minnesota – The Minnesota Center for Environmental Advocacy, a large environmentalist organization in the state, is suing to block a data center project in the small city of Pine Island.
5. Hall County, Georgia – A data center has been stopped down South, at least for now.
6. Dukes County, Massachusetts – The fight between Vineyard Wind and the town of Nantucket seems to be over.