It took the market about a week to catch up to the fact that the Chinese artificial intelligence firm DeepSeek had released an open-source AI model that rivaled those from prominent U.S. companies such as OpenAI and Anthropic — and that, most importantly, it had managed to do so far more cheaply and efficiently than its American competitors. The news cratered not only tech stocks such as Nvidia but energy stocks as well, suggesting that investors expected more energy-efficient AI to reduce the sector’s energy demand overall.
But will it really? While some in the climate world assumed the same and celebrated the seemingly good news, many venture capitalists, AI proponents, and analysts quickly arrived at essentially the opposite conclusion — that cheaper AI will only lead to greater demand for AI. The resulting unfettered proliferation of the technology across a wide array of industries could negate the efficiency gains, ultimately producing a substantial net increase in data center power demand.
“With cost destruction comes proliferation,” Susan Su, a climate investor at the venture capital firm Toba Capital, told me. “Plus the fact that it’s open source, I think, is a really, really big deal. It puts the power to expand and to deploy and to proliferate into billions of hands.”
If you’ve seen lots of chitchat about the Jevons paradox of late, that’s basically what this line of thinking boils down to. After Microsoft CEO Satya Nadella responded to DeepSeek mania by posting the Wikipedia page for this 19th-century economic theory on X, many (myself included) got a quick crash course on its origins. The idea is that as technical efficiencies of the Victorian era made burning coal cheaper, demand for — and thus consumption of — coal actually increased.
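To make the logic concrete, here is a minimal arithmetic sketch of the paradox with made-up numbers (none of these figures come from the article): if efficiency improves fourfold but cheaper AI spurs tenfold more usage, total energy consumption still rises.

```python
# Illustrative sketch of the Jevons paradox with hypothetical numbers.
# A 4x efficiency gain is swamped by a 10x jump in usage.

def total_energy(tasks: float, energy_per_task: float) -> float:
    """Total energy consumed = number of AI tasks run x energy used per task."""
    return tasks * energy_per_task

baseline = total_energy(tasks=1_000_000, energy_per_task=1.0)     # arbitrary units
after = total_energy(tasks=10_000_000, energy_per_task=0.25)      # 4x more efficient, 10x more demand

print(f"Baseline consumption: {baseline:,.0f}")
print(f"Post-efficiency:      {after:,.0f}")  # 2.5x the baseline despite the efficiency gain
```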
While this is a distinct possibility in the AI space, it’s by no means a guarantee. “This is very much, I think, an open question,” energy expert Nat Bullard told me, regarding whether DeepSeek-type models will spur a reduction or an increase in energy demand. “I sort of lean in both directions at once.” Formerly the chief content officer at BloombergNEF and now a co-founder of the AI startup Halcyon, a search and information platform for energy professionals, Bullard is personally excited about the greater efficiencies and optionality that new AI models can bring to his business.
But he warns that just because DeepSeek was cheap to train — the company claims it cost about $5.5 million, while leading U.S. models cost hundreds of millions or even billions of dollars — doesn’t mean that it’s cheap or energy-efficient to operate. “Training more efficiently does not necessarily mean that you can run it that much more efficiently,” Bullard told me. When a large language model answers a question or provides any type of output, it’s said to be making an “inference.” And as Bullard explains, “That may mean, as we move into an era of more and more inference and not just training, then the [energy] impacts could be rather muted.”
DeepSeek-R1, the model that caused the investor freakout, is also a newer type of LLM that uses more energy in general. Up until literally a few days ago, when OpenAI released o3-mini for free, most casual users were probably interacting with so-called “pretrained” AI models. Fed on gobs of internet text, these LLMs spit out answers based primarily on prediction and pattern recognition. DeepSeek released a model like this, called V3, late last year. But last year, more advanced “reasoning” models, which can “think,” in some sense, started blowing up. These models — which include o3-mini, the latest version of Anthropic’s Claude, and the now infamous DeepSeek-R1 — have the ability to try out different strategies to arrive at the correct answer, recognize their mistakes, and improve their outputs, allowing for significant advancements in areas such as math and coding.
But all that artificial reasoning eats up a lot of energy. As Sasha Luccioni, the AI and climate lead at Hugging Face, which makes an open-source platform for AI projects, wrote on LinkedIn, “To set things clear about DeepSeek + sustainability: (it seems that) training is much shorter/cheaper/more efficient than traditional LLMs, *but* inference is longer/more expensive/less efficient because of the chain of thought aspect.” Chain of thought refers to the reasoning process these newer models undertake. Luccioni wrote that she’s currently working to evaluate the energy efficiency of both the DeepSeek V3 and R1 models.
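Luccioni’s point lends itself to rough arithmetic: inference energy scales with the number of tokens a model generates, and chain-of-thought models generate far more tokens per answer. Here is a minimal sketch with purely hypothetical per-token figures, not measurements from her evaluation:

```python
# Hypothetical illustration: inference energy roughly scales with tokens generated,
# and a chain-of-thought model emits many more tokens before settling on an answer.

energy_per_token_wh = 0.003        # assumed energy per generated token, for illustration only
direct_answer_tokens = 300         # a typical pretrained-model response
reasoning_answer_tokens = 3_000    # the same response preceded by a long chain of thought

print(f"Pretrained model: {direct_answer_tokens * energy_per_token_wh:.1f} Wh per answer")
print(f"Reasoning model:  {reasoning_answer_tokens * energy_per_token_wh:.1f} Wh per answer")
```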
Another factor that could influence energy demand is how fast domestic companies respond to the DeepSeek breakthrough with their own new and improved models. Amy Francetic, co-founder at Buoyant Ventures, doesn’t think we’ll have to wait long. “One effect of DeepSeek is that it will highly motivate all of the large LLMs in the U.S. to go faster,” she told me. And because a lot of the big players are fundamentally constrained by energy availability, she’s crossing her fingers that this means they’ll work smarter, not harder. “Hopefully it causes them to find these similar efficiencies rather than just, you know, pouring more gasoline into a less fuel-efficient vehicle.”
In her recent Substack post, Su described three possible futures when it comes to AI’s role in the clean energy transition. The ideal is that AI demand scales slowly enough that nuclear and renewables scale with it. The least hopeful is that immediate, exponential growth in AI demand leads to a similar expansion of fossil fuels, locking in new dirty infrastructure for decades. “I think that's already been happening,” Su told me. And then there’s the techno-optimist scenario, linked to figures like Sam Altman, which Su doesn’t put much stock in — that AI “drives the energy revolution” by helping to create new energy technologies and efficiencies that more than offset the attendant increase in energy demand.
Which scenario predominates could also depend upon whether greater efficiencies, combined with the adoption of AI by smaller, more shallow-pocketed companies, lead to a change in the scale of data centers. “There’s going to be a lot more people using AI. So maybe that means we don’t need these huge, gigawatt data centers. Maybe we need a lot more smaller, megawatt-size data centers,” Laura Katzman, a principal at Buoyant Ventures, told me. Katzman has conducted research for the firm on data center decarbonization.
Smaller data centers with a correspondingly smaller energy footprint could pair well with renewable-powered microgrids, which are less practical and economical for hyperscalers. That could be a big win for solar and wind plus battery storage, Katzman explained, but a blow to companies such as Microsoft, which has famously signed a deal to restart Pennsylvania’s Three Mile Island nuclear plant to power its data centers. “Because of DeepSeek, the expected price of compute probably doesn’t justify now turning back on some of these nuclear plants, or these other high-cost energy sources,” Katzman told me.
Lastly, it remains to be seen what nascent applications cheaper models will open up. “If somebody, say, in the Philippines or Vietnam has an interest in applying this to their own decarbonization challenge, what would they come up with?” Bullard pondered. “I don’t yet know what people would do with greater capability and lower costs and a different set of problems to solve for. And that’s really exciting to me.”
But even if the AI pessimists are right, and these newer models don’t make AI ubiquitously useful for applications from new drug discovery to easier regulatory filing, Su told me that in a certain sense, it doesn't matter much. “If there was a possibility that somebody had this type of power, and you could have it too, would you sit on the couch? Or would you arms race them? I think that is going to drive energy demand, irrespective of end utility.”
As Su told me, “I do not think there’s actually a saturation point for this.”
Ambient Carbon is doing the methane equivalent of point source carbon capture in dairy barns.
In the world of climate and energy, “emissions” is often shorthand for carbon dioxide, the most abundant anthropogenic greenhouse gas in the world. Similarly, talk of emissions capture and removal usually centers on the growing swath of technologies that either prevent CO2 from entering the atmosphere or pull it back out after the fact.
Discussions and frameworks for reducing methane, which is many times more potent than CO2 in the near term, have been far less common — but the potential impact could be huge.
“If you can accelerate the decrease of methane in the atmosphere, you actually could have a much more significant climate impact, much faster than with CO2,” Gabrielle Dreyfus, chief scientist at the Institute for Governance & Sustainable Development, told me. “People often talk about gigatons of CO2 removal. But because of the potency of methane, for a similar level of temperature impact, you’re talking about megatons.”
Over the past year or so, this conversation has finally started to gain traction. Last October, the National Academies of Sciences, Engineering, and Medicine released a report on atmospheric methane removal, recommending that the U.S. develop a research agenda for methane removal technologies and establish methodologies to assess their impacts. Dreyfus chaired the committee that authored the report.
And one startup, at least — Denmark-based Ambient Carbon — is trying to commercialize its methane-zapping tech. Last week, the company announced that it had successfully trialed its “methane eradication photochemical system” at a dairy barn in Denmark, eliminating the majority of methane from the barn’s air. It’s also aiming to deploy a prototype in the U.S., at a farm in Indiana, by year’s end.
The way the company’s process works is more akin to point source carbon capture, in which emissions are pulled from a smokestack, than it is to something like direct air capture, in which carbon dioxide is removed from ambient air. Inside a dairy barn, cows are continually belching methane, producing high concentrations of the gas that are typically vented into the atmosphere. Instead, Ambient Carbon captures this noxious air from the barn’s ventilation ducts and brings it into an enclosed reactor.
Inside the reactor, which uses electricity from the grid, UV light activates chlorine molecules, splitting their chemical bonds to form unstable radicals. These radicals then react with methane, breaking down the potent gas and converting it into CO2, water, and other byproducts. The whole process mimics the natural destruction of atmospheric methane, which would normally take a decade or more, while Ambient Carbon’s system does it in a matter of seconds. Much of the chlorine gets recycled back into the process, and the CO2 is released into the air.
That might sound less than ideal. Famously, carbon dioxide is bad. This molecule alone is responsible for two-thirds of all human-caused global warming. But because methane is over 80 times as potent as CO2 over a 20-year timeframe, and since it would eventually break down into carbon dioxide in the atmosphere anyway, accelerating that inevitable process turns out to be a net good for the climate.
“The amount of CO2 produced by methane when it oxidizes has about 50 times smaller climate effect than the methane that produced it,” Zeke Hausfather, a climate scientist and climate research lead at Stripe, told me. “So you get a 98% reduction in the warming effects by converting methane to CO2, which I think is a pretty good deal.”
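The back-of-envelope math behind that quote, taking Hausfather’s roughly 50-fold figure at face value (the mass ratio comes from basic chemistry; the climate-effect ratio is his estimate, not a calculation of ours):

```python
# Back-of-envelope check on the quote above, using Hausfather's ~50x figure as given.
# Oxidizing methane (CH4, molar mass 16) yields CO2 (molar mass 44), so each tonne of
# methane destroyed becomes about 2.75 tonnes of CO2 -- but with a far smaller climate effect.

co2_per_ch4_tonnes = 44.0 / 16.0   # tonnes of CO2 produced per tonne of CH4 oxidized
residual_effect = 1 / 50           # resulting CO2's climate effect relative to the CH4 destroyed

print(f"{co2_per_ch4_tonnes:.2f} tonnes of CO2 per tonne of CH4 destroyed")
print(f"~{1 - residual_effect:.0%} of the warming effect avoided by the conversion")  # ~98%
```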
As he sees it, preventing methane emissions in the first place or destroying the molecules before they’re released, as Ambient Carbon is doing, is far more impactful than pursuing after-the-fact atmospheric methane removal. Because while CO2 can linger in the air for centuries — making removal a necessity for near-term planetary cooling — when it comes to methane, “if you cut emissions, you cool the planet pretty quickly, because all that previous warming from methane goes away over the course of a decade or two.”
Agriculture represents 40% of global methane emissions, the largest single source, making the industry a ripe target for de-methane-ization. Ambient Carbon’s tech is only really effective when methane concentrations are relatively high, the company’s CEO, Matthew Johnson, told me — which still leaves a large addressable market given that in many parts of the world, cows are mostly kept in dairy barns, where methane accumulates.
In its trial, Ambient Carbon’s system eliminated up to 90% of dairy barn methane at concentrations ranging from 4.3 parts per million to 44 parts per million. But while the system can theoretically operate at the lower end of that range, Johnson told me it’s only truly energy efficient at 20 parts per million and above. “It’s a question of cost benefit, because we could remove 99% [of the methane from dairy barns] but if you do that, that marginal cost is more energy,” Johnson explained, telling me that the company’s system will likely aim to remove between 80% to 90% of barn methane.
One reason methane destruction and removal technology hasn’t gained much traction is that capturing methane — whether from the atmosphere, a smokestack, or a ventilation duct — is far more challenging than capturing CO2, given that it’s so much less prevalent in the atmosphere. Atmospheric methane is relatively diffuse, with an average concentration of just about 2 parts per million, compared with roughly 420 parts per million for CO2. “I heard the analogy used that if pulling carbon dioxide out of the atmosphere is finding a needle in a haystack, pulling methane out of the atmosphere is pulling dust off the needle in that haystack,” Dreyfus told me.
Because of methane’s relative chemical stability, removing it from the air also requires a strong oxidant, such as chlorine radicals, to break it down. CO2, on the other hand, can be separated from the air with sorbents or membranes, a technically simpler process.
Other nascent approaches to methane destruction and removal include introducing chlorine radicals into the open atmosphere and adding soil amendments to boost the effectiveness of natural methane sinks. Among these options, Ambient Carbon’s approach is the furthest along, most well-understood, and likely also lowest-risk. After its successful field trial, “there is not much uncertainty remaining about whether or not this does the claimed thing,” Sam Abernethy, a methane removal scientist at the nonprofit Spark Climate Solutions, told me. “The main questions remaining are whether they can be cost-effective at progressively lower concentrations, whether they can get more methane destroyed per energy input. And that’s something they’ve been improving every year since they started.”
Venture firms have yet to jump on board, though. Thus far, Ambient Carbon’s funding has come from agricultural partners such as Danone North America and Benton Group Dairies, which are working with the company to conduct its field trials. Additional collaboration and financial support come from organizations such as the Hofmansgave Foundation, a Danish philanthropic group, and Innovation Fund Denmark. Johnson told me the startup also has a number of unnamed angel investors.
Whether or not this tech could ever become efficient enough to tackle more dilute methane emissions — and thus make true atmospheric methane removal feasible — remains highly uncertain. Questions also remain about how these technologies, if proven to be workable, would ultimately be able to scale. For instance, would methane destruction and removal depend more on government policies and regulations, or on market-based incentives?
In the short term, voluntary corporate commitments appear to be the main drivers of interest when it comes to methane destruction specifically. “A lot of food companies have made public pledges that they’re going to reduce their greenhouse gas emissions,” Johnson told me. As he noted, ubiquitous brands such as Kraft Heinz, General Mills, Danone, and Starbucks have all joined the Dairy Methane Action Alliance, which aims to “accelerate action and ambition to drive down methane emissions across dairy supply chains,” according to its website.
The way Ambient Carbon envisions this market working, its food industry partners would be the ones to encourage farms to buy the startup’s methane-destroying units, and would pay farmers a premium for producing low-emissions products. This would enable farmers to cover the system’s cost within five years, and eventually generate additional revenue. Whether the food companies would pass the green premium onto consumers, however, remains to be seen.
But as with the carbon dioxide removal sector, voluntary corporate commitments and carbon crediting schemes will likely only go so far. “Most of what’s going to drive methane elimination is going to be policy,” Hausfather told me. Denmark, where Ambient Carbon conducted its first trial, is set to become the first country in the world to implement a tax on agricultural emissions, starting in 2030. Europe also has a comprehensive greenhouse gas reduction framework, as do states such as California, Washington, and New York.
“It’s such a low-hanging fruit of climate impacts that it’s hard to imagine it’s not going to be regulated pretty substantially in the future,” Hausfather told me. But stringent regulatory requirements are often shaped by the technologies that have been established as effective. And in that sense, what Ambient Carbon is doing today could help pave the way for the ambitious methane targets of tomorrow.
“Moving from a lot of the voluntary pledges that we have towards more mandatory requirements I think is going to have a really important role to play,” Dreyfus told me. “But I think it’s going to be easier if we have more proven technologies to get there.”
On tax credit deadlines, America’s nuclear export hopes, and data center flexibility
Current conditions: Hurricane Erin’s riptides continue lashing the Atlantic Coast, bringing 15-foot waves to the eastern end of New York’s Long Island • In Colorado, the Derby Fire tripled in size to more than 2,600 acres, prompting evacuations in the county north of the ski enclave of Aspen • Heavy rain in Sydney broke an 18-year record.
The Trump administration launched an investigation into imported wind turbines and parts, teeing up what Bloomberg called a “potential precursor to adding more tariffs on the clean-energy components.” The Department of Commerce started a national security probe on August 13 to determine whether the imports undermine domestic production and put the country at risk from foreign adversaries, according to a notice posted Thursday on the agency’s website. The agency already said this week that it would include wind turbines and related parts on the list of products facing 50% steel and aluminum tariffs. As of 2023, at least 41% of wind-related equipment imported into the U.S. came from Mexico, Canada, and China, according to figures Bloomberg cited from the consultancy Wood Mackenzie.
Also on Thursday, the Treasury Department published an FAQ document outlining the phaseout dates for eight key clean energy tax credits repealed under the One Big Beautiful Bill Act. The credits all deal with zero-carbon vehicles or energy improvements to homes and buildings.
As Heatmap’s Emily Pontecorvo and Robinson Meyer wrote when the first tranche of data on the programs came out around this time last year, millions of Americans had already taken advantage of at least one of the credits. But the uptake was largely concentrated among households earning $100,000 per year or more.
For years, Westinghouse has been locked in an intellectual property dispute with South Korea’s two state-owned nuclear companies, as the American atomic energy giant accused the Korea Electric Power Corporation and its subsidiary, Korea Hydro & Nuclear Power, of ripping off its reactor technology. This week, the companies brokered a settlement that would keep the Korean giants from bidding on projects in North America, Europe, Japan, the United Kingdom, and Ukraine, effectively eliminating what is arguably the United States’ most capable rival outside of Russia and China from the key markets Washington wants to dominate. That could spur a lot more bids for Westinghouse’s flagship gigawatt-sized AP1000 reactor, projects for which are already underway in Poland, Slovakia, and Ukraine. But KoreaPro reported on Thursday that South Korea is pushing back on a deal Seoul fears infringes on its sovereignty.
In Sweden, meanwhile, the U.S.-Japanese joint venture GE Vernova-Hitachi Nuclear Energy secured a new deal to build its 300-megawatt small modular reactor that the government in Stockholm explicitly pitched as a bid to strengthen its trans-Atlantic security ties. “This is the beginning of something bigger, in many ways,” Ebba Busch, Sweden’s deputy prime minister, wrote in a post on LinkedIn. “As in the NATO process, Sweden is part of a larger movement.”
The Department of Energy extended its emergency order directing the J.H. Campbell Generating Plant in Michigan to remain open past its planned retirement. Secretary of Energy Chris Wright initially ordered the 1,420-megawatt coal station to stay online three months past its May 31 shutdown date, citing risks of electricity shortages in the Midcontinent Independent System Operator, the electrical grid that runs from the Upper Midwest down to Louisiana. Starting Thursday, the latest order directs the plant’s owners to keep the station running through November 19. The consultancy Grid Strategies estimated last week that if the Trump administration expands the effort to cover all 54 aging fossil fuel plants slated for closure between now and 2028, the program will cost upward of $6 billion. Last week, the Federal Energy Regulatory Commission approved a framework for the utilities that own the affected plants to recoup the costs of operating the power stations past the closure dates from ratepayers, despite surging electricity prices.
The Data Center Coalition, a leading trade association representing the burgeoning server farm industry, has endorsed adopting programs to curb electricity demand when the grid is under stress. In a filing Thursday with the North Carolina Utilities Commission, the industry group said it “supports exploring well-structured, voluntary demand-response and load flexibility programs for large load customers that allocates risk appropriately, provides clear incentives and compensation, and allows customers to meet their sustainability commitments.”
Researchers at Duke University put out an influential paper in February that found the U.S. could add gigawatts of additional demand from new data centers without building out an equivalent amount of generating plants if those facilities could curtail power usage when demand was particularly high. Heatmap’s Matthew Zeitlin described the strategy as “one weird trick for getting more data centers on the grid,” boiling down the approach simply as: “Just turn them off sometimes.” When I interviewed Tyler Norris, the study’s lead author, he pitched the idea as a way “to buy us some time” to figure out exactly how much electricity the artificial intelligence boom requires before we build out a bunch of gas plants that are even more expensive than usual due to the years-long backorder of turbines.
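A toy sketch of that logic, with entirely hypothetical numbers rather than the Duke study’s actual figures: a grid that nears its peak capability for only a handful of hours a year can absorb a new data center without new generation, provided the facility curtails during those hours.

```python
# Toy model of data center load flexibility (hypothetical numbers, not the Duke study's).
# The grid only nears its peak for a few hours a year; a flexible data center simply
# curtails during those hours instead of forcing new generating capacity to be built.

peak_capability_mw = 10_000                        # assumed maximum the grid can serve
hourly_demand_mw = [8_000] * 8_700 + [9_900] * 60  # ~8,760 hours: mostly slack, 60 near-peak hours
new_data_center_mw = 300                           # hypothetical new load

curtail_hours = sum(1 for d in hourly_demand_mw if d + new_data_center_mw > peak_capability_mw)
share = curtail_hours / len(hourly_demand_mw)
print(f"Curtail {curtail_hours} of {len(hourly_demand_mw)} hours ({share:.1%}) "
      "and the grid never exceeds its peak capability")
```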
Researchers at the University of Houston claim to have made two major breakthroughs in carbon capture technology. The first breakthrough, published in the journal Nature Communications, introduces a new electrochemical process for filtering out carbon dioxide that avoids using a membrane like traditional carbon capture technology. The second, featured on the cover of the journal ES&T Engineering, demonstrates a new vanadium-based flow battery that could be used both to capture carbon and to store renewable energy. “We need solutions, and we wanted to be part of the solution. The biggest suspect out there is CO2 emissions, so the low-hanging fruit would be to eliminate those emissions,” Mim Rahimi, a professor at the University of Houston’s Cullen College of Engineering, said in a statement. “From membraneless systems to scalable flow systems, we’re charting pathways to decarbonize hard-to-abate sectors and support the transition to a low-carbon economy.”
A conversation with Scott Cockerham of Latham & Watkins.
This week’s conversation is with Scott Cockerham, a partner with the law firm Latham & Watkins whose expertise I sought to help me best understand the Treasury Department’s recent guidance on the federal solar and wind tax credits. We focused on something you’ve probably been thinking about a lot: how to qualify under the “start construction” part of the new tax regime, which is the primary hurdle for anyone still in the thicket of a fight with local opposition.
The following is our chat lightly edited for clarity. Enjoy.
So can you explain what we’re looking at here with the guidance and its approach to what it considers the beginning of construction?
One of the reasons for the guidance was a distinction in the final version of the bill that treated wind and solar differently for purposes of tax credit phase-outs. They landed on those types of assets being placed in service by the end of 2027, or construction having to begin within 12 months of enactment – by July 4, 2026. But as part of the final package, the Trump administration promised House Freedom Caucus members it would tighten up what it means to ‘start construction’ for solar and wind assets in particular.
In terms of changes, probably the biggest difference is that for projects over 1.5 megawatts of output, you can no longer use a “5% safe harbor” to qualify projects. The 5% safe harbor was a construct in prior start of construction guidance saying you could begin construction by incurring 5% of your project cost. That will no longer be available for larger projects. Residential projects and other smaller solar projects will still have that available to them. But that is probably the biggest change.
The other avenue to start construction is called the “physical work test,” which requires the commencement of physical work of a significant nature. The work can either be performed on-site or off-site by a vendor. The new guidance largely parroted those rules from prior guidance and in many cases carried the concepts over word-for-word. So on the physical work side, not much changed.
Significantly, there’s another aspect of these rules that says you have to continue work once you start. It’s like asking whether you really ran a race if you didn’t keep going to the finish line. Helpfully, the new guidance retains an old rule saying that you’re assumed to have worked continuously if you place the project in service within four calendar years after the year work began. So if you begin in 2025, you have until the end of 2029 to place in service without having to prove continuous work. There had been rumors about that four-year window being shortened, so the fact that it was retained is very helpful to project pipelines.
The other major point I’d highlight is that the effective date of the new guidance is September 2. There’s still a limited window between now and then to continue to access the old rules. This also provides greater certainty for developers who attempted to start construction under the old rules after July 4, 2025. They can be confident that what they did still works assuming it was consistent with the prior guidance.
On the construction start – what kinds of projects would’ve maybe opted to use the 5% cost metric before?
Generally speaking it has mostly been distributed generation and residential solar projects. On the utility scale side it had recently tended to be projects buying domestic modules where there might have been an angle to access the domestic content tax credit bonus as well.
For larger projects, the 5% test can be quite expensive. If you’re a 200-megawatt project, 5% of your project is not nothing – that actually can be quite high. I would say probably the majority of utility scale projects in recent years had relied on the manufacturing of transformers as the primary strategy.
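To put rough numbers on why the 5% safe harbor gets expensive at utility scale, here is a minimal sketch; the $1.00-per-watt installed cost is an assumed round figure for illustration, not one Cockerham cited.

```python
# Hypothetical illustration of the 5% safe harbor's cost at different project sizes.
# Assumes a $1.00/W installed cost, which is a round placeholder, not a quoted figure.

def safe_harbor_spend(capacity_mw: float, cost_per_watt: float = 1.00, share: float = 0.05) -> float:
    """Dollars that must be incurred up front to 'begin construction' under the 5% test."""
    total_project_cost = capacity_mw * 1_000_000 * cost_per_watt
    return total_project_cost * share

print(f"200 MW utility-scale project: ${safe_harbor_spend(200):,.0f}")  # ~$10,000,000 up front
print(f"1.5 MW distributed project:   ${safe_harbor_spend(1.5):,.0f}")  # ~$75,000, far more manageable
```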
So now that option is not available to utility scale projects anymore?
The domestic content bonus is still available, but prior to September 2 you can procure modules for a large project and potentially both begin construction and qualify for the domestic content bonus at the same time. Beginning September 2 the module procurement wouldn’t help that same project begin construction.
Okay, so help me understand what kinds of work will developers need to do in order to pass the physical work test here?
A lot of it is market-driven by preferences from tax equity investors and tax credit buyers and their tax counsel. Over the last 8 years or so transformer manufacturing has become quite popular. I expect that to continue to be an avenue people will pursue. Another avenue we see quite often is on-site physical work, so for a wind project for example that can involve digging foundations for your wind turbines, covering them with concrete slabs, and doing work for something called string roads – roads that go between your turbines primarily for operations and maintenance. On the solar side, it would be similar kinds of on-site work: foundation work, road work, driving piles, putting things up at the site.
One of the things that is more difficult about the physical work test as opposed to the 5% test is that it is subjective. I always tell people that more work is always better. In the first instance it’s likely up to whatever your financing party thinks is enough and that’s going to be a project-specific determination, typically.
Okay, and how much will permitting be a factor in passing the physical work test?
It depends. It can certainly affect on-site work if you don’t have access to the site yet. That is obviously problematic.
But it wouldn’t prevent you from doing an off-site physical work strategy. That would involve procuring a non-inventory item like a transformer for the project. So there are still different things you can do depending on the facts.
What’s your ultimate takeaway on the Treasury guidance overall?
It certainly makes beginning construction on wind and solar more difficult, but I think the overall reaction that I and others in the market have mostly had is that the guidance came out much better than people feared. There were a lot of rumors going around about things that could have been really problematic, but for the most part, other than the 5% test option going away, the sense is that not a whole lot changed. This is a positive result on the development side.