A new report demonstrates how to power the computing boom with (mostly) clean energy.
After a year of concerted hand-wringing about the growing energy needs of data centers, a report that dropped just before the holidays proposed a solution that had been strangely absent from the discussion.
AI companies have seemingly grasped for every imaginable source of clean energy to quench their thirst for power, including pricey, left-field ideas like restarting shuttered nuclear plants. Some are forgoing climate concerns altogether and ordering up off-grid natural gas turbines. In a pithily named new analysis — “Fast, scalable, clean, and cheap enough” — the report’s authors make a compelling case for an alternative: off-grid solar microgrids.
An off-grid solar microgrid is a system of solar panels, batteries, and small gas generators that work together to power a data center directly, without connecting to the wider electricity system. Configurations can vary almost endlessly, with more or fewer solar panels and more or less gas capacity, and the report models the full range of possibilities to illustrate the trade-offs in emissions reductions and cost.
An eclectic group of experts got together to do the research, including staffers from the payment company Stripe, a developer called Scale Microgrids, and Paces, which builds software to help renewable energy developers identify viable sites for projects. They found that an off-grid microgrid that supplied 44% of a data center’s demand from solar panels and used a natural gas generator the rest of the time would cost roughly $93 per megawatt-hour compared to about $86 for large, off-grid natural gas turbines — and it would emit nearly one million tons of CO2 less than the gas turbines. A cleaner system that produced 90% of its power from solar and batteries would cost closer to $109 per megawatt-hour, the authors found. While that’s more expensive than gas turbines, it’s significantly cheaper than repowering Three Mile Island, the fabled nuclear plant that Microsoft is bringing back online for an estimated $130 per megawatt-hour.
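To put those per-megawatt-hour figures in perspective, here is a back-of-the-envelope comparison in Python. The 100-megawatt size and round-the-clock utilization are illustrative assumptions of mine, not figures from the report; only the per-megawatt-hour costs come from the analysis and reporting above.

```python
# Back-of-the-envelope comparison using the per-MWh costs cited above.
# The 100 MW load and 24/7 utilization are illustrative assumptions.

LOAD_MW = 100
HOURS_PER_YEAR = 8760
annual_mwh = LOAD_MW * HOURS_PER_YEAR  # 876,000 MWh of demand per year

lcoe_per_mwh = {
    "off-grid gas turbines": 86,
    "44% solar microgrid": 93,
    "90% solar microgrid": 109,
    "Three Mile Island restart": 130,
}

gas_cost = lcoe_per_mwh["off-grid gas turbines"] * annual_mwh
for option, price in lcoe_per_mwh.items():
    premium_millions = (price * annual_mwh - gas_cost) / 1e6
    print(f"{option}: ~${premium_millions:.0f} million per year more than gas")
```

On those assumptions, the 44% solar system costs roughly $6 million a year more than off-grid gas for a 100-megawatt facility, the 90% solar system about $20 million more, and the Three Mile Island figure implies a premium of roughly $39 million.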
One challenge with solar microgrids is that they require a lot of land for solar panels. But a geospatial analysis showed that there’s more than enough available land in the U.S. Southwest — primarily in West Texas — to cover estimated energy demand growth from data centers through 2030. This shouldn’t be taken as a recommendation, per se. The paper doesn’t interrogate the need for data centers, or the trade-offs of building renewable power for AI training facilities versus for manufacturing or households. The report is simply an exercise in asking: If these data centers are going to be built anyway, could they at least add as few emissions as possible? Not all hyperscalers care about climate, and those that do might still prioritize speed and scale over their net-zero commitments. But the authors argue that these systems can be built more quickly than big gas turbines, which currently have lead times of at least three years to procure and fall under more complicated permitting regimes.
Before the New Year, I spoke with two of the authors — Zeke Hausfather from Stripe and Duncan Campbell from Scale Microgrids — about the report. Stripe doesn’t build data centers and has no plans to, but Hausfather works for a unit within the company called Stripe Climate, which has a “remit to work on impactful things,” he told me. He and his colleagues got interested in the climate dilemma of data centers, and enlisted Scale Microgrids and Paces to help investigate. Our conversation has been lightly edited for clarity.
Why weren’t off-grid solar microgrids really being considered before?
Zeke Hausfather: As AI has grown dramatically, there’s been much more demand for data centers specifically focused on training. Those data centers have much more relaxed requirements. Instead of serving millions of customer requests in real time, they’re running these incredibly energy-intensive training models. They don’t necessarily need to be located near where people live, and that unlocks a lot more potential for solar, because you need about 50 times more land to build a data center with off-grid solar and storage than you would to build a data center with a grid connection.
The other change is that we’re simply running out of good grid connections. And so a lot of the conversation among data center developers has been focused on, is there a way to do this with off-grid natural gas? We think that it makes a lot more sense, particularly given the relaxed constraints of where you can build these, to go with solar and storage, gas back-up, and substantially reduce the emissions impact.
Duncan Campbell: It was funny, when Nan [Ransohoff, head of climate at Stripe] and Zeke first reached out to me, I feel like they convinced me that microgrids were a good idea, which was the first time this ever happened in my life. They were like, what do you think about off-grid solar and storage? And I said, oh, the energy density is way off, you need a ton of land. They’re like, yeah, but you know, for training, you could put it out in the desert, it’s fine, and hyperscalers are doing crazy things right now to access this power. We just went through all these things, and by the end of the call I was like, yeah, we should do this study. I wasn’t thinking about it this way until I, the microgrids guy, spoke to the payments company.
So it’s just kind of against conventional logic?
Campbell: Going off-grid at all is wild for a data center operator to consider, given the historical impulse was, let’s have 3x more backup generators than we need. Even the off-grid gas turbine proposals out there feel a little nuts. Then, to say solar, 1,000 acres of land, a million batteries — it’s just so unconventional, it’s almost heretical. But when you soberly assess the performance criteria and how the landscape has shifted, particularly access to the grid being problematic right now, but also different requirements for AI training and a very high willingness to pay — as we demonstrate in our reference case with the Three Mile Island restart — it makes sense.
Hausfather: We should be clear, when we talk about reliability, a data center with what we model, which is solar, batteries, and 125% capacity backup gas generators, is still probably going to achieve upwards of 99% reliability. It’s just not gonna be the 99.999% that’s traditionally been needed for serving customers with data centers. You can relax some of the requirements around that.
Can you explain how you went about investigating what it would mean for data centers to use off-grid solar microgrids?
Campbell: First we just built a pretty simple power flow model that says, if you’re in a given location, the solar panel is going to make this much power every hour of the year. And if you have a certain amount of demand and a certain amount of battery, the battery is going to charge and discharge at certain times to make the demand and supply match. And then when it can’t, your generators will kick on. So that model is just for a given solar-battery-generator combo in a given location. Then what we did was make a huge scenario suite in 50-megawatt increments. Now you can see, for any level of renewable-ness you want, here’s what the [levelized cost of energy] is.
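Here is a minimal sketch, in Python, of the kind of hourly dispatch logic Campbell is describing. The solar profile, system sizes, and load are placeholder numbers, not values from the report; the point is just the charge-discharge-backup loop he outlines.

```python
# Illustrative hourly dispatch sketch, not the report's actual model.
# Load, system sizes, and the solar profile below are placeholder numbers.

def dispatch(load_mw, solar_mw, batt_mwh, batt_power_mw, solar_cf_by_hour):
    """Simulate a year of hourly operation for a solar + battery + gas microgrid.

    solar_cf_by_hour: 8,760 hourly solar capacity factors (0 to 1).
    Returns the shares of demand met by solar/battery versus gas.
    """
    soc = 0.0  # battery state of charge, MWh (round-trip losses ignored)
    served_clean = served_gas = 0.0
    for cf in solar_cf_by_hour:
        solar_out = solar_mw * cf
        if solar_out >= load_mw:
            # Surplus solar: serve the load, store what the battery can take.
            charge = min(solar_out - load_mw, batt_power_mw, batt_mwh - soc)
            soc += charge
            served_clean += load_mw
        else:
            # Deficit: discharge the battery, then let the gas generators kick on.
            deficit = load_mw - solar_out
            discharge = min(deficit, batt_power_mw, soc)
            soc -= discharge
            served_clean += solar_out + discharge
            served_gas += deficit - discharge
    total = served_clean + served_gas
    return served_clean / total, served_gas / total

# Toy solar profile: dark nights, a midday peak, repeated for 365 days.
day = [0.0] * 7 + [0.2, 0.5, 0.7, 0.8, 0.8, 0.8, 0.7, 0.5, 0.2] + [0.0] * 8
clean_share, gas_share = dispatch(load_mw=100, solar_mw=250, batt_mwh=400,
                                  batt_power_mw=100, solar_cf_by_hour=day * 365)
print(f"clean share: {clean_share:.0%}, gas share: {gas_share:.0%}")
```

Sweeping the solar and battery sizes across a grid of scenarios, as the authors did in 50-megawatt increments, then traces out the cost of each level of “renewable-ness.”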
Hausfather: As you approach 100%, the costs start increasing exponentially, which isn’t a new finding, but you’re essentially having to overbuild more and more solar and batteries in order to deal with those few hours of the year where you have extended periods of cloudiness. Which is why it makes a lot more sense, financially, to have a system with some gas generator use — unless you happen to be in a situation where you can actually only run your data center 90% of the time. I think that’s probably a little too heretical for anyone today, but we did include that as one of the cases.
Did you consider water use? Because when you zoom in on the Southwest, that seems like it could be a constraint.
Hausfather: We talked about water use a little bit, but it wasn’t a primary consideration. One of the reasons is that how data centers are designed has a big effect on net water use. There are a lot of designs now that are pretty low — close to zero — water use, because you’re cycling water through the system rather than using evaporative cooling as the primary approach.
What do you want the takeaway from this report to be? Should all data centers be doing this? To what extent do you think this can replace other options out there?
Hausfather: There is a land rush right now for building data centers quickly. While there’s a lot of exciting investment happening in clean, firm generation like the enhanced geothermal that Fervo is doing, none of those are going to be available at very large scales until after 2030. So if you’re building data centers right now and you don’t want to cause a ton of emissions and threaten your company’s net-zero targets or the social license for AI more broadly, this makes a lot of sense as an option. The cost premium above building a gas system is not that big.
Campbell: For me, it’s two things. I see one purpose of this white paper being to reset rules of thumb. There’s this vestigial knowledge we have that this is impossible, and no, this is totally possible. And it seems actually pretty reasonable.
The second part that I think is really radical is the gigantic scale implied by this solution. Every other solution being proposed is kind of like finding a needle in a haystack — if we find this old steel mill, we could use that interconnection to build a data center, or, you know, maybe we can get Exxon to make carbon capture work finally. If a hyperscaler just wanted to build 10 gigawatts of data centers, and wanted one plan to do it, I think this is the most compelling option. The scalability implied by this solution is a huge factor that should be considered.
The Senate told renewables developers they’d have a year to start construction and still claim a tax break. Then came an executive order.
Renewable energy advocates breathed a sigh of relief after a last-minute change to the One Big Beautiful Bill Act stipulated that wind and solar projects would be eligible for tax credits as long as they began construction within the next 12 months.
But the new law left an opening for the Trump administration to cut that window short, and now Trump is moving to do just that. The president signed an executive order on Monday directing the Treasury Department to issue new guidance for the clean electricity tax credits “restricting the use of broad safe harbors unless a substantial portion of a subject facility has been built.”
The broad safe harbors in question have to do with the way the government defines the “beginning of construction,” which, in the realm of federal tax credits, is a term of art. Under the current Treasury guidance, developers must either complete “physical work of a significant nature” on a given project or spend at least 5% of its total cost to prove they have started construction during a given year, and are therefore protected from any subsequent tax law changes.
As my colleague Matthew Zeitlin previously reported, oftentimes something as simple as placing an order for certain pieces of equipment, like transformers or solar trackers, will check the box. Still, companies can’t just buy a bunch of equipment to qualify for the tax credits and then sit on it indefinitely. Their projects must be up and operating within four years, or else they must demonstrate “continuous progress” each year to continue to qualify.
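For illustration only, here is a minimal sketch of how those two tests fit together under the current guidance as described above. The function names and inputs are hypothetical, and this is a simplification of a genuinely complicated area of tax law, not a determination anyone should rely on.

```python
# Hypothetical sketch of the "beginning of construction" tests described above.
# A simplification for illustration; not tax or legal advice.

def begun_construction(significant_physical_work: bool,
                       amount_spent: float,
                       total_project_cost: float) -> bool:
    """Either test works: significant physical work, or the 5% spending safe harbor."""
    five_percent_safe_harbor = amount_spent >= 0.05 * total_project_cost
    return significant_physical_work or five_percent_safe_harbor

def still_qualifies(years_since_start: float,
                    placed_in_service: bool,
                    continuous_progress_shown: bool) -> bool:
    """A project must be up and operating within four years of starting
    construction, or else demonstrate continuous progress each year."""
    if placed_in_service and years_since_start <= 4:
        return True
    return continuous_progress_shown
```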
As such, under existing rules and Trump’s new law, wind and solar developers would have 12 months to claim eligibility for the investment or production tax credit, and then at least four years to build the project and connect it to the grid. While a year is a much shorter runway than the open-ended extension to the tax credits granted by the Inflation Reduction Act, it’s a much better deal than the House’s original version of the OBBBA, which would have required projects to start construction within two months and be operating by the end of 2028 to qualify.
Or so it seemed.
The tax credits became a key bargaining chip during the final negotiations on the bill. Senator Lisa Murkowski of Alaska fought to retain the 12-month runway for wind and solar, while members of the House Freedom Caucus sought to kill it. Ultimately, the latter group agreed to vote yes after winning assurances from the president that he would “deal” with the subsidies later.
Last week, as all of this was unfolding, I started to hear rumors that the Treasury guidance regarding “beginning of construction” could be a key tool at the president’s disposal to make good on his promise. Industry groups had urged Congress to codify the existing guidance in the bill, but it was ultimately left out.
When I reached out to David Burton, a partner at Norton Rose Fulbright who specializes in energy tax credits, on Thursday, he was already contemplating Trump’s options to exploit that omission.
Burton told me that Trump’s Treasury department could redefine “beginning of construction” in a number of ways, such as by removing the 5% spending safe harbor or requiring companies to get certain permits in order to demonstrate “significant” physical work. It could also shorten the four-year grace period to bring a project to completion.
But Burton was skeptical that the Treasury Department had the staff or expertise to do the work of rewriting the guidance, let alone that Trump would make this a priority. “Does Treasury really want to spend the next couple of months dealing with this?” he said. “Or would it rather deal with implementing bonus depreciation and other taxpayer-favorable rules in the One Big Beautiful Bill instead of being stuck on this tangent, which will be quite a heavy lift and take some time?”
Just days after signing the bill into law, Trump chose the tangent, directing the Treasury to produce new guidance within 45 days. “It’s going to need every one of those days to come out with thoughtful guidance that can actually be applied by taxpayers,” Burton told me when I called him back on Monday night.
The executive order cites “energy dominance, national security, economic growth, and the fiscal health of the Nation” as reasons to end subsidies for wind and solar. The climate advocacy group Evergreen Action said it would help none of these objectives. “Trump is once again abusing his power in a blatant end-run around Congress — and even his own party,” Lena Moffit, the group’s executive director, said in a statement. “He’s directing the government to sabotage the very industries that are lowering utility bills, creating jobs, and securing our energy independence.”
Industry groups were still assessing the implications of the executive order, and the ones I reached out to declined to comment for this story. “Now we’re circling the wagons back up to dig into the details,” one industry representative told me, adding that it was “shocking” that Trump would “seemingly double cross Senate leadership and Thune in particular.”
As everyone waits to see what Treasury officials come up with, developers will be racing to “start construction” as defined by the current rules, Burton said. It would be “quite unusual” if the new guidance were retroactive, he added. Although given Trump’s history, he said, “I guess anything is possible.”
“I believe the tariff on copper — we’re going to make it 50%.”
President Trump announced Tuesday during a cabinet meeting that he plans to impose a hefty tax on U.S. copper imports.
“I believe the tariff on copper — we’re going to make it 50%,” he told reporters.
Copper traders and producers have anticipated tariffs on copper since Trump announced in February that his administration would investigate the national security implications of copper imports, calling the metal an “essential material for national security, economic strength, and industrial resilience.”
Trump has already imposed tariffs on other strategically and economically important metals, such as steel and aluminum. The process for imposing these tariffs under Section 232 of the Trade Expansion Act of 1962 involves a finding by the Secretary of Commerce that the product being tariffed is essential to national security, and thus that the United States should be able to supply it on its own.
Copper has been referred to as the “metal of electrification” because of its centrality to a broad array of electrical technologies, including transmission lines, batteries, and electric motors. Electric vehicles contain around 180 pounds of copper on average. “Copper, scrap copper, and copper’s derivative products play a vital role in defense applications, infrastructure, and emerging technologies, including clean energy, electric vehicles, and advanced electronics,” the White House said in February.
Copper prices had risen around 25% this year through Monday. Prices for copper futures jumped by as much as 17% after the tariff announcement and are currently trading at around $5.50 a pound.
The tariffs, when implemented, could provide renewed impetus to expand copper mining in the United States. But tariffs can happen in a matter of months. A copper mine takes years to open — and that’s if investors decide to put the money toward the project in the first place. Congress took a swipe at the electric vehicle market in the U.S. last week, extinguishing subsidies for both consumers and manufacturers as part of the One Big Beautiful Bill Act. That will undoubtedly shrink domestic demand for EV inputs like copper, which could make investors nervous about sinking years and dollars into new or expanded copper mines.
Even if the Trump administration succeeds in its efforts to accelerate permitting for and construction of new copper mines, the copper will need to be smelted and refined before it can be used, and China dominates the copper smelting and refining industry.
The U.S. produced just over 1.1 million tons of copper in 2023, with 850,000 tons being mined from ore and the balance recycled from scrap, according to United States Geological Survey data. It imported almost 900,000 tons.
With the prospect of tariffs driving up prices for domestically mined ore, the immediate beneficiaries are those who already have mines. Shares in Freeport-McMoRan, which operates seven copper mines in Arizona and New Mexico, were up over 4.5% in afternoon trading Tuesday.
Predicting the location and severity of thunderstorms is at the cutting edge of weather science. Now funding for that science is at risk.
Tropical Storm Barry was, by all measures, a boring storm. “Blink and you missed it,” as a piece in Yale Climate Connections put it after Barry formed, then dissipated over 24 hours in late June, having never sustained wind speeds higher than 45 miles per hour. The tropical storm’s main impact, it seemed at the time, was “heavy rains of three to six inches, which likely caused minor flooding” in Tampico, Mexico, where it made landfall.
But a few days later, U.S. meteorologists started to get concerned. The remnants of Barry had swirled northward, pooling wet Gulf air over southern and central Texas and elevating the atmospheric moisture to reach or exceed record levels for July. “Like a waterlogged sponge perched precariously overhead, all the atmosphere needed was a catalyst to wring out the extreme levels of water vapor,” meteorologist Mike Lowry wrote.
More than 100 people — many of them children — ultimately died as extreme rainfall caused the Guadalupe River to rise 34 feet in 90 minutes. But the tragedy was “not really a failure of meteorology,” UCLA and UC Agriculture and Natural Resources climate scientist Daniel Swain said during a public “Office Hours” review of the disaster on Monday. The National Weather Service in San Antonio and Austin first warned the public of the potential for heavy rain on Sunday, June 29 — five days before the floods crested. The agency followed that with a flood watch warning for the Kerrville area on Thursday, July 3, then issued an additional 21 warnings, culminating just after 1 a.m. on Friday, July 4, with a wireless emergency alert sent to the phones of residents, campers, and RVers along the Guadalupe River.
The NWS alerts were both timely and accurate, and even correctly predicted rainfall rates of 2 to 3 inches per hour. Judged on the science alone, the official response might have been deemed a success.
Of all storm systems, convective storms — like thunderstorms, hailstorms, tornadoes, and extreme rainstorms — are among the most difficult to forecast. “We don’t have very good observations of some of these fine-scale weather extremes,” Swain told me after office hours were over, in reference to severe meteorological events that are often relatively short-lived and occur in small geographic areas. “We only know a tornado occurred, for example, if people report it and the Weather Service meteorologists go out afterward and look to see if there’s a circular, radial damage pattern.” A hurricane, by contrast, spans hundreds of miles and is visible from space.
Global weather models, which predict conditions at a planetary scale, are relatively coarse in their spatial resolution and “did not do the best job with this event,” Swain said during his office hours. “They predicted some rain, locally heavy, but nothing anywhere near what transpired.” (And before you ask — artificial intelligence-powered weather models were among the worst at predicting the Texas floods.)
Over the past decade or so, however, due to the unique convective storm risks in the United States, the National Oceanic and Atmospheric Administration and other meteorological agencies have developed specialized high-resolution, convection-resolving models to better represent and forecast extreme thunderstorms and rainstorms.
NOAA’s cutting-edge specialized models “got this right,” Swain told me of the Texas storms. “Those were the models that alerted the local weather service and the NOAA Weather Prediction Center of the potential for an extreme rain event. That is why the flash flood watches were issued so early, and why there was so much advanced knowledge.”
Writing for The Eyewall, meteorologist Matt Lanza concurred with Swain’s assessment: “By Thursday morning, the [high resolution] model showed as much as 10 to 13 inches in parts of Texas,” he wrote. “By Thursday evening, that was as much as 20 inches. So the [high resolution] model upped the ante all day.”
Most models initialized at 00Z last night indicated the potential for localized excessive rainfall over portions of south-central Texas that led to the tragic and deadly flash flood early this morning. pic.twitter.com/t3DpCfc7dX
— Jeff Frame (@VORTEXJeff) July 4, 2025
To be any more accurate than they ultimately were on the Texas floods, meteorologists would have needed the ability to predict the precise location and volume of rainfall of an individual thunderstorm cell. Although models can provide a fairly accurate picture of the general area where a storm will form, the best current science still can’t achieve that level of precision more than a few hours in advance of a given event.
Climate change itself is another factor making storm behavior even less predictable. “If it weren’t so hot outside, if it wasn’t so humid, if the atmosphere wasn’t holding all that water, then [the system] would have rained and marched along as the storm drifted,” Claudia Benitez-Nelson, an expert on flooding at the University of South Carolina, told me. Instead, slow and low prevailing winds caused the system to stall, pinning it over the same worst-case-scenario location at the confluence of the Hill Country rivers for hours and challenging the limits of science and forecasting.
Though it’s tempting to blame the Trump administration’s cuts to the staff and budget of the NWS for the tragedy, the agency’s local field office actually had more forecasters on hand than usual ahead of the storm, in anticipation of potential disaster. Any budget cuts to the NWS, while potentially disastrous, would not go into effect until fiscal year 2026.
The proposed 2026 budget for NOAA, however, would zero out the upkeep of the models, as well as shutter the National Severe Storms Laboratory in Norman, Oklahoma, which studies thunderstorms and rainstorms, such as the one in Texas. And due to the proprietary, U.S.-specific nature of the high-resolution models, there is no one coming to our rescue if they’re eliminated or degraded by the cuts.
The impending cuts also alarm the scientists charged with maintaining and adjusting the models to ensure maximum accuracy. Computationally, it’s no small task to keep them running 24 hours a day, every day of the year. A weather model doesn’t simply run on its own indefinitely; it requires large data transfers as well as intakes of new conditions from its network of observation stations to remain reliable. Although the NOAA high-resolution models have been in use for about a decade, yearly updates keep the programs on the cutting edge of weather science; without constant tweaks, the models’ accuracy slowly degrades as the atmosphere changes and information and technologies become outdated.
As catastrophic as the Texas floods were, the NOAA models and NWS warnings and alerts undoubtedly saved lives. Still, local Texas authorities have attempted to pass the blame, claiming they weren’t adequately informed of the dangers by forecasters. The picture will become clearer as reporting continues to probe why the flood-prone region did not have warning sirens, why camp counselors did not have their phones to receive overnight NWS alerts, why there were not more flood gauges on the rivers, and what, if anything, local officials could have done to save more people. But given what is scientifically possible at this stage of modeling, “This was not a forecast failure relative to scientific or weather prediction best practices. That much is clear,” Swain said.
As the climate warms and extreme rainfall events increase as a result, however, it will become ever more crucial to have access to cutting-edge weather models. “What I want to bring attention to is that this is not a one-off,” Benitez-Nelson, the flood expert at the University of South Carolina, told me. “There’s this temptation to say, ‘Oh, it’s a 100-year storm, it’s a 1,000-year storm.’”
“No,” she went on. “This is a growing pattern.”