There are two kinds of people who work on climate solutions: those who still believe in the promise of carbon markets, and those who think the whole concept is fundamentally flawed.
In the first category, you have people like McGee Young, the CEO of a company called WattCarbon. Young is aware of the ways carbon markets can be a race to the bottom — enabling companies to buy cheap certificates that say they used clean energy or reduced their carbon footprint, when in reality their purchase had little effect on the environment or the energy system.
And yet, there’s all this money out there for the taking! Companies want to green their image! Tackling climate change is expensive! There must be a way to funnel corporate sustainability budgets to where they can make a real impact!
To Young, the solution is a matter of better data and greater transparency. “We need a record-keeping system that allows us to raise the bar,” he told me.
Young launched his vision for that record-keeping system on Wednesday — the WattCarbon Energy Attribute Tracking System, or WEATS. It functions similarly to other environmental credit registries: Owners of clean energy assets can sign up to generate credits known as Environmental Attribute Certificates, or EACs, which buyers can then purchase to count toward their own clean energy or carbon goals.
WEATS has two main features that differentiate it. First, it will include credits from small-scale distributed energy resources like residential solar panels, batteries, and heat pumps — clean energy solutions that haven’t really been able to participate in carbon markets until now. Second, each EAC will include granular, down-to-the-hour information about where and when the power was generated, in the case of solar, or the carbon savings achieved, in the case of heat pumps.
The first feature is part of what motivated Young to start WattCarbon. “The clean energy transition is more than just wind and solar, it’s more than just generation,” he told me. But it’s the second that Young said is key to improving the credibility of claims that companies are “using 100% clean energy,” or “achieving net-zero.”
Today, many companies simply buy enough clean energy credits to match their annual energy use, regardless of where or when the energy was generated. But researchers have shown that this strategy can have little to no impact on emissions. For example, if a company is only buying solar credits, but it is using energy at night, its carbon footprint from that nighttime energy could surpass any environmental benefits of the solar it bought.
To solve this, some energy buyers have embraced a concept called “24/7 carbon-free energy,” which means that “every kilowatt-hour of electricity consumption is met with carbon-free electricity sources, every hour of every day, everywhere,” in the words of a United Nations-led initiative to promote the concept. “It is both the end state of a fully decarbonized electricity system,” according to the UN, “and a transformative approach to energy procurement, supply, and policy design that is critical to accelerating its arrival.”
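Here’s a minimal sketch of that difference in practice, with invented numbers — the function and the load and solar profiles below are hypothetical illustrations, not anything from WattCarbon or the UN initiative. A buyer can look fully matched on an annual basis and still come up short hour by hour:

```python
# Hypothetical illustration of annual vs. hourly matching (all figures invented).

def is_hourly_matched(consumption_mwh, procured_clean_mwh):
    """True only if procured clean energy covers consumption in every single hour."""
    return all(c <= p for c, p in zip(consumption_mwh, procured_clean_mwh))

consumption = [1.0] * 24                        # a flat 1 MWh load, day and night
solar = [0.0] * 8 + [3.0] * 8 + [0.0] * 8       # 24 MWh of solar, all of it midday

print(sum(solar) >= sum(consumption))           # True: "100% matched" on an annual basis
print(is_hourly_matched(consumption, solar))    # False: the nighttime hours are uncovered
```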
If you’ve followed the recent debate about the green hydrogen tax credit, you might be familiar with the idea. In December, the Treasury Department proposed that hydrogen producers would have to match their electricity consumption with purchases of local clean electricity generation on an hourly basis to prove their hydrogen is clean enough to qualify for the full value of the tax credit. That means a producer can either hook up directly to a solar farm, wind farm, or geothermal power plant and operate only when it is generating power, or buy renewable energy credits or EACs that correspond to the hours that it operates.
WattCarbon’s marketplace is one of the first to enable this by requiring sellers to include data about exactly where and when each EAC was produced. It also includes the carbon intensity of the grid at the place and time when that unit of power was produced. For example, 1 megawatt-hour of solar power in West Virginia, where the grid is supplied by a lot of coal-fired power plants, would likely reduce emissions far more than 1 megawatt-hour of solar power in California, where the main fossil fuel burned for power is natural gas. Similarly, 1 megawatt-hour of solar generated in the afternoon in California will not do as much to reduce emissions as if that unit of power were stored in a battery and then dispatched at night. On other markets, all of these credits might simply be advertised as 1 megawatt-hour of solar power, and the buyer would be none the wiser.
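As a rough illustration of why that time and place data matters — the carbon-intensity figures here are invented placeholders, not WattCarbon’s actual accounting — the emissions avoided by the same megawatt-hour can differ several-fold:

```python
# Hypothetical grid carbon intensities, in kg of CO2 per MWh (illustrative only).
GRID_INTENSITY = {
    ("WV", "afternoon"): 900,   # coal-heavy grid
    ("CA", "afternoon"): 250,   # gas on the margin, plenty of solar already online
    ("CA", "night"): 400,       # more gas burning after sunset
}

def avoided_emissions_kg(mwh, region, period):
    """Avoided CO2 = clean MWh delivered x grid carbon intensity at that time and place."""
    return mwh * GRID_INTENSITY[(region, period)]

print(avoided_emissions_kg(1, "WV", "afternoon"))   # 900 kg
print(avoided_emissions_kg(1, "CA", "afternoon"))   # 250 kg
print(avoided_emissions_kg(1, "CA", "night"))       # 400 kg: battery-shifted solar does more
```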
So what does this new carbon trading marketplace look like in practice? There are a lot of possibilities, but here’s one scenario. WattCarbon partners with a company that helps homeowners electrify their heating or install and manage their solar and battery systems. That third-party company can then say to its customers, “As an extra incentive to do this, we can help you sell the environmental benefits it provides to third parties through the WattCarbon marketplace,” and those extra payments are what convince the homeowner to go for it.
Independent experts I spoke with were cautiously optimistic about what this new marketplace could do. “We need to deploy on the order of a billion machines, in the U.S. alone — and not over a century, but on the order of a decade,” said Kevin Kircher, an assistant professor of mechanical engineering at Purdue University, whose research focuses on heat pumps and other distributed energy resources. “So there’s a lot that needs to be done, and just connecting people to money to do the work is really important.”
Wilson Ricks, a PhD candidate at Princeton University whose research informed the Treasury’s proposal for the hydrogen tax credit, said that having a platform where hydrogen companies can procure clean energy from a variety of projects, and with time and location data, would be very useful. He was also intrigued by WattCarbon’s attempt to create EACs tied to batteries because energy storage systems are one of the few resources that can produce clean power when the wind isn’t blowing and the sun isn’t shining.
But both Ricks and Kircher warned there are a number of ways this system of credits could fall into the same traps that ensnare many carbon offset projects and reduce their credibility. For one, it’s really hard to get the math right. That’s especially true for a project like a heat pump, where the carbon savings are based on a counterfactual scenario in which the homeowner would have kept their gas heater. You basically have to estimate how often they would have run it, which opens the door to sloppiness at best and fraud at worst.
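To see how much that counterfactual matters, here is a back-of-the-envelope sketch — the emissions factors and usage estimates are hypothetical assumptions, not any registry’s methodology. The same heat pump earns very different credit depending on how much gas heating you assume it displaced:

```python
# Hypothetical heat-pump savings calculation (all inputs are illustrative assumptions).
GAS_KG_PER_THERM = 5.3          # roughly 5.3 kg of CO2 per therm of natural gas burned

def heat_pump_savings_kg(therms_avoided, heat_pump_mwh, grid_kg_per_mwh):
    """Claimed savings = avoided gas emissions minus emissions from the heat pump's electricity."""
    avoided_gas = therms_avoided * GAS_KG_PER_THERM
    added_power = heat_pump_mwh * grid_kg_per_mwh
    return avoided_gas - added_power

print(heat_pump_savings_kg(600, 3.0, 350))   # 2130.0 kg if you assume heavy prior gas use
print(heat_pump_savings_kg(300, 3.0, 350))   # 540.0 kg if you assume lighter prior gas use
```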
Another key criterion — a concept called additionality — is very hard to assess. Would the household that switches to a heat pump have done so regardless of whether they were getting extra revenue from selling EACs? If the answer is unequivocally yes, the credits are meaningless and serve to give corporate emitters an excuse to keep emitting.
Young acknowledged to me that this was likely going to be true in some cases, but still felt that heat pump owners deserved to be paid for the environmental benefits they were providing. “We provide environmental subsidies for large-scale wind and solar, and we don't do that for the things that we're putting into our buildings and our communities. And to me, there’s an inherent inequality in the way that we treat and value clean energy that needs to be addressed.”
That didn’t quite make sense to me — the government provides subsidies for all kinds of clean energy resources, including distributed energy resources, I countered. The Treasury will give you a $2,000 tax credit for a heat pump and a 30% credit on rooftop solar.
“That’s true,” Young said. “But we don’t have enough money in all of our government programs to truly scale those.”
I couldn’t argue with that. But the real challenge is helping low-income homeowners with the upfront capital to install these devices — after-the-fact payments are not enough. Young said he had plans to create a way for companies to procure EACs in advance from groups of homeowners. The deals would be similar to the power purchase agreements that big electricity consumers like Google and Walmart make with large-scale renewable energy developers, helping to finance those projects by reducing the risk.
“This is a necessary but not sufficient step,” Young said of the version of the marketplace that launched Wednesday. “Without this, we can’t do that. But this by itself would be inadequate for the market to be able to reach its fullest potential.”
The Senate told renewables developers they’d have a year to start construction and still claim a tax break. Then came an executive order.
Renewable energy advocates breathed a sigh of relief after a last-minute change to the One Big Beautiful Bill Act stipulated that wind and solar projects would be eligible for tax credits as long as they began construction within the next 12 months.
But the new law left an opening for the Trump administration to cut that window short, and now Trump is moving to do just that. The president signed an executive order on Monday directing the Treasury Department to issue new guidance for the clean electricity tax credits “restricting the use of broad safe harbors unless a substantial portion of a subject facility has been built.”
The broad safe harbors in question have to do with the way the government defines the “beginning of construction,” which, in the realm of federal tax credits, is a term of art. Under the current Treasury guidance, developers must either complete “physical work of a significant nature” on a given project or spend at least 5% of its total cost to prove they have started construction during a given year, and are therefore protected from any subsequent tax law changes.
As my colleague Matthew Zeitlin previously reported, oftentimes something as simple as placing an order for certain pieces of equipment, like transformers or solar trackers, will check the box. Still, companies can’t just buy a bunch of equipment to qualify for the tax credits and then sit on it indefinitely. Their projects must be up and operating within four years, or else they must demonstrate “continuous progress” each year to continue to qualify.
As such, under existing rules and Trump’s new law, wind and solar developers would have 12 months to claim eligibility for the investment or production tax credit, and then at least four years to build the project and connect it to the grid. While a year is a much shorter runway than the open-ended extension to the tax credits granted by the Inflation Reduction Act, it’s a much better deal than the House’s original version of the OBBBA, which would have required projects to start construction within two months and be operating by the end of 2028 to qualify.
Or so it seemed.
The tax credits became a key bargaining chip during the final negotiations on the bill. Senator Lisa Murkowski of Alaska fought to retain the 12-month runway for wind and solar, while members of the House Freedom Caucus sought to kill it. Ultimately, the latter group agreed to vote yes after winning assurances from the president that he would “deal” with the subsidies later.
Last week, as all of this was unfolding, I started to hear rumors that the Treasury guidance regarding “beginning of construction” could be a key tool at the president’s disposal to make good on his promise. Industry groups had urged Congress to codify the existing guidance in the bill, but it was ultimately left out.
When I reached out to David Burton, a partner at Norton Rose Fulbright who specializes in energy tax credits, on Thursday, he was already contemplating Trump’s options to exploit that omission.
Burton told me that Trump’s Treasury department could redefine “beginning of construction” in a number of ways, such as by removing the 5% spending safe harbor or requiring companies to get certain permits in order to demonstrate “significant” physical work. It could also shorten the four-year grace period to bring a project to completion.
But Burton was skeptical that the Treasury Department had the staff or expertise to do the work of rewriting the guidance, let alone that Trump would make this a priority. “Does Treasury really want to spend the next couple of months dealing with this?” he said. “Or would it rather deal with implementing bonus depreciation and other taxpayer-favorable rules in the One Big Beautiful Bill instead of being stuck on this tangent, which will be quite a heavy lift and take some time?”
Just days after signing the bill into law, Trump chose the tangent, directing the Treasury to produce new guidance within 45 days. “It’s going to need every one of those days to come out with thoughtful guidance that can actually be applied by taxpayers,” Burton told me when I called him back on Monday night.
The executive order cites “energy dominance, national security, economic growth, and the fiscal health of the Nation” as reasons to end subsidies for wind and solar. The climate advocacy group Evergreen Action said it would help none of these objectives. “Trump is once again abusing his power in a blatant end-run around Congress — and even his own party,” Lena Moffit, the group’s executive director, said in a statement. “He’s directing the government to sabotage the very industries that are lowering utility bills, creating jobs, and securing our energy independence.”
Industry groups were still assessing the implications of the executive order, and the ones I reached out to declined to comment for this story. “Now we’re circling the wagons back up to dig into the details,” one industry representative told me, adding that it was “shocking” that Trump would “seemingly double cross Senate leadership and Thune in particular.”
As everyone waits to see what Treasury officials come up with, developers will be racing to “start construction” as defined by the current rules, Burton said. It would be “quite unusual” if the new guidance were retroactive, he added. Although given Trump’s history, he said, “I guess anything is possible.”
“I believe the tariff on copper — we’re going to make it 50%.”
President Trump announced Tuesday during a cabinet meeting that he plans to impose a hefty tax on U.S. copper imports.
“I believe the tariff on copper — we’re going to make it 50%,” he told reporters.
Copper traders and producers have anticipated tariffs on copper since Trump announced in February that his administration would investigate the national security implications of copper imports, calling the metal an “essential material for national security, economic strength, and industrial resilience.”
Trump has already imposed tariffs on other strategically and economically important metals, such as steel and aluminum. The process for imposing these tariffs under Section 232 of the Trade Expansion Act of 1962 involves a finding by the Secretary of Commerce that the product being tariffed is essential to national security, and thus that the United States should be able to supply it on its own.
Copper has been referred to as the “metal of electrification” because of its centrality to a broad array of electrical technologies, including transmission lines, batteries, and electric motors. Electric vehicles contain around 180 pounds of copper on average. “Copper, scrap copper, and copper’s derivative products play a vital role in defense applications, infrastructure, and emerging technologies, including clean energy, electric vehicles, and advanced electronics,” the White House said in February.
Copper prices had risen around 25% this year through Monday. Prices for copper futures jumped by as much as 17% after the tariff announcement and are currently trading at around $5.50 a pound.
The tariffs, when implemented, could provide renewed impetus to expand copper mining in the United States. But tariffs can happen in a matter of months. A copper mine takes years to open — and that’s if investors decide to put the money toward the project in the first place. Congress took a swipe at the electric vehicle market in the U.S. last week, extinguishing subsidies for both consumers and manufacturers as part of the One Big Beautiful Bill Act. That will undoubtedly shrink domestic demand for EV inputs like copper, which could make investors nervous about sinking years and dollars into new or expanded copper mines.
Even if the Trump administration succeeds in its efforts to accelerate permitting for and construction of new copper mines, the copper will need to be smelted and refined before it can be used, and China dominates the copper smelting and refining industry.
The U.S. produced just over 1.1 million tons of copper in 2023, with 850,000 tons being mined from ore and the balance recycled from scrap, according to United States Geological Survey data. It imported almost 900,000 tons.
With the prospect of tariffs driving up prices for domestically mined ore, the immediate beneficiaries are those who already have mines. Shares in Freeport-McMoRan, which operates seven copper mines in Arizona and New Mexico, were up over 4.5% in afternoon trading Tuesday.
Predicting the location and severity of thunderstorms is at the cutting edge of weather science. Now funding for that science is at risk.
Tropical Storm Barry was, by all measures, a boring storm. “Blink and you missed it,” as a piece in Yale Climate Connections put it after Barry formed, then dissipated over 24 hours in late June, having never sustained wind speeds higher than 45 miles per hour. The tropical storm’s main impact, it seemed at the time, was “heavy rains of three to six inches, which likely caused minor flooding” in Tampico, Mexico, where it made landfall.
But a few days later, U.S. meteorologists started to get concerned. The remnants of Barry had swirled northward, pooling wet Gulf air over southern and central Texas and elevating the atmospheric moisture to reach or exceed record levels for July. “Like a waterlogged sponge perched precariously overhead, all the atmosphere needed was a catalyst to wring out the extreme levels of water vapor,” meteorologist Mike Lowry wrote.
More than 100 people — many of them children — ultimately died as extreme rainfall caused the Guadalupe River to rise 34 feet in 90 minutes. But the tragedy was “not really a failure of meteorology,” UCLA and UC Agriculture and Natural Resources climate scientist Daniel Swain said during a public “Office Hours” review of the disaster on Monday. The National Weather Service in San Antonio and Austin first warned the public of the potential for heavy rain on Sunday, June 29 — five days before the floods crested. The agency followed that with a flood watch warning for the Kerrville area on Thursday, July 3, then issued an additional 21 warnings, culminating just after 1 a.m. on Friday, July 4, with a wireless emergency alert sent to the phones of residents, campers, and RVers along the Guadalupe River.
The NWS alerts were both timely and accurate, and even correctly predicted an expected rainfall rate of 2 to 3 inches per hour. If it were possible to consider the science alone, the official response might have been deemed a success.
Of all the storm systems, convective storms — like thunderstorms, hail, tornadoes, and extreme rainstorms — are some of the most difficult to forecast. “We don’t have very good observations of some of these fine-scale weather extremes,” Swain told me after office hours were over, in reference to severe meteorological events that are often relatively short-lived and occur in small geographic areas. “We only know a tornado occurred, for example, if people report it and the Weather Service meteorologists go out afterward and look to see if there’s a circular, radial damage pattern.” A hurricane, by contrast, spans hundreds of miles and is visible from space.
Global weather models, which predict conditions at a planetary scale, are relatively coarse in their spatial resolution and “did not do the best job with this event,” Swain said during his office hours. “They predicted some rain, locally heavy, but nothing anywhere near what transpired.” (And before you ask — artificial intelligence-powered weather models were among the worst at predicting the Texas floods.)
Over the past decade or so, however, due to the unique convective storm risks in the United States, the National Oceanic and Atmospheric Administration and other meteorological agencies have developed specialized high-resolution, convection-resolving models to better represent and forecast extreme thunderstorms and rainstorms.
NOAA’s cutting-edge specialized models “got this right,” Swain told me of the Texas storms. “Those were the models that alerted the local weather service and the NOAA Weather Prediction Center of the potential for an extreme rain event. That is why the flash flood watches were issued so early, and why there was so much advanced knowledge.”
Writing for The Eyewall, meteorologist Matt Lanza concurred with Swain’s assessment: “By Thursday morning, the [high resolution] model showed as much as 10 to 13 inches in parts of Texas,” he wrote. “By Thursday evening, that was as much as 20 inches. So the [high resolution] model upped the ante all day.”
“Most models initialized at 00Z last night indicated the potential for localized excessive rainfall over portions of south-central Texas that led to the tragic and deadly flash flood early this morning,” meteorologist Jeff Frame (@VORTEXJeff) wrote on July 4, 2025.
To be any more accurate than they ultimately were on the Texas floods, meteorologists would have needed the ability to predict the precise location and volume of rainfall of an individual thunderstorm cell. Although models can provide a fairly accurate picture of the general area where a storm will form, the best current science still can’t achieve that level of precision more than a few hours in advance of a given event.
Climate change itself is another factor making storm behavior even less predictable. “If it weren’t so hot outside, if it wasn’t so humid, if the atmosphere wasn’t holding all that water, then [the system] would have rained and marched along as the storm drifted,” Claudia Benitez-Nelson, an expert on flooding at the University of South Carolina, told me. Instead, slow and low prevailing winds caused the system to stall, pinning it over the same worst-case-scenario location at the confluence of the Hill Country rivers for hours and challenging the limits of science and forecasting.
Though it’s tempting to blame the Trump administration’s cuts to the staff and budget of the NWS for the tragedy, the local NWS field office actually had more forecasters on hand than usual ahead of the storm, in anticipation of potential disaster. Any budget cuts to the NWS, while potentially disastrous, would not go into effect until fiscal year 2026.
The proposed 2026 budget for NOAA, however, would zero out the upkeep of the models, as well as shutter the National Severe Storms Laboratory in Norman, Oklahoma, which studies thunderstorms and rainstorms, such as the one in Texas. And due to the proprietary, U.S.-specific nature of the high-resolution models, there is no one coming to our rescue if they’re eliminated or degraded by the cuts.
The impending cuts are alarming to the scientists charged with maintaining and adjusting the models to ensure maximum accuracy, too. Computationally, it’s no small task to keep them running 24 hours a day, every day of the year. A weather model doesn’t simply run on its own indefinitely, but rather requires large data transfers as well as intakes of new conditions from its network of observation stations to remain reliable. Although the NOAA high-resolution models have been in use for about a decade, yearly updates keep the programs on the cutting edge of weather science; without constant tweaks, the models’ accuracy slowly degrades as the atmosphere changes and information and technologies become outdated.
It’s difficult to imagine that the Texas floods could have been more catastrophic, and yet the NOAA models and NWS warnings and alerts undoubtedly saved lives. Still, local Texas authorities have attempted to pass the blame, claiming they weren’t adequately informed of the dangers by forecasters. The picture will become clearer as reporting continues to probe why the flood-prone region did not have warning sirens, why camp counselors did not have their phones to receive overnight NWS alarms, why there were not more flood gauges on the rivers, and what, if anything, local officials could have done to save more people. Still, given what is scientifically possible at this stage of modeling, “This was not a forecast failure relative to scientific or weather prediction best practices. That much is clear,” Swain said.
As the climate warms and extreme rainfall events increase as a result, however, it will become ever more crucial to have access to cutting-edge weather models. “What I want to bring attention to is that this is not a one-off,” Benitez-Nelson, the flood expert at the University of South Carolina, told me. “There’s this temptation to say, ‘Oh, it’s a 100-year storm, it’s a 1,000-year storm.’”
“No,” she went on. “This is a growing pattern.”