Predicting the location and severity of thunderstorms is at the cutting edge of weather science. Now funding for that science is at risk.

Tropical Storm Barry was, by all measures, a boring storm. “Blink and you missed it,” as a piece in Yale Climate Connections put it after Barry formed, then dissipated over 24 hours in late June, having never sustained wind speeds higher than 45 miles per hour. The tropical storm’s main impact, it seemed at the time, was “heavy rains of three to six inches, which likely caused minor flooding” in Tampico, Mexico, where it made landfall.
But a few days later, U.S. meteorologists started to get concerned. The remnants of Barry had swirled northward, pooling wet Gulf air over southern and central Texas and pushing atmospheric moisture to or beyond record levels for July. “Like a waterlogged sponge perched precariously overhead, all the atmosphere needed was a catalyst to wring out the extreme levels of water vapor,” meteorologist Mike Lowry wrote.
More than 100 people — many of them children — ultimately died as extreme rainfall caused the Guadalupe River to rise 34 feet in 90 minutes. But the tragedy was “not really a failure of meteorology,” UCLA and UC Agriculture and Natural Resources climate scientist Daniel Swain said during a public “Office Hours” review of the disaster on Monday. The National Weather Service in San Antonio and Austin first warned the public of the potential for heavy rain on Sunday, June 29 — five days before the floods crested. The agency followed that with a flood watch warning for the Kerrville area on Thursday, July 3, then issued an additional 21 warnings, culminating just after 1 a.m. on Friday, July 4, with a wireless emergency alert sent to the phones of residents, campers, and RVers along the Guadalupe River.
The NWS alerts were both timely and accurate, and even correctly predicted rainfall rates of 2 to 3 inches per hour. If it were possible to consider the science alone, the official response might have been deemed a success.
Convective storms — thunderstorms, hailstorms, tornadoes, and extreme rainstorms — are among the most difficult weather systems to forecast. “We don’t have very good observations of some of these fine-scale weather extremes,” Swain told me after office hours were over, in reference to severe meteorological events that are often relatively short-lived and occur in small geographic areas. “We only know a tornado occurred, for example, if people report it and the Weather Service meteorologists go out afterward and look to see if there’s a circular, radial damage pattern.” A hurricane, by contrast, spans hundreds of miles and is visible from space.
Global weather models, which predict conditions at a planetary scale, are relatively coarse in their spatial resolution and “did not do the best job with this event,” Swain said during his office hours. “They predicted some rain, locally heavy, but nothing anywhere near what transpired.” (And before you ask — artificial intelligence-powered weather models were among the worst at predicting the Texas floods.)
Over the past decade or so, however, due to the unique convective storm risks in the United States, the National Oceanic and Atmospheric Administration and other meteorological agencies have developed specialized high-resolution, convection-resolving models to better represent and forecast extreme thunderstorms and rainstorms.
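The resolution gap matters because a storm has to span several grid cells before a model’s dynamics can represent it at all. A back-of-the-envelope sketch in Python (the grid spacings are illustrative public figures, e.g. NOAA’s HRRR runs near 3 kilometers while global models typically run near 10 to 25 kilometers, not exact model specs):

```python
STORM_CELL_KM = 10.0  # rough width of a strong thunderstorm cell

# Illustrative grid spacings: ~13 km for a typical global model,
# ~3 km for a convection-resolving model such as NOAA's HRRR.
for model, grid_km in [("global model (13 km)", 13.0), ("convection-resolving (3 km)", 3.0)]:
    cells_across = STORM_CELL_KM / grid_km
    # A feature needs a few grid cells across before the model's dynamics can
    # represent it; below that, it exists only as a crude approximation.
    verdict = "resolvable" if cells_across >= 3 else "sub-grid, must be approximated"
    print(f"{model}: {cells_across:.1f} cells across one storm -> {verdict}")
```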
NOAA’s cutting-edge specialized models “got this right,” Swain told me of the Texas storms. “Those were the models that alerted the local weather service and the NOAA Weather Prediction Center of the potential for an extreme rain event. That is why the flash flood watches were issued so early, and why there was so much advance knowledge.”
Writing for The Eyewall, meteorologist Matt Lanza concurred with Swain’s assessment: “By Thursday morning, the [high resolution] model showed as much as 10 to 13 inches in parts of Texas,” he wrote. “By Thursday evening, that was as much as 20 inches. So the [high resolution] model upped the ante all day.”
Most models initialized at 00Z last night indicated the potential for localized excessive rainfall over portions of south-central Texas that led to the tragic and deadly flash flood early this morning. pic.twitter.com/t3DpCfc7dX
— Jeff Frame (@VORTEXJeff) July 4, 2025
To be any more accurate than they ultimately were on the Texas floods, meteorologists would have needed the ability to predict the precise location and volume of rainfall of an individual thunderstorm cell. Although models can provide a fairly accurate picture of the general area where a storm will form, the best current science still can’t achieve that level of precision more than a few hours in advance of a given event.
Climate change itself is another factor making storm behavior even less predictable. “If it weren’t so hot outside, if it wasn’t so humid, if the atmosphere wasn’t holding all that water, then [the system] would have rained and marched along as the storm drifted,” Claudia Benitez-Nelson, an expert on flooding at the University of South Carolina, told me. Instead, slow and low prevailing winds caused the system to stall, pinning it over the same worst-case-scenario location at the confluence of the Hill Country rivers for hours and challenging the limits of science and forecasting.
Though it’s tempting to blame the Trump administration’s cuts to the staff and budget of the NWS for the tragedy, the agency actually had more forecasters on hand than usual in its local field office ahead of the storm, in anticipation of potential disaster. Any budget cuts to the NWS, while potentially disastrous, would not go into effect until fiscal year 2026.
The proposed 2026 budget for NOAA, however, would zero out the upkeep of the models, as well as shutter the National Severe Storms Laboratory in Norman, Oklahoma, which studies thunderstorms and rainstorms, such as the one in Texas. And due to the proprietary, U.S.-specific nature of the high-resolution models, there is no one coming to our rescue if they’re eliminated or degraded by the cuts.
The impending cuts are alarming to the scientists charged with maintaining and adjusting the models to ensure maximum accuracy, too. Computationally, it’s no small task to keep them running 24 hours a day, every day of the year. A weather model doesn’t simply run on its own indefinitely, but rather requires large data transfers as well as intakes of new conditions from its network of observation stations to remain reliable. Although the NOAA high-resolution models have been in use for about a decade, yearly updates keep the programs on the cutting edge of weather science; without constant tweaks, the models’ accuracy slowly degrades as the atmosphere changes and information and technologies become outdated.
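As a schematic sketch of what that upkeep involves (hypothetical stub functions, not NOAA’s actual pipeline), the core loop of any operational model looks something like this: every cycle pulls in fresh observations, blends them with the previous forecast, and runs the physics forward again.

```python
import random

def fetch_observations():
    # stand-in for pulling fresh radar, satellite, and surface-station data
    return [random.gauss(0.0, 1.0) for _ in range(5)]

def assimilate(prior_state, observations):
    # stand-in for data assimilation: blend the prior forecast with new obs
    return [0.5 * p + 0.5 * o for p, o in zip(prior_state, observations)]

def integrate(analysis):
    # stand-in for running the model physics forward in time
    return [x * 1.01 for x in analysis]

state = [0.0] * 5  # each run inherits the previous cycle's state
for cycle in range(3):  # in operations, this loop runs around the clock
    state = integrate(assimilate(state, fetch_observations()))
    print(f"cycle {cycle}: forecast issued from freshly assimilated observations")
```

Break the observation feed or stop updating the code, and the loop keeps producing output; it just gets quietly worse.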
It’s difficult to imagine that the Texas floods could have been more catastrophic, and yet the NOAA models and NWS warnings and alerts undoubtedly saved lives. Still, local Texas authorities have attempted to pass the blame, claiming they weren’t adequately informed of the dangers by forecasters. The picture will become clearer as reporting continues to probe why the flood-prone region did not have warning sirens, why camp counselors did not have their phones to receive overnight NWS alarms, why there were not more flood gauges on the rivers, and what, if anything, local officials could have done to save more people. Still, given what is scientifically possible at this stage of modeling, “This was not a forecast failure relative to scientific or weather prediction best practices. That much is clear,” Swain said.
As the climate warms and extreme rainfall events increase as a result, however, it will become ever more crucial to have access to cutting-edge weather models. “What I want to bring attention to is that this is not a one-off,” Benitez-Nelson, the flood expert at the University of South Carolina, told me. “There’s this temptation to say, ‘Oh, it’s a 100-year storm, it’s a 1,000-year storm.’”
“No,” she went on. “This is a growing pattern.”
According to a new analysis shared exclusively with Heatmap, coal’s equipment-related outage rate is about twice as high as wind’s.
The Trump administration wants “beautiful clean coal” to return to its place of pride on the electric grid because, it says, wind and solar are just too unreliable. “If we want to keep the lights on and prevent blackouts from happening, then we need to keep our coal plants running. Affordable, reliable and secure energy sources are common sense,” Energy Secretary Chris Wright said on X in July, in what has become a steady drumbeat from an administration that has sought to subsidize coal and put a regulatory straitjacket around solar and (especially) wind.
This has meant real money spent in support of existing coal plants. The administration’s emergency order to keep Michigan’s J.H. Campbell coal plant open (“to secure grid reliability”), for example, has cost ratepayers served by Michigan utility Consumers Energy some $80 million all on its own.
But … how reliable is coal, actually? According to an analysis by the Environmental Defense Fund of data from the North American Electric Reliability Corporation, a nonprofit that oversees reliability standards for the grid, coal has the highest “equipment-related outage rate” — essentially, the percentage of time a generator isn’t working because of some kind of mechanical or other issue related to its physical structure — among coal, hydropower, natural gas, nuclear, and wind. Coal’s outage rate was over 12%. Wind’s was about 6.6%.
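To make that metric concrete, here is a minimal sketch of the arithmetic (NERC’s actual methodology for generating-unit data is more involved; the hour counts below are hypothetical, chosen only to echo the rates EDF reported):

```python
HOURS_PER_YEAR = 8760

# Hypothetical equipment-related forced-outage hours per unit-year, chosen
# to mirror the reported rates (coal just over 12%, wind about 6.6%).
forced_outage_hours = {"coal": 1_065, "wind": 578}

for fleet, hours in forced_outage_hours.items():
    rate = hours / HOURS_PER_YEAR
    print(f"{fleet}: {rate:.1%} of the year offline for equipment problems")
```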
“When EDF’s team isolated just equipment-related outages, wind energy proved far more reliable than coal, which had the highest outage rate of any source NERC tracks,” EDF told me in an emailed statement.
Coal’s reliability has, in fact, been decreasing, Oliver Chapman, a research analyst at EDF, told me.
NERC has attributed this falling reliability to the changing role of coal in the energy system. Reliability “negatively correlates most strongly to capacity factor,” or how often the plant is running compared to its peak capacity. The data also “aligns with industry statements indicating that reduced investment in maintenance and abnormal cycling that are being adopted primarily in response to rapid changes in the resource mix are negatively impacting baseload coal unit performance.” In other words, coal is struggling to keep up with its changing role in the energy system. That’s due not just to the growth of solar and wind energy, which are inherently (but predictably) variable, but also to natural gas’s increasing prominence on the grid.
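Capacity factor is just actual generation divided by what a plant would produce running flat out for the whole period. A toy sketch with hypothetical numbers illustrates both that calculation and the pattern described above, in which units running further below their peak tend to post higher outage rates:

```python
from statistics import correlation  # Python 3.10+

HOURS_PER_YEAR = 8760

# Capacity factor: actual output relative to running at nameplate all year.
nameplate_mw = 600
actual_mwh = 2_365_200  # hypothetical annual generation
print(f"capacity factor: {actual_mwh / (nameplate_mw * HOURS_PER_YEAR):.0%}")  # 45%

# Hypothetical (capacity factor, equipment outage rate) pairs for five coal
# units, shaped to mimic the relationship NERC describes.
units = [(0.75, 0.06), (0.62, 0.09), (0.55, 0.11), (0.48, 0.13), (0.40, 0.16)]
cfs = [cf for cf, _ in units]
rates = [rate for _, rate in units]

# Strongly negative: the less a unit runs, the more it breaks.
print(f"correlation: {correlation(cfs, rates):+.2f}")
```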
“When coal plants are having to be a bit more varied in their generation, we're seeing that wear and tear of those plants is increasing,” Chapman said. “The assumption is that that's only going to go up in future years.”
The issue for any plan to revitalize the coal industry, Chapman told me, is that the forces driving coal into this secondary role — namely the economics of running aging plants compared to natural gas and renewables — do not seem likely to reverse themselves any time soon.
Coal has been “sort of continuously pushed a bit more to the sidelines by renewables and natural gas being cheaper sources for utilities to generate their power. This increased marginalization is going to continue to lead to greater wear and tear on these plants,” Chapman said.
But with electricity demand increasing across the country, coal is being forced into a role that it might not be able to easily — or affordably — play, all while leading to more emissions of sulfur dioxide, nitrogen oxide, particulate matter, mercury, and, of course, carbon dioxide.
The coal system has been beset by a number of high-profile outages recently, including at the largest new coal plant in the country, Sandy Creek in Texas, which could be offline until early 2027, according to ERCOT, the Texas grid operator, and the Institute for Energy Economics and Financial Analysis.
In at least one case, coal’s reliability issues were cited as a reason to keep another coal generating unit open past its planned retirement date.
Last month, Colorado Representative Jeff Hurd wrote a letter to the Department of Energy asking for emergency action to keep Unit 2 of the Comanche coal plant in Pueblo, Colorado, open past its scheduled retirement at the end of this year. Hurd cited “mechanical and regulatory constraints” on the larger Unit 3 as justification for keeping Unit 2 open to fill the generation gap left by the larger unit. A filing by Xcel and several Colorado state energy officials, also requesting a delay to Unit 2’s retirement, disclosed that Unit 3 “experienced an unplanned outage and is offline through at least June 2026.”
Reliability issues aside, high electricity demand may turn into short-term profits at all levels of the coal industry, from the miners to the power plants.
At the same time the Trump administration is pushing coal plants to stay open past their scheduled retirement, the Energy Information Administration is forecasting that natural gas prices will continue to rise, which could lead to increased use of coal for electricity generation. The EIA forecasts that the 2025 average price of natural gas for power plants will rise 37% from 2024 levels.
Analysts at S&P Global Commodity Insights project “a continued rebound in thermal coal consumption throughout 2026 as thermal coal prices remain competitive with short-term natural gas prices encouraging gas-to-coal switching,” S&P coal analyst Wendy Schallom told me in an email.
“Stronger power demand, rising natural gas prices, delayed coal retirements, stockpiles trending lower, and strong thermal coal exports are vital to U.S. coal revival in 2025 and 2026.”
And we’re all going to be paying the price.
Rural Marylanders have asked for the president’s help to oppose the data center-related development — but so far they haven’t gotten it.
A transmission line in Maryland is pitting rural conservatives against Big Tech in a way that highlights the growing political sensitivities of the data center backlash. Opponents of the project want President Trump to intervene, but they’re worried he’ll ignore them — or even side with the data center developers.
The Piedmont Reliability Project would connect the Peach Bottom nuclear plant in southern Pennsylvania to electricity customers in northern Virginia, i.e., data centers, most likely. To get from A to B, the power line would have to crisscross agricultural lands between Baltimore, Maryland, and the Washington, D.C., area.
As we chronicle time and time again in The Fight, residents in farming communities are fighting back aggressively – protesting, petitioning, suing, and yelling loudly. Things have gotten so tense that some are refusing to let representatives for Piedmont’s developer, PSEG, onto their properties, and a court battle is currently underway over giving the company federal marshal protection amid threats from landowners.
Exacerbating the situation is a quirk we don’t often deal with in The Fight. Unlike energy generation projects, which are usually subject to local review, transmission sits entirely under the purview of Maryland’s Public Service Commission, a five-member board consisting entirely of Democrats appointed by current Governor Wes Moore – a rumored candidate for the 2028 Democratic presidential nomination. It’s going to be months before the PSC formally considers the Piedmont project, and it likely won’t issue a decision until 2027 – a date convenient for Moore, as it’s right after he’s up for re-election. Moore last month expressed “concerns” about the project’s development process, but has brushed aside calls to take a personal position on whether it should ultimately be built.
Enter a potential Trump card that could force Moore’s hand. In early October, commissioners and state legislators representing Carroll County – one of the farm-heavy counties in Piedmont’s path – sent Trump a letter requesting that he intervene in the case before the commission. The letter followed previous examples of Trump coming in to kill planned projects, including the Grain Belt Express transmission line and a Tennessee Valley Authority gas plant in Tennessee that was relocated after lobbying from a country rock musician.
One of the letter’s lead signatories was Kenneth Kiler, president of the Carroll County Board of Commissioners, who told me this lobbying effort will soon expand beyond Trump to the Agriculture and Energy Departments. He’s hoping regulators weigh in before PJM, the regional grid operator overseeing Mid-Atlantic states. “We’re hoping they go to PJM and say, ‘You’re supposed to be managing the grid, and if you were properly managing the grid you wouldn’t need to build a transmission line through a state you’re not giving power to.’”
Part of the reason why these efforts are expanding, though, is that it’s been more than a month since they sent their letter, and they’ve heard nothing but radio silence from the White House.
“My worry is that I think President Trump likes and sees the need for data centers. They take a lot of water and a lot of electric [power],” Kiler, a Republican, told me in an interview. “He’s conservative, he values property rights, but I’m not sure that he’s not wanting data centers so badly that he feels this request is justified.”
Kiler told me the plan to kill the transmission line hinges on delaying development long enough that interest rates, inflation, and rising demand for electricity make it too painful and inconvenient to build through his resentful community. It’s easy to believe the federal government flexing its muscle here would help with that, either by drawing out the decision-making or employing some other as yet unforeseen stall tactic. “That’s why we’re doing this second letter to the Secretary of Agriculture and Secretary of Energy asking them for help. I think they may be more sympathetic than the president,” Kiler said.
At the moment, Kiler thinks the odds of Piedmont’s construction come down to a coin flip – 50-50. “They’re running straight through us for data centers. We want this project stopped, and we’ll fight as well as we can, but it just seems like ultimately they’re going to do it,” he confessed to me.
Thus is the predicament of the rural Marylander. On the one hand, Kiler’s situation represents a great opportunity for a GOP president to come in and stand with his base against a would-be presidential candidate. On the other, data center development and artificial intelligence represent one of the president’s few economic bright spots, and he has dedicated copious policy attention to expanding growth in this precise avenue of the tech sector. It’s hard to imagine something less “energy dominance” than killing a transmission line.
The White House did not respond to a request for comment.
Plus more of the week’s most important fights around renewable energy.
1. Wayne County, Nebraska – The Trump administration fined Orsted during the government shutdown for allegedly killing bald eagles at two of its wind projects, the first indication that energy companies will face financial penalties under Trump’s wind industry crackdown.
2. Ocean County, New Jersey – Speaking of wind, I broke news earlier this week that one of the nation’s largest renewable energy projects is now deceased: the Leading Light offshore wind project.
3. Dane County, Wisconsin – The fight over a ginormous data center development out here is turning into perhaps one of the nation’s most important local conflicts over AI and land use.
4. Hardeman County, Texas – It’s not all bad news today for renewable energy – because it never really is.