Predicting the location and severity of thunderstorms is at the cutting edge of weather science. Now funding for that science is at risk.
Tropical Storm Barry was, by all measures, a boring storm. “Blink and you missed it,” as a piece in Yale Climate Connections put it after Barry formed, then dissipated over 24 hours in late June, having never sustained wind speeds higher than 45 miles per hour. The tropical storm’s main impact, it seemed at the time, was “heavy rains of three to six inches, which likely caused minor flooding” in Tampico, Mexico, where it made landfall.
But a few days later, U.S. meteorologists started to get concerned. The remnants of Barry had swirled northward, pooling wet Gulf air over southern and central Texas and elevating the atmospheric moisture to reach or exceed record levels for July. “Like a waterlogged sponge perched precariously overhead, all the atmosphere needed was a catalyst to wring out the extreme levels of water vapor,” meteorologist Mike Lowry wrote.
More than 100 people — many of them children — ultimately died as extreme rainfall caused the Guadalupe River to rise 34 feet in 90 minutes. But the tragedy was “not really a failure of meteorology,” UCLA and UC Agriculture and Natural Resources climate scientist Daniel Swain said during a public “Office Hours” review of the disaster on Monday. The National Weather Service in San Antonio and Austin first warned the public of the potential for heavy rain on Sunday, June 29 — five days before the floods crested. The agency followed that with a flood watch warning for the Kerrville area on Thursday, July 3, then issued an additional 21 warnings, culminating just after 1 a.m. on Friday, July 4, with a wireless emergency alert sent to the phones of residents, campers, and RVers along the Guadalupe River.
The NWS alerts were both timely and accurate, and even correctly anticipated the rainfall rate of 2 to 3 inches per hour. If it were possible to consider the science alone, the official response might have been deemed a success.
Of all the storm systems, convective storms — like thunderstorms, hail, tornadoes, and extreme rainstorms — are some of the most difficult to forecast. “We don’t have very good observations of some of these fine-scale weather extremes,” Swain told me after office hours were over, in reference to severe meteorological events that are often relatively short-lived and occur in small geographic areas. “We only know a tornado occurred, for example, if people report it and the Weather Service meteorologists go out afterward and look to see if there’s a circular, radial damage pattern.” A hurricane, by contrast, spans hundreds of miles and is visible from space.
Global weather models, which predict conditions at a planetary scale, are relatively coarse in their spatial resolution and “did not do the best job with this event,” Swain said during his office hours. “They predicted some rain, locally heavy, but nothing anywhere near what transpired.” (And before you ask — artificial intelligence-powered weather models were among the worst at predicting the Texas floods.)
Over the past decade or so, however, the National Oceanic and Atmospheric Administration and other meteorological agencies have responded to the United States’ unique convective storm risks by developing specialized high-resolution, convection-resolving models that better represent and forecast extreme thunderstorms and rainstorms.
NOAA’s cutting-edge specialized models “got this right,” Swain told me of the Texas storms. “Those were the models that alerted the local weather service and the NOAA Weather Prediction Center of the potential for an extreme rain event. That is why the flash flood watches were issued so early, and why there was so much advanced knowledge.”
Writing for The Eyewall, meteorologist Matt Lanza concurred with Swain’s assessment: “By Thursday morning, the [high resolution] model showed as much as 10 to 13 inches in parts of Texas,” he wrote. “By Thursday evening, that was as much as 20 inches. So the [high resolution] model upped the ante all day.”
“Most models initialized at 00Z last night indicated the potential for localized excessive rainfall over portions of south-central Texas that led to the tragic and deadly flash flood early this morning,” meteorologist Jeff Frame (@VORTEXJeff) wrote on July 4, 2025.
To be any more accurate than they ultimately were on the Texas floods, meteorologists would have needed the ability to predict the precise location and volume of rainfall of an individual thunderstorm cell. Although models can provide a fairly accurate picture of the general area where a storm will form, the best current science still can’t achieve that level of precision more than a few hours in advance of a given event.
Climate change itself is another factor making storm behavior even less predictable. “If it weren’t so hot outside, if it wasn’t so humid, if the atmosphere wasn’t holding all that water, then [the system] would have rained and marched along as the storm drifted,” Claudia Benitez-Nelson, an expert on flooding at the University of South Carolina, told me. Instead, slow and low prevailing winds caused the system to stall, pinning it over the same worst-case-scenario location at the confluence of the Hill Country rivers for hours and challenging the limits of science and forecasting.
Though it’s tempting to blame the Trump administration’s cuts to the staff and budget of the NWS for the tragedy, the agency actually had more forecasters than usual on hand in its local field office ahead of the storm, in anticipation of a potential disaster. Any budget cuts to the NWS, while potentially disastrous, would not take effect until fiscal year 2026.
The proposed 2026 budget for NOAA, however, would zero out the upkeep of the models, as well as shutter the National Severe Storms Laboratory in Norman, Oklahoma, which studies thunderstorms and rainstorms, such as the one in Texas. And due to the proprietary, U.S.-specific nature of the high-resolution models, there is no one coming to our rescue if they’re eliminated or degraded by the cuts.
The impending cuts are alarming to the scientists charged with maintaining and adjusting the models to ensure maximum accuracy, too. Computationally, it’s no small task to keep them running 24 hours a day, every day of the year. A weather model doesn’t simply run on its own indefinitely, but rather requires large data transfers as well as intakes of new conditions from its network of observation stations to remain reliable. Although the NOAA high-resolution models have been in use for about a decade, yearly updates keep the programs on the cutting edge of weather science; without constant tweaks, the models’ accuracy slowly degrades as the atmosphere changes and information and technologies become outdated.
It’s difficult to imagine that the Texas floods could have been more catastrophic, and yet the NOAA models and NWS warnings and alerts undoubtedly saved lives. Still, local Texas authorities have attempted to pass the blame, claiming they weren’t adequately informed of the dangers by forecasters. The picture will become clearer as reporting continues to probe why the flood-prone region did not have warning sirens, why camp counselors did not have their phones to receive overnight NWS alarms, why there were not more flood gauges on the rivers, and what, if anything, local officials could have done to save more people. Even so, given what is scientifically possible at this stage of modeling, “This was not a forecast failure relative to scientific or weather prediction best practices. That much is clear,” Swain said.
As the climate warms and extreme rainfall events increase as a result, however, it will become ever more crucial to have access to cutting-edge weather models. “What I want to bring attention to is that this is not a one-off,” Benitez-Nelson, the flood expert at the University of South Carolina, told me. “There’s this temptation to say, ‘Oh, it’s a 100-year storm, it’s a 1,000-year storm.’”
“No,” she went on. “This is a growing pattern.”
Using the Supercharger network with a non-Tesla is great — except for one big, awkward problem.
You can drive your life away and never notice the little arrow on the dashboard — the one next to the fuel canister icon that points out which side of the car the gas cap is on. The arrow is a fun piece of everyday design that has inspired many a know-it-all friend or TikTok. But while the intel it relays can be helpful if you’re driving a rental car, or are just generally forgetful, it doesn’t really matter in the grand scheme what side your fuel filler is on. Service stations are so big that there’s generally enough space to park at an open pump in whatever orientation a vehicle demands.
That’s not quite the case with electric cars.
When I test-drove the new Hyundai Ioniq 9 this summer, the industrial designers had included their own version of the little arrow to point out the location of the EV’s charging port. In the Ioniq 9’s case, it’s on the passenger’s side, the opposite of where you’d find the port on a Tesla. Turns out, that’s a problem. On our trip from L.A. to San Jose, Hyundai's navigation system directed me to a busy Tesla Supercharger just off the interstate in the parking lot of a Denny’s. But because of the big EV’s backward port placement, I needed two empty stalls next to each other — both of which I wound up blocking when I backed in to charge. The episode is an example of how we screwed over the present by not thinking hard enough when we built the infrastructure of the recent past.
Let’s back up. In the opening stage of the EV race, the charging question was split between Tesla and everybody else. The other electric carmakers adopted a few shared plug standards. But just like with gas cars, where the left-or-right placement of the gas cap seemed to vary arbitrarily vehicle to vehicle, there was no standardized placement of the charging port. Because all manner of different EVs pulled in, companies like Electrify America and Chargepoint built their chargers with cords long enough to reach either side of a car.
Tesla, meanwhile, built out its excellent but vertically integrated Supercharger network with only Tesla cars in mind. In most cases, a station amounted to eight or more parking spaces all in a row. The cable that came off each charging post was only long enough to reach the driver’s side rear, where all the standardized ports on Teslas can be found. The thinking made sense at the time. Other EVs weren’t allowed to use the Supercharger network. Why, then, would you pay for extra cabling to reach the other side of the vehicle?
That changed late in 2022, when Musk made Tesla’s proprietary plug an open standard and encouraged the other carmakers to adopt it. One by one, they fell in line. The other car companies pledged to use the newly renamed North American Charging Standard, or NACS, in their future EVs. Then Tesla began to open many, but not all, of its stations to Rivians, Hyundais, and other electric cars.
Which leads us to today. The Ioniq 9, which began deliveries this summer, comes with a NACS port. This allows drivers to use Tesla stations without the need to keep an annoying dongle handy. But because Hyundai put the port on the opposite side, the car is oriented in the opposite direction from the way hundreds or thousands of Supercharger stations are set up. Suppose you find an empty spot between two Teslas and back in — the plug that could reach your passenger’s side port actually belongs to the stall next to you, and is in use by the EV parked there. The available cord, the one meant for the stall you actually parked in, can’t reach over to the passenger’s side.
The result is a mess. Find two open stalls next to each other and you can make it work, though it means you’re taking up both of them (stealing the cord meant for the neighboring stall and blocking the cord meant for the one you’re parked in). At giant stations with dozens of plugs, this is no big deal. At smaller ones with just 12 or 16 plugs, it’s a nuisance. I’ve walked out and moved the Rivian I was test-driving before I had all the electricity I wanted because I felt guilty about blocking two stalls. To avoid this breach of etiquette you might need to park illegally, leaving your EV in a non-spot or in a place where it’s blocking the sidewalk just so it can reach the plug. (Tesla’s FAQ advises: “In some cases you might have to park over the line in order to charge comfortably. Avoid parking diagonally to reach the cable and try to obstruct as few charge posts as possible.”)
Some relief from this short-sightedness is coming. Tesla’s new “V4” stations that are currently opening around the world are built with this complexity in mind and include longer cables and an orientation meant to reach either side of the vehicle. The buildout of EV chargers of all kinds is slated to continue even with the Trump administration’s opposition to funding them, and new stations should be flexible to any kind of electric car. And the idea of making sure EVs of any size and shape can charge is picking up steam. For example, many of the stations in Rivian’s Adventure Network include at least one stall where the charging post is off to the side of an extra-long parking space so that an EV towing a trailer can reach its charging port.
Yet for now, we’re stuck with what we’ve already built. There are more than 2,500 Tesla Supercharger stations in the U.S., representing more than 30,000 individual plugs, and most of those were built with the V2 and V3 versions of Tesla’s technology that have this orientation problem. For years to come, many of those stations will be the best or only option for non-Tesla EVs on a road trip, which means we’re all in for some extra inconvenience.
On $20 billion in lost projects, Alligator Alcatraz’s closure, and Amazon states’ rally
Current conditions: The highest wave measured from Hurricane Erin was 45 feet, recorded by a buoy located 150 miles off North Carolina’s Cape Hatteras • Intense rainfall is flooding Rajasthan in India • Wildfires continue raging across North America and southern Europe.
The Trump administration issued a stop-work order to halt construction of Orsted’s flagship project off the coast of Rhode Island. The Bureau of Ocean Energy Management halted work on the Revolution Wind project while its regulators were “seeking to address concerns related to the protection of national security interests of the United States,” a letter from the agency stated. The project was nearly completed, and already connected to the grid. The Danish state-owned Orsted said it was “evaluating all options to resolve the matter expeditiously.”
Earlier this month, the company announced plans to raise $9.4 billion from the stock market to fund its work in the U.S. amid President Donald Trump’s crackdown. As Heatmap’s Matthew Zeitlin wrote of the sale, “While the market had been expecting Orsted to raise capital in some form, the scale of the raise is about twice what was anticipated,” causing its stock to plunge almost 30%. The White House has aggressively targeted policies that benefit wind energy in recent weeks. Following Friday’s announcement, shares in Orsted tumbled 17% to a record low.
Trump’s clampdown on wind and solar has sent the industry spiraling in recent weeks as federal agencies limit access to clean energy tax credits and rework rules to disfavor the industry’s two largest sources of energy. Already, $18.6 billion worth of clean energy projects have been canceled this year, compared to just $827 million last year, according to data from Atlas Public Policy’s Clean Economy Tracker cited in the Financial Times.
Trump has blamed renewables for the rising price of electricity. But data Matthew covered last week showed that renewables are, if anything, correlated with lower prices. Instead, he wrote, at the “top of the list” of reasons electricity prices are surging “is the physical reality of the grid itself,” the poles and wires required to send energy into people’s homes and businesses. “Beyond that, extreme weather, natural gas prices, and data center-induced demand growth all play a part.”
The entrance to Florida's state-managed immigrant detention facility. Joe Raedle/Getty Images
Together with the state of Florida, the Trump administration rushed to build what it calls “Alligator Alcatraz,” a detention facility designed to hold several thousand migrants at a time in southern Florida. In its haste to complete the facility, however, the government failed to conduct the proper environmental reviews, according to a federal judge who ordered its closure late last week, The Wall Street Journal reported. Back in June, a pair of nonprofits filed a lawsuit alleging that the government had failed to conduct assessments of what impact the facility would have on endangered animals such as the Florida panther and the Florida bonneted bat. The Miccosukee Tribe of Indians of Florida later joined the lawsuit.
The Trump administration argued that the law in question, the National Environmental Policy Act, only applies to federal projects, whereas this one was state-driven, an argument Judge Kathleen Williams rejected, according to the Journal. “Every Florida governor, every Florida senator, and countless local and national political figures, including presidents, have publicly pledged their unequivocal support for the restoration, conservation, and protection of the Everglades,” she wrote. “This Order does nothing more than uphold the basic requirements of legislation designed to fulfill those promises.”
The eight countries that ring the Amazon rainforest pledged support over the weekend for a global pool of financing for conservation. In a joint declaration, the Amazonian nations — Bolivia, Brazil, Colombia, Ecuador, Guyana, Peru, Suriname, and Venezuela — expressed support for preserving the rainforest but stopped short of endorsing any curbs on fossil fuels. The statement comes as South America has emerged as the world’s hottest oil patch, with new discoveries moving forward off the coasts of Guyana and Brazil and Argentina advancing plans for a fracking boom.
“Abrupt changes” like the precipitous loss of sea ice are unfolding in Antarctica, highlighting the growing threat global warming poses to the frozen continent, according to a new paper in the journal Nature. These changes could push the Antarctic ecosystem past a point of no return, the authors wrote.
“We’re seeing a whole range of abrupt and surprising changes developing across Antarctica, but these aren’t happening in isolation,” climate scientist Nerilie Abram, lead author of the paper, told Grist. “When we change one part of the system, that has knock-on effects that worsen the changes in other parts of the system. And we’re talking about changes that also have global consequences.”
Bad news for vegans who evangelize their diets on good health grounds: New research found no increased risk of death “associated with higher intake of animal protein. In fact, the data showed a modest but significant reduction in cancer-related mortality among those who ate more animal protein.” That, however, doesn’t change the huge difference in emissions between red meat and plant food products.
Ambient Carbon is doing the methane equivalent of point source carbon capture in dairy barns.
In the world of climate and energy, “emissions” is often shorthand for carbon dioxide, the most abundant anthropogenic greenhouse gas in the world. Similarly, talk of emissions capture and removal usually centers on the growing swath of technologies that either prevent CO2 from entering the atmosphere or pull it back out after the fact.
Discussions and frameworks for reducing methane, which is many times more potent than CO2 in the near term, have been far less common — but the potential impact could be huge.
“If you can accelerate the decrease of methane in the atmosphere, you actually could have a much more significant climate impact, much faster than with CO2,” Gabrielle Dreyfus, chief scientist at the Institute for Governance & Sustainable Development, told me. “People often talk about gigatons of CO2 removal. But because of the potency of methane, for a similar level of temperature impact, you’re talking about megatons.”
Over the past year or so, this conversation has finally started to gain traction. Last October, the National Academies of Sciences, Engineering, and Medicine released a report on atmospheric methane removal, recommending that the U.S. develop a research agenda for methane removal technologies and establish methodologies to assess their impacts. Dreyfus chaired the committee that authored the report.
And one startup, at least — Denmark-based Ambient Carbon — is trying to commercialize its methane-zapping tech. Last week, the company announced that it had successfully trialed its “methane eradication photochemical system” at a dairy barn in Denmark, eliminating the majority of methane from the barn’s air. It’s also aiming to deploy a prototype in the U.S., at a farm in Indiana, by year’s end.
The way the company’s process works is more akin to point source carbon capture, in which emissions are pulled from a smokestack, than it is to something like direct air capture, in which carbon dioxide is removed from ambient air. Inside a dairy barn, cows are continually belching methane, producing high concentrations of the gas that are typically vented into the atmosphere. Instead, Ambient Carbon captures this noxious air from the barn’s ventilation ducts and brings it into an enclosed reactor.
Inside the reactor, which uses electricity from the grid, UV light activates chlorine molecules, splitting their chemical bonds to form unstable radicals. These radicals then react with methane, breaking down the potent gas and converting it into CO2, water, and other byproducts. The whole process mimics the natural destruction of atmospheric methane, which would normally take a decade or more, while Ambient Carbon’s system does it in a matter of seconds. Much of the chlorine gets recycled back into the process, and the CO2 is released into the air.
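The chemistry described above can be sketched in a few steps. The photolysis and hydrogen-abstraction steps below are textbook chlorine-radical chemistry; the downstream oxidation is condensed to a single arrow, and Ambient Carbon’s exact chlorine-recycling step is not detailed in its announcement:

```latex
\begin{align*}
\mathrm{Cl_2} + h\nu &\longrightarrow 2\,\mathrm{Cl^\bullet}
  && \text{(UV photolysis generates chlorine radicals)}\\
\mathrm{Cl^\bullet} + \mathrm{CH_4} &\longrightarrow \mathrm{CH_3^\bullet} + \mathrm{HCl}
  && \text{(hydrogen abstraction from methane)}\\
\mathrm{CH_3^\bullet} + \mathrm{O_2} &\longrightarrow \cdots \longrightarrow \mathrm{CO_2} + \mathrm{H_2O}
  && \text{(further oxidation, condensed)}
\end{align*}
```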
That might sound less than ideal. Famously, carbon dioxide is bad. This molecule alone is responsible for two-thirds of all human-caused global warming. But because methane is over 80 times as potent as CO2 over a 20-year timeframe, and since it would eventually break down into carbon dioxide in the atmosphere anyway, accelerating that inevitable process turns out to be a net good for the climate.
“The amount of CO2 produced by methane when it oxidizes has about 50 times smaller climate effect than the methane that produced it,” Zeke Hausfather, a climate scientist and climate research lead at Stripe, told me. “So you get a 98% reduction in the warming effects by converting methane to CO2, which I think is a pretty good deal.”
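Hausfather’s numbers can be sanity-checked with back-of-envelope arithmetic. This is a sketch, not a calculation from the article: it assumes standard molar masses and a 20-year global warming potential of roughly 82.5 for methane, and published GWP values vary, which is why the result lands near, rather than exactly on, the quoted 98%:

```python
# Back-of-envelope: how much warming is avoided by oxidizing methane to CO2?
# Assumed values (not from the article): molar masses and a 20-year global
# warming potential (GWP20) of ~82.5 for methane, per kilogram vs. CO2.
M_CH4 = 16.04      # g/mol, methane
M_CO2 = 44.01      # g/mol, carbon dioxide
GWP20_CH4 = 82.5   # 20-year warming of 1 kg CH4 relative to 1 kg CO2

# Fully oxidizing 1 kg of CH4 yields 44.01 / 16.04 ~= 2.74 kg of CO2.
co2_per_kg_ch4 = M_CO2 / M_CH4

# Warming from that CO2, as a fraction of the warming the methane
# would have caused over the same 20-year window:
residual_fraction = co2_per_kg_ch4 / GWP20_CH4

print(f"{co2_per_kg_ch4:.2f} kg CO2 per kg CH4 oxidized")
print(f"~{(1 - residual_fraction) * 100:.0f}% reduction in 20-year warming effect")
```

Under these assumptions, the conversion wipes out roughly 97% of the methane’s 20-year warming effect, in the same ballpark as the 98% figure Hausfather cites.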
As he sees it, preventing methane emissions in the first place or destroying the molecules before they’re released, as Ambient Carbon is doing, is far more impactful than pursuing after-the-fact atmospheric methane removal. Because while CO2 can linger in the air for centuries — making removal a necessity for near-term planetary cooling — when it comes to methane, “if you cut emissions, you cool the planet pretty quickly, because all that previous warming from methane goes away over the course of a decade or two.”
Agriculture represents 40% of global methane emissions, the largest single source, making the industry a ripe target for de-methane-ization. Ambient Carbon’s tech is only really effective when methane concentrations are relatively high, the company’s CSO, Matthew Johnson, told me — which still leaves a large addressable market given that in many parts of the world, cows are mostly kept in dairy barns, where methane accumulates.
In its trial, Ambient Carbon’s system eliminated up to 90% of dairy barn methane at concentrations ranging from 4.3 parts per million to 44 parts per million. But while the system can theoretically operate at the lower end of that range, Johnson told me it’s only truly energy efficient at 20 parts per million and above. “It’s a question of cost benefit, because we could remove 99% [of the methane from dairy barns] but if you do that, that marginal cost is more energy,” Johnson explained, telling me that the company’s system will likely aim to remove between 80% to 90% of barn methane.
One reason methane destruction and removal technology hasn’t gained much traction is that capturing methane — whether from the atmosphere, a smokestack, or a ventilation duct — is far more challenging than capturing CO2, given that it’s so much less prevalent in the atmosphere. Atmospheric methane is relatively diffuse, with an average concentration of just about 2 parts per million, compared with roughly 420 parts per million for CO2. “I heard the analogy used that if pulling carbon dioxide out of the atmosphere is finding a needle in a haystack, pulling methane out of the atmosphere is pulling dust off the needle in that haystack,” Dreyfus told me.
Because of methane’s relative chemical stability, removing it from the air also requires a strong oxidant, such as chlorine radicals, to break it down. CO2, on the other hand, can be separated from the air with sorbents or membranes, a technically simpler process.
Other nascent approaches to methane destruction and removal include introducing chlorine radicals into the open atmosphere and adding soil amendments to boost the effectiveness of natural methane sinks. Among these options, Ambient Carbon’s approach is the furthest along, most well-understood, and likely also lowest-risk. After its successful field trial, “there is not much uncertainty remaining about whether or not this does the claimed thing,” Sam Abernethy, a methane removal scientist at the nonprofit Spark Climate Solutions, told me. “The main questions remaining are whether they can be cost-effective at progressively lower concentrations, whether they can get more methane destroyed per energy input. And that’s something they’ve been improving every year since they started.”
Venture firms have yet to jump on board, though. Thus far, Ambient Carbon’s funding has come from agricultural partners such as Danone North America and Benton Group Dairies, which are working with the company to conduct its field trials. Additional collaboration and financial support comes from organizations such as the Hofmansgave Foundation, a Danish philanthropic group, and Innovation Fund Denmark. Johnson told me the startup also has a number of unnamed angel investors.
Whether or not this tech could ever become efficient enough to tackle more dilute methane emissions — and thus make true atmospheric methane removal feasible — remains highly uncertain. Questions also remain about how these technologies, if proven to be workable, would ultimately be able to scale. For instance, would methane destruction and removal depend more on government policies and regulations, or on market-based incentives?
In the short term, voluntary corporate commitments appear to be the main drivers of interest when it comes to methane destruction specifically. “A lot of food companies have made public pledges that they’re going to reduce their greenhouse gas emissions,” Johnson told me. As he noted, ubiquitous brands such as Kraft Heinz, General Mills, Danone, and Starbucks have all joined the Dairy Methane Action Alliance, which aims to “accelerate action and ambition to drive down methane emissions across dairy supply chains,” according to its website.
The way Ambient Carbon envisions this market working, its food industry partners would be the ones to encourage farms to buy the startup’s methane-destroying units, and would pay farmers a premium for producing low-emissions products. This would enable farmers to cover the system’s cost within five years, and eventually generate additional revenue. Whether the food companies would pass the green premium onto consumers, however, remains to be seen.
But as with the carbon dioxide removal sector, voluntary corporate commitments and carbon crediting schemes will likely only go so far. “Most of what’s going to drive methane elimination is going to be policy,” Hausfather told me. Denmark, where Ambient Carbon conducted its first trial, is set to become the first country in the world to implement a tax on agricultural emissions, starting in 2030. Europe also has a comprehensive greenhouse gas reduction framework, as do states such as California, Washington, and New York.
“It’s such a low-hanging fruit of climate impacts that it’s hard to imagine it’s not going to be regulated pretty substantially in the future,” Hausfather told me. But stringent regulatory requirements are often shaped by the technologies that have been established as effective. And in that sense, what Ambient Carbon is doing today could help pave the way for the ambitious methane targets of tomorrow.
“Moving from a lot of the voluntary pledges that we have towards more mandatory requirements I think is going to have a really important role to play,” Dreyfus told me. “But I think it’s going to be easier if we have more proven technologies to get there.”