With the ongoing disaster approaching its second week, here’s where things stand.

A week ago, forecasters in Southern California warned residents of Los Angeles that conditions would be dry, windy, and conducive to wildfires. How bad things have gotten, though, has taken everyone by surprise. As of Monday morning, almost 40,000 acres of Los Angeles County have burned in six separate fires, the biggest of which, Palisades and Eaton, have yet to be fully contained. The latest red flag warning, indicating fire weather, won’t expire until Wednesday.
Many have questions about how the second-biggest city in the country is facing such unbelievable devastation (some of these questions, perhaps, being more politically motivated than others). Below, we’ve tried to collect as many answers as possible — including a bit of good news about what lies ahead.
A second Santa Ana wind event is due to set in Monday afternoon. “We’re expecting moderate Santa Ana winds over the next few days, generally in the 20 to 30 [mile per hour] range, gusting to 50, across the mountains and through the canyons,” Eric Drewitz, a meteorologist with the Forest Service, told me on Sunday. Drewitz noted that the winds will be less severe than last week’s, when the fires flared up, but he also anticipates they’ll be “more easterly,” which could blow the fires into new areas. A new red flag warning has been issued through Wednesday, signaling increased fire potential due to low humidity and high winds for several days yet.
If firefighters can prevent new flare-ups and hold back the fires through that wind event, they might be in good shape. By Friday of this week, “it looks like we could have some moderate onshore flow,” Drewitz said, when wet ocean air blows inland, which would help “build back the marine layer” and increase the relative humidity in the region, decreasing the chances of more fires. Information about the Santa Anas at that time is still uncertain — the models have been changing, and wind strength is tricky to predict that far out — but an increase in humidity will at least offer some relief for battered Ventura and Orange counties.
The Palisades Fire, the biggest in L.A., ripped through the hilly and affluent area between Santa Monica and Malibu, including the Pacific Palisades neighborhood, the second-most expensive zip code in Los Angeles and home to many celebrities. Structures in Big Rock, a neighborhood in Malibu, have burned as well. The fire has also encroached on the I-405 and the Getty Villa, and destroyed at least two homes in Mandeville Canyon, a neighborhood of multimillion-dollar homes. Students at nearby University of California, Los Angeles, were told on Friday to prepare for a possible evacuation.
The Eaton Fire, the second biggest blaze in the area, has killed 16 people in Altadena, a neighborhood near Pasadena, according to the Los Angeles Times, making it one of the deadliest fires in the modern history of California.
The 1,000-acre Kenneth Fire is 100% contained but still burning near Calabasas and the gated community of Hidden Hills. The Hurst Fire, near Sylmar, the northernmost neighborhood in L.A., has burned nearly 800 acres and is 89% contained. Though there are no evacuation notices for either fire, residents in the L.A. area should continue to monitor conditions, as the situation remains fluid.
The 43-acre Sunset Fire, which triggered evacuations last week in Hollywood and Hollywood Hills, burned no homes and is 100% contained.
The Lidia Fire, which ignited in a remote area south of Acton, California, on Wednesday afternoon, burned 350 acres of brush and is 100% contained.
It can take years to determine the cause of a fire, and investigations typically don’t begin until after the fire is under control and the area is safe to reenter, Edward Nordskog, a retired fire investigator from the Los Angeles Sheriff’s Department, told Heatmap’s Emily Pontecorvo. He also noted, however, that the cause of an urban fire is typically easier to pinpoint than that of a wildland fire, thanks to the availability of witnesses and surveillance footage.
The vast majority of wildfires, 85%, are caused by humans. So far, investigators have ruled out lightning — another common fire-starter — because there were no electrical storms in the area when the fires started. In the case of the Palisades Fire, there were no power lines in the area of the ignition, though investigators are now looking into an electrical transmission tower in Eaton Canyon as the possible cause of the deadly fire in Altadena. There have been rumors that arsonists started the fires, but investigators say that scenario is also pretty unlikely due to the spread of the fires and how remote the ignition areas are.
Officially, 24 people have died, but that tally is likely to rise. California Governor Gavin Newsom said Sunday that he expects “a lot more” deaths will be added to the total in the coming days as search efforts continue.
Incoming President Donald Trump slammed the response to the L.A. fires in a Truth Social post on Sunday morning: “This is one of the worst catastrophes in the history of our Country,” he wrote. “They just can’t put out the fires. What’s wrong with them?”
Though there is much blame going around — not all of it founded in reality — the challenges facing firefighters are immense. Last week, because of strong Santa Ana winds, fire crews could not drop suppressants like water or chemical retardant on the initial blazes. (In strong winds, water and retardant will blow away before they reach the flames on the ground.)
Fighting a fire in an urban or suburban area is also different from fighting one in a remote, wild area. In a true wildfire, crews don’t use much water; firefighters typically contain the blazes by creating breaks — areas cleared of vegetation that starve a fire of fuel and keep it from spreading. In an urban or suburban event, however, firefighters can’t simply hack through a neighborhood, and typically have to use water to fight structure fires. Their priority also shifts from stopping the fire to evacuating and saving people, which means putting out the fire itself has to wait.
What’s more, the L.A. area faced dangerous fire weather going into last week — with wind gusts up to 100 miles per hour and dry air — and the persistence of the Santa Ana winds during firefighting operations through the weekend made it extremely difficult for emergency managers to gain a foothold.
Trump and others have criticized Los Angeles for being unprepared for the fires, given reports that some fire hydrants ran dry or had low pressure during operations in Pacific Palisades. According to the Los Angeles Department of Water and Power, about 20% of hydrants were affected, mostly at higher elevations.
The problem isn’t a lack of preparation, however. It’s that the L.A. wildfires are so large and widespread, the county’s preparations were quickly overwhelmed. “We’re fighting a wildfire with urban water systems, and that is really challenging,” Los Angeles Department of Water and Power CEO Janisse Quiñones said in a news conference last week. When houses burn down, water mains can break open. Civilians also put a strain on the system when they use hoses or sprinkler systems to try to protect their homes.
On Sunday, Judy Chu, the Democratic lawmaker representing Altadena, confirmed that fire officials had told her there was enough water to continue the battle in the days ahead. “I believe that we're in a good place right now,” she told reporters. Newsom, meanwhile, has responded to criticism over the water failure by ordering an investigation into the weak or dry hydrants.
So-called “super soaker” planes have had no problem with water access; they’re scooping directly from the ocean.
Yes. Although aerial support was grounded in the early stages of the wildfires due to severe Santa Ana winds, flights resumed during lulls in the storms last week.
There is a misconception, though, that water and retardant drops “put out” fires; they don’t. Instead, aerial support suppresses a fire so crews can get in close and use traditional methods, like cutting a fire break or spraying water. “All that up in the air, all that’s doing is allowing the firefighters [on the ground] a chance to get in,” Bobbie Scopa, a veteran firefighter and author of the memoir Both Sides of the Fire Line, told me last week.
With winds expected to pick up early this week, aerial firefighting operations may be grounded again. “If you have erratic, unpredictable winds to where you’ve got a gust spread of like 20 to 30 knots,” i.e. 23 to 35 miles per hour, “that becomes dangerous,” Dan Reese, a veteran firefighter and the founder and president of the International Wildfire Consulting Group, told me on Friday.
Because of the direction of the Santa Ana winds, wildfire smoke should mostly blow out to sea. But as winds shift, unhealthy air can blow into populated areas, affecting the health of residents.
Wildfire smoke is unhealthy, period, but urban and suburban smoke like that from the L.A. fires can be particularly detrimental. It’s not just trees and brush immolating in an urban fire; it’s also cars, and batteries, and gas tanks, and plastics, and insulation, and other nasty, chemical-filled things catching fire and sending fumes into the air. PM2.5, the inhalable particulates from wildfire smoke, contributes to thousands of excess deaths annually in the U.S.
You can read Heatmap’s guide to staying safe during extreme smoke events here.
“The bad news is, I’m not seeing any rain chances,” Drewitz, the Forest Service meteorologist, told me on Sunday. Though the marine layer will bring wetter air to the Los Angeles area on Friday, his models showed it’ll be unlikely to form precipitation.
Though some forecasters have signaled potential rain at the end of next week, the general consensus is that the odds for that are low, and that any rain there may be will be too light or short-lived to contribute meaningfully to extinguishing the fires.
The chaparral shrublands around Los Angeles are supposed to burn every 30 to 130 years. “There are high concentrations of terpenes — very flammable oils — in that vegetation; it’s made to burn,” Scopa, the veteran firefighter, told me.
What isn’t normal, though, is the amount of rain Los Angeles got ahead of this past spring — 52.46 inches in the preceding two years, the wettest period in the city’s history since the late 1800s — which was followed by a blisteringly hot summer and a delayed start to this year’s rainy season. Since October, parts of Southern California have received just 10% of their normal rainfall.
This “weather whiplash” is caused by a warmer atmosphere, which means that plants will grow explosively due to the influx of rain and then dry out when the drought returns, leaving lots of dry fuels ready and waiting for a spark. “This is really, I would argue, a signature of climate change that is going to be experienced almost everywhere people actually live on Earth,” Daniel Swain, a climate scientist at the University of California, Los Angeles, who authored a new study on the pattern, told The Washington Post.
We know less about how climate change may affect the Santa Anas, though experts have some theories.
At least 12,000 structures have burned so far in the fires, which is already exacerbating the strain on the Los Angeles housing market — one of the country’s tightest even before the fires — as thousands of displaced people look for new places to live. “Dozens and dozens of people are going after the same properties,” one real estate agent told the Los Angeles Times. The city has reminded businesses that price gouging — including raising rental prices more than 10% — during an emergency is against the law.
Los Angeles had a shortage of about 370,000 homes before the fires, and between 2021 and 2023, the county added fewer than 30,000 new units per year. Recovery grants and federal aid can lag, and it often takes more than two years for even the first Housing and Urban Development Disaster Recovery Grants’ expenditures to go out.
My colleague Matthew Zeitlin wrote for Heatmap that the economic impact of the Los Angeles fires is already much higher than that of other fires, such as the 2018 Camp Fire, partly because of the value of Pacific Palisades real estate.
The wildfires may “deal a devastating blow to [California’s] fragile home insurance market,” Heatmap’s Matthew Zeitlin wrote last week. In recent years, home insurers have left California or declined to write new policies, at least partially due to the increased risk of wildfires in the state.
Depending on the extent of the damage from the fires, the coffers of California’s FAIR Plan — which insures homeowners who can’t get insurance otherwise, including many in Pacific Palisades and Altadena — could empty, causing it to seek money from insurers, according to the state’s regulations. As Zeitlin writes, “This would mean that Californians who were able to buy private insurance — because they don’t live in a region of the state that insurers have abandoned — could be on the hook for massive wildfire losses.”
First and foremost, sign up for all relevant emergency alerts. Make sure to turn on the sound on your phone and keep it near you in case of a change in conditions. Pack a “go bag” with essentials and consider filling your gas tank now so that you can evacuate at a moment’s notice if needed. Read our guide on what to do if you get a pre-evacuation or an evacuation notice ahead of time so that you’re not scrambling for information if you get an alert.
The free Watch Duty app has become a go-to resource for people affected by the fires, including friends and family of Angelenos who may themselves be thousands of miles away. The app provides information on fire perimeters, evacuation notices, and power outages. Its employees pull information directly from emergency responders’ radio broadcasts and sometimes beat official sources to disseminating it. If you need an endorsement: Emergency responders rely on the app, too.
There are many scams in the wake of disasters as crooks look to take advantage of desperate people — and those who want to help them. To play it safe, you can use a hub like the one established by GoFundMe, which is actively vetting campaigns related to the L.A. fires. If you’re looking to volunteer your time, make a donation of clothing or food, or if you’re able to foster animals the fire has displaced, you can use this handy database from the Mutual Aid Network L.A. There are also many national organizations, such as the Red Cross, that you can connect with if you want to help.
The City of Los Angeles and the Los Angeles Fire Department have asked that do-gooders not bring donations directly to fire stations or shelters; such deliveries can interfere with emergency operations. Their website provides more information about how you can help — productively.
Plus news on cloud seeding, fission for fusion, and more of the week’s biggest money moves.
From beaming solar power down from space to shooting storm clouds full of particles to make it rain, this week featured progress across a range of seemingly sci-fi technologies that have actually been researched — and in some cases deployed — for decades. There were, however, few actual funding announcements to speak of, as earlier-stage climate tech venture funds continue to confront a tough fundraising environment.
First up, I explore Meta’s bet on space-based solar as a way to squeeze more output from existing solar arrays to power data centers. Then there’s the fusion startup Zap Energy, which is shifting its near-term attention toward the more established fission sector. Meanwhile, a weather modification company says it’s found a way to quantify the impact of cloud seeding — a space-age-sounding practice that’s actually been in use for roughly 80 years. And amidst a string of disappointments for alternate battery chemistries, this week brings multiple wins for the sodium-ion battery sector.
One might presume that terrestrial solar paired with batteries would prove perfectly adequate for securing 24/7 clean energy moving forward, as global prices for panels and battery packs continue to fall. But the startup Overview Energy, which uses lasers to beam solar power from space directly onto existing solar arrays, thinks its space-based solar energy systems will prove valuable for powering large loads like data centers through the night. Now Meta is backing that premise, signing a first-of-its-kind agreement with Overview this week that secures early access for up to a gigawatt of capacity from the startup’s system.
Initial orbital demonstrations are slated for 2028, with commercial power delivery targeted for 2030. It’s an ambitious timeline, and certainly not the first effort to commercialize space-based solar; prior analyses have generally concluded that while the physics check out, the economics and logistics don’t. Overview Energy thinks it’s found the core unlocks, though: “geographic untethering,” which allows it to direct its beam to ground-based solar arrays anywhere in the world based on demand, and high-efficiency lasers capable of converting near-infrared light into electricity much more efficiently than pure sunlight.
The startup is targeting between $60 and $100 per megawatt-hour by 2035, at which point the goal is to be putting gigawatts of space solar on the grid. “It’s 5 o’clock somewhere,” Marc Berte, founder and CEO of Overview Energy, told me when I interviewed him last December. “You’re profitable at $100 bucks a megawatt-hour somewhere, instantaneously, all the time.”
Launch costs have also fallen sharply since the last serious wave of space-solar research, and Overview has already booked a 2028 launch with SpaceX. Solar power beamed from space also sidesteps two earthly constraints — land use and protracted grid interconnection timelines. So while this seemingly sci-fi vision remains unproven, it might be significantly more plausible than it once appeared. And Meta’s certainly not alone in taking that bet — Overview has already raised a $20 million seed round led by Lowercarbon Capital, Prime Movers Lab, and Engine Ventures.
Fusion startups are increasingly looking to nearer-term revenue opportunities as they work toward commercializing the Holy Grail of energy generation. Industry leader Commonwealth Fusion Systems is selling its high-temperature superconducting magnets to other developers, while other companies including Shine Technologies are generating income by producing nuclear isotopes for medical imaging. Now one startup, Zap Energy, is pushing that playbook a step further, announcing this week that it plans to develop fission reactors before putting its first fusion electrons on the grid.
Specifically, the startup is now attempting to develop small modular reactors — hardly a novel idea, as companies like Oklo, Kairos, and TerraPower have already secured significant public and private funding and struck major data center deals. Zap, however, thinks it can catch up to these new competitors in part by leveraging design commonalities between fission and fusion systems, including the use of liquid metals, engineered neutron environments, and high-power-density systems. “Fission and fusion are two expressions of the same underlying physics,” Zap’s co-founder Benj Conway said in the press release. “This isn’t a pivot — by integrating them into a single platform, we can move faster, reduce risk, and build a more enduring company.”
As the company outlines on its website, pursuing both pathways could eventually manifest in the development of a hybrid fusion-fission system, while also giving Zap practical experience interfacing with regulators and securing approvals. As The New York Times reports, the company is targeting an early 2030s timeline for its fission reactors, although Zap has yet to specify a timeline for fusion commercialization. Like so many of its peers, the company is eyeing data centers as a promising initial market, though bringing its first units online will likely require a significant influx of additional capital.
For all the concern surrounding geoengineering fixes for climate change such as solar radiation management, there’s one form of weather modification that’s been in use since the 1940s — cloud seeding. This practice typically involves flying planes into the center of storms and releasing flares that disperse a chemical called silver iodide into the clouds. This causes the water droplets within the clouds to freeze, increasing the amount of precipitation that falls as either rain or snow.
Alarming as it may sound to the uninitiated, there’s no evidence that silver iodide causes harm at current usage levels. But what has been far more difficult to pin down is efficacy — specifically, how much additional precipitation cloud seeding actually creates. That’s where the startup Rainmaker comes in. The company, which deploys unmanned drones to inject the silver iodide, says that its advanced radar and satellite systems indicate that its operations generated over 143 million gallons of additional freshwater in Oregon and Utah this year — roughly equivalent to the annual water usage of about 1,750 U.S. households. The findings have not yet been peer reviewed, but if accurate, they would make Rainmaker the first private company to quantify the impact of its cloud seeding operations.
Cloud seeding is already a well-oiled commercial business, with states, utility companies, and ski resorts alike using it to increase snowfall in the drought-stricken American West, and with programs operating worldwide — China in particular spends tens of millions of dollars per year on the technology. Rainmaker has a particular aspiration: to help restore Utah’s Great Salt Lake, which has been shrinking since the 1980s amid rising water demand and increased evaporation driven by warmer temperatures.
In a press release, the company’s 26-year-old founder and CEO Augustus Doricko said, “With the newfound capability to measure our yields and quantify our results, Rainmaker will go forward and continue our mission to refill the Great Salt Lake, end drought in the American West and deliver water abundance wherever it is needed most around the world.”
Sodium-ion batteries have long been touted as an enticing alternative — or at least complement — to lithium-ion systems for energy storage. They don’t rely on scarce and costly critical minerals like lithium, nickel, or cobalt, and have the potential to be far less flammable. The relatively nascent market also offers an opening for the U.S. to gain a foothold in this segment of the battery supply chain. But especially domestically, the industry has struggled to gain traction. Two sodium-ion startups, Natron and Bedrock Materials, closed up shop last year as prices for lithium-iron-phosphate batteries cratered, eroding sodium-ion’s cost advantage, while the cost of manufacturing batteries in the U.S. constrained their ability to scale.
But one notable bright spot is the startup Alsym Energy, which announced this week that it has signed a letter of intent with long-duration energy storage company ESS Inc. for 8.5 gigawatt-hours of sodium-ion cells and modules, marking ESS’s expansion into the short- and medium-duration storage market. Alsym’s CEO, Mukesh Chatter, told me this represents the largest deal for sodium-ion batteries in the U.S. to date — although it’s not yet a binding contract. Notably, it came just a day after the world’s largest-ever order for these batteries, as CATL disclosed a 60 gigawatt-hour sodium-ion agreement with energy storage integrator HyperStrong. Taken together, these partnerships suggest the sector is finally picking up durable traction both domestically and abroad.
ESS, however, is facing its own operational headwinds, nearly shuttering its Oregon manufacturing plant last year before securing an unexpected cash infusion and pivoting to a new, longer-duration storage product. Still, Chatter remains exuberant about Alsym’s deal with the storage provider, telling me it represents a major proof point in terms of broader industry acceptance and an acknowledgement that “the benefits [sodium-ion] brings to the table are significant enough to overcome any stickiness” and hesitation around adopting new battery chemistries.
Chatter said that interest is now pouring in from all sides, citing inquiries from lithium-ion battery manufacturers, utilities, and defense companies and highlighting use cases ranging from data centers to apartment buildings and mining operations as likely early deployment targets.
A handful of startups are promising better, cheaper, safer water purification tech.
The need for desalination has long been clear in water-scarce regions of the planet. But with roughly a quarter of the global population now facing extreme water stress and drought conditions only projected to intensify, the technology is becoming an increasingly necessary tool for survival in a wider array of geographies.
Typically, scaling up desalination infrastructure has meant building costly, energy-intensive coastal plants that rely on a process called reverse osmosis, which involves pushing seawater through semi-permeable membranes that block salt and other contaminants, leaving only fresh water behind. Now, however, a number of startups are attempting to rework that model, with solutions that range from subsea facilities to portable desalination devices for individuals and families.
They could find potential customers across the globe. Many countries in the Middle East — including Saudi Arabia, Israel, Bahrain, Kuwait, and Qatar — rely on desalination for the bulk of their municipal water. Meanwhile, drought-prone regions from Australia to the Caribbean and California have also turned to the technology to shore up supply. But as the Iran war has underscored, this vital infrastructure is increasingly being treated as a military target, exposing a significant vulnerability in a resource relied upon by hundreds of millions.
One more resilient alternative is to move the plants underwater — making them more difficult to target while also harnessing subsurface pressure to do some of the energy-intensive work of desalination.
“I came up with the idea of using natural pressure to run the process,” Robert Bergstrom, a veteran of the water industry and CEO of the desalination startup OceanWell, told me. That meant “putting the membranes in a place where it’s already 800 pounds [of pressure] per square inch” — i.e., inside pods on the ocean floor, each capable of producing 1 million gallons of freshwater daily. By using the natural pressure of the ocean to drive the reverse osmosis process, this approach cuts energy use by about 40%, he said, thus slashing the system’s largest operating cost: electricity.
OceanWell’s design maintains a lower internal pressure within each pod than the surrounding environment, causing seawater to flow passively inside and push through membranes — just like on land, but without the high-pressure pumps. Compact pumps inside the pods then push the freshwater up a pipeline to the shore, while the resulting brine dissipates in the deep ocean.
The method also helps solve another problem with conventional desalination: environmental impact. Today’s facilities typically produce a more concentrated brine that they discharge at the ocean’s surface, which is more disruptive to marine ecosystems. The plants also frequently cause damage to organisms large and small by either trapping them against water intake screens or pulling them into the plant itself. That’s been a big sticking point when it comes to permitting these facilities, especially in California where the startup is based. OceanWell’s system, Bergstrom said, is able to filter out larger organisms while allowing microscopic ones to pass through the pods and return to the ocean.
The company began a trial last year in partnership with Las Virgenes Municipal Water District in Southern California, testing its system in a freshwater reservoir full of marine life to verify its safety. Next it will test its pods in the ocean before undertaking a pilot in a to-be-determined location — California, Hawaii, and Nice in southern France are all contenders. If all goes according to plan, OceanWell will follow that up with a full-fledged commercial system targeted for 2030.
But it’s not the only startup pursuing underwater desalination — or even the one with the most aggressive timeline. Two years ago, Norwegian startup Flocean spun out of the subsea pump specialist FSubsea with a similar technical approach and a plan to deploy its first commercial system off Norway’s western coast this year. Flocean has already logged over a year of testing in the deep ocean, a stage OceanWell has yet to reach.
OceanWell thinks it can differentiate itself by meeting the unusually stringent permitting required in California. “If we can get it done in California, then the rest of the world will follow,” Bergstrom told me, meaning more resilient, more energy-efficient freshwater infrastructure for all. But it’s a high bar. The last major effort to build a desalination facility in the state led to a long-running fight that ended in 2022 with a rejection. Over 100 groups opposed the facility proposed for Orange County, citing risks to marine life, as well as high energy requirements and costs, with many arguing that alternatives — such as conservation and wastewater treatment — would be superior options.
Megan Mauter, an associate professor of civil engineering at Stanford, thinks the groups may have a point, especially when it comes to overall system costs. The high capex of desalination can be hard to justify in California, she told me, since the state doesn’t need it 100% of the time, only in bad drought years. For example, just a few weeks ago, The Wall Street Journal reported that San Diego County’s desalination plant, by far the largest in California, now has a surplus of desalinated water that it’s looking to sell to drought-ridden Western states such as Nevada and Arizona.
And while desalination startups purport to cut overall system costs, she has her doubts about that. “The energy savings that they’re going to get are offset by some pretty high increased costs of the other elements of their plant designs,” Mauter told me. “In a subsea system, you’ve got these unproven and not mass-manufactured skids. You’ve got subsea installation, and then mooring it, and putting in pipelines that you’ve got to maintain all the way to land. You’ve got to convey water back to shore, which takes energy, and you are going to have significantly higher maintenance burdens in an open ocean environment.”
Despite her reservations, she certainly sees the appeal of non-traditional water sources, “even at costs that would have been totally infeasible a decade ago.” Municipal planners are staring down a future of worsening drought at the same time that states in the Colorado River basin remain locked in contentious negotiations over water rights, debating how to allocate cuts as river flows have declined nearly 20% since 2000. California’s narrow continental shelf also makes it an ideal environment for subsea desalination: deep water close to shore lets the system harness the pressure at depth while minimizing the length of pipeline needed to carry freshwater back to land. Norway’s coastline offers a similar advantage.
“I don’t know whether the cost gaps can be solved, but I bet that the technology gaps could be solved,” Mauter told me.
Ultimately, she thinks the binding constraint is likely to be regulatory rather than technical. “Permitting is going to be a nightmare unless something fundamentally changes,” she said. Bergstrom told me that OceanWell is currently working with the California State Water Resources Control Board to revise its rules that govern desalination facilities in order to account for new technologies, though how long that process will take is anyone’s guess.
There’s one idea emerging in this ecosystem that largely sidesteps the regulatory constraints that control our land and seas. The startup Vital Lyfe has developed a portable desalination unit roughly the size of a small cooler that allows individuals and households to produce freshwater on demand with reverse osmosis — effectively decentralizing the desalination industry in the same way that the startup’s founders, former SpaceX engineers, helped decentralize internet infrastructure with Starlink.
“We’ve seen this paradigm shift coming out of Starlink that traditional, large, centralized, systems are very expensive,” Vital Lyfe CEO Jon Criss told me. “They’re hard to deploy and hard to scale up when you really need them.”
After raising a $24 million seed round in December, the startup launched its first product a few weeks ago, which retails for $750. At that price point, it’s a great deal for sailors spending days or weeks at sea, but likely too expensive for individuals in remote communities far from water infrastructure who might need it most. Criss’s goal is to iterate quickly on this first product and bring more affordable models to market in short order.
Portable desalination devices aren’t anything new in and of themselves — they’ve been used in military, maritime, and humanitarian scenarios for decades. The startup’s breakthrough, Criss explained, is more about manufacturing efficiency than technology. “We went all the way back, looked at why every component was designed and how to redesign it for high rate manufacturing. So we were able to substantially drop the cost of ownership and operation of these things.”
You’ll soon find Vital Lyfe’s product in big box retail stores, Criss said, though he also aims to partner with large-scale desalination facilities and utilities to help boost their output. Either way, the startup is already generating buzz — it’s seen significant inbound interest as of late, as the inherent resilience of its small system stands in sharp contrast to the vulnerability of conventional desalination infrastructure now being targeted in the Middle East.
The company is scaling up to meet the moment, building out a facility in Los Angeles County that Criss said will eventually produce 120 portable units per hour. He’s aiming to start production by summer’s end, ramping to full capacity by October. “Within the next three years we plan to account for about 10% of total membrane production at Vital Lyfe alone,” he told me, referring specifically to production for the desalination industry.
The future of the industry, of course, could look like any combination of all of these approaches — portable devices, conventional plants on land, and modular systems at sea. What seems certain is that as the globe continues to heat up, so will desalination tech.
Why local governments are getting an earful about “infrasound”
As the data center boom pressures counties, cities, and towns into fights over noise, the trickiest tone local officials are starting to hear complaints about is one they can’t even hear – a low-frequency rumble known as infrasound.
Infrasound is sound at frequencies too low for humans to hear, generally below about 20 hertz. These are the sorts of vibrations and pressure waves at the heart of earthquakes and volcanic activity. Infrasound can come from anything from the shock wave of a sonic boom or an explosion to minute changes in air pressure around HVAC systems and refrigerators.
Knowing that some of these facilities can also produce significant audible noise, the more tech-skeptical and health-anxious corners of the public are fretting that some data centers could be making a lot of infrasound, too: the whir of so many large computational machines, combined with cooling fans and other equipment pushing so many new columns of airflow. Add to that any rotating on-site power generation – think natural gas turbines, for example – and you get quite a lot of movement that, they say, could produce infrasound.
Some of the virality of this chatter about infrasound and data centers comes from a video created by audio engineer and researcher Benn Jordan. Currently sitting at more than 1 million views, the short YouTube film documents claims that some data centers are operating like “acoustic weapons” and harming people with infrasound. Andy Masley, an “effective altruist” writer, has become the chief critic of the Jordan video, getting into a back-and-forth that has elevated the issue into full-blown internet discourse.
The Jordan-Masley infrasound debate is honestly a bit of a mess. So I want to be clear: I’m not going to get into the science of whether infrasound poses any kind of public health risk in this article. We can get to that later. It’s worth saying that the subject may need more study and that such work is ongoing. Talking about infrasound at all can also honestly make you sound a little wacky (see: this study blaming ghost sightings on infrasound). It might remind you of another panic of the Electric Age: electromagnetic fields, also known as EMFs. Developers of transmission lines and solar projects have long had to deal with people worried that large electrical equipment might be glowing with invisible, unhealthy radiation.
In late 2024, I wrote about how an RFK Jr. supporter worried about this form of electrical emission was helping lead the fight against a transmission line in New Jersey for offshore wind. Maybe that’s why it didn’t surprise me one bit when the Health and Human Services secretary himself told a U.S. Senate Committee last week that he was asking the Surgeon General’s office to “do either meta reviews” or “base studies” on noise pollution and EMF radiation from data centers “so we can better inform the American public.”
“There’s a range of injuries that are very, very well documented. They’re neurological – very, very grave neurological injuries, cancer risk,” Kennedy told the Senate Health, Education, Labor and Pensions Committee on April 22 in response to a request from Sen. Josh Hawley of Missouri to study the issue. “The risks, to me, are tremendous.”
There’s also the unfortunate reality that infrasound claims have previously been used as a cudgel to slow down renewable energy deployment. Wind turbines create infrasound when one turbine rotates at a slightly different pace than another, generating subharmonic frequencies – a slightly dissonant, low-frequency noise. Groups like the Heartland Institute proudly list this infrasound as one of the reasons wind energy “menaces man and nature.”
But regardless of merit, this concern is already impacting local government decisions around data center projects, much like how one Michigan county sought to restrict solar energy on the same basis.
In February, Adrian Shelley, the Texas director for the environmental group Public Citizen, implored the city of Red Rock to study changing its noise ordinance to account for infrasound. “It has effects on sleep patterns, on stress, on cardiovascular health, and it is potentially a very serious concern,” Shelley said at a February 11 city council discussion on data center rules. “It will not be covered by the city’s noise ordinance, which only deals with audible sound.”
Earlier this month in Calvert County, Maryland, a volunteer on the county’s environmental commission told the county government that infrasound needs to be factored into its future data center planning. “It will have significant impacts on our region and the Chesapeake and the Patuxent because infrasound isn’t stopped by walls,” commission member Janette Wysocki, a proud land conservationist, said at an April 15 hearing. “It will keep going, it will move through anything. It’s a very long wavelength. So we need to protect our ecosystem.” Wysocki implored the county to consider adjusting its noise regulations.
Around the same time, similar concerns were raised in Lebanon, a small city in east-central Pennsylvania. “It permeates through concrete walls, it permeates through the ground,” Thomas Dompier, an associate professor at Lebanon Valley College, said at an April 16 Lebanon County commission hearing on data centers.
Lastly, last week I explained how Loudoun County wants to rethink its noise ordinance to deal with low-frequency “hums” from data centers – a concern echoing those who fret about infrasound.
Ethan Bourdeau, executive director of standards at Quiet Parks International and a career acoustician and building standards writer, told me that what makes data centers unique is the “constant drone” of noise that could potentially carry subharmonic frequencies. Bourdeau said cities or counties could factor concerns about infrasound into noise ordinances to address those who are most worried. One way would be to change how decibels are weighted in the government’s measurements: A-weighting, the most common form of sound measurement, is geared toward humanly perceptible noise, while alternatives like C-weighting or G-weighting capture the sub-audible low frequencies that A-weighting filters out.
“These are reporting and weighting systems where a sound level meter taking background noise receives all the unweighted sound and then you apply all these filters afterwards, like an EQ curve,” Bourdeau said.
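Bourdeau’s point about weighting can be made concrete. The formulas below are the standard IEC 61672 A- and C-weighting curves (the comparison frequencies are my own illustrative picks, not anything from the interview); here’s a minimal Python sketch:

```python
import math

def a_weight_db(f):
    """IEC 61672 A-weighting gain in dB at frequency f (hertz)."""
    f2 = f * f
    ra = (12194.0**2 * f2**2) / (
        (f2 + 20.6**2)
        * math.sqrt((f2 + 107.7**2) * (f2 + 737.9**2))
        * (f2 + 12194.0**2)
    )
    return 20 * math.log10(ra) + 2.0  # offset normalizes the curve to 0 dB at 1 kHz

def c_weight_db(f):
    """IEC 61672 C-weighting gain in dB at frequency f (hertz)."""
    f2 = f * f
    rc = (12194.0**2 * f2) / ((f2 + 20.6**2) * (f2 + 12194.0**2))
    return 20 * math.log10(rc) + 0.06  # offset normalizes the curve to 0 dB at 1 kHz

# How much of a tone each weighting lets through, from infrasound up to mid-range:
for f in (10, 20, 100, 1000):
    print(f"{f:>5} Hz   A: {a_weight_db(f):6.1f} dB   C: {c_weight_db(f):6.1f} dB")
```

At 10 hertz, deep in the infrasound range, the A curve discounts the signal by roughly 70 decibels while the C curve discounts it by only about 14, which is why a machine can register as near-silent on a standard A-weighted meter yet still be putting out substantial low-frequency energy.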
So I guess if those most concerned about infrasound have their way, a lot of county commissioners and local elected leaders will be heading to the mixing booth.