We didn’t know it was coming. We didn’t know where it came from. We still can’t nail its climate change connection.

What happened?
No, seriously — what happened?
Last week, the American megalopolis, that string of jewels on the old Atlantic coast, found itself shrouded by wildfire smoke. New York City’s air turned ashen and dun, then glowed a supernatural amber. For the first time in who-knows, sightseers standing at the U.S. Capitol Building could not see the Washington Monument, a mile and change down the Mall.
You could list the sports games canceled or flights delayed, but what was oddest about the event was the sheer ubiquity of it. This was one of the few news stories I can remember where you could look up from whatever article you were reading and see the story itself, softly lapping at your window.
And then it was gone. By Friday, the haze had blown out to sea.
It was, in retrospect, a strange time — deeply strange, humblingly strange, strange before almost any other quality. More than 128 million Americans were under an air-quality alert on Wednesday night — roughly the population of Germany and Spain combined — but scarcely 36 hours earlier, nobody had known to prepare for anything worse than a moderate haze. The country’s biggest wildfire-pollution event on record arrived essentially out of nowhere.
One of the main modes of journalism today — but also, frankly, one of the main frames of our whole cultural apparatus, from TikToks that lightly gloss Wikipedia articles to rampant right-wing conspiracism — is that of the explanation. Everyone explains what’s happening to each other, as it happens, at all times, and therefore makes the world seem rational, empirical, and less frightening. Yet with the wildfire smoke, what is so striking is how little we understood. Nearly every important step in the story was misapprehended as it happened.
We didn’t know that the smoke was coming, for instance. On Tuesday morning, meteorologists predicted that the same moderate haze that has hung around all season would again hit the East Coast. The New York Department of Environmental Conservation put out an alert saying that the air quality index, or AQI, might rise to 150 across the state.
They did not forecast — nobody, as far as I can tell, did — that the worst air pollution in decades would soon wallop the state. By Tuesday night, New York City’s AQI had already reached 174, according to Environmental Protection Agency data. As I walked home in D.C., I could see tendrils of visible smoke hugging the upper stories of apartment buildings.
This prediction failure gave the ensuing response a halting, confused quality. How could such a massive event come out of nowhere? Not until Thursday afternoon — when the smoke had nearly passed — did the federal government advise its workers that they could telework or take vacation time to avoid the bad air. On Friday, New York closed in-person schools, just in time for blue sky to return.
I find it hard to blame them. This was an unprecedented event in part because the fires were so far from the places they affected. Everyone had seen the videos of smoke besieging Portland and San Francisco in 2020, but back then, the fires had been near those cities — a couple hundred miles away at most. Where was the smoke coming from now? The Adirondacks were fine. Vermont wasn't burning.
Here, we misunderstood again. Many outlets — including this one, at first — initially reported that the smoke came from Nova Scotia, where large and destructive fires had raged the week before. But those fires had been doused over the weekend by some of the same weather pattern that was now ferrying smoke to us. In fact, the smoke had come from the boreal forests of northern Quebec, more than 500 miles from New York City.
Why were these fires raging? Not even Canadians could give a good answer. With fires in Alberta and Nova Scotia gobbling attention and resources, the Quebec fires had seemingly been an afterthought until their smoke blew into Toronto and Ottawa, which happened only a few hours before it arrived in New York. Suddenly, a secondary event had become the main event.
On Wednesday, I talked to a Canadian climatologist who seemed hazy about why Quebec was burning in the first place. “I think this situation is kind of similar to Nova Scotia,” he told me, blaming that province’s warm, dry spring for the blazes. But this explanation — which appeared in many outlets — was only somewhat true: While Quebec had suffered a warm May, it was not in drought.
We did not understand why these fires were burning — and honestly, we still don't. President Joe Biden said that the smoke provided "another stark reminder of the impacts of climate change." I am not so sure. There's no doubt, to be clear, that climate change will make wildfires worse across North America: The Intergovernmental Panel on Climate Change says that hot, dry "fire weather" will increase throughout the 21st century. But, again, Quebec is not in drought. As for today, no climate-change signal has appeared in eastern Canadian wildfire data. Their connection to climate change is far less clear cut than it is in, say, California's blazes.
Yet neither would I condescend to someone who does blame climate change here. When something like this happens, how can you not cite the planet's biggest ongoing physical transformation? If climate change makes flukey weather more likely, shouldn't we at least consider that it could be responsible for some of the flukiest weather in decades? The thing about unprecedented events is that you lack precedent for them.
Not that we completely lack an example for this. In 1780, the sun was blotted out across New England. Nocturnal animals came out; people fretted in the streets and abandoned their work; the Connecticut state legislature considered adjourning for doomsday. Not until a decade ago did we finally learn that the “dark day” was caused by Canadian wildfire smoke drifting south.
Which suggests that this might be a once-in-250-year event. But maybe it isn't anymore. Maybe with climate change, it's a once-a-century event. Or a once-in-a-decade event. For now, the sample size is two.
So I wonder: If it wasn’t climate change, would it matter? The pandemic has already taught us that indoor air quality matters, that unseen particles floating in the air can do serious harm. No matter what happens with the climate, Canada is too large and unpopulated to fight every wildfire; neither can it manage the same kind of labor-intensive forest management that California might attempt. East Coasters should come away from our own dark days with new compassion for people out West — and those across the world — who must deal with wildfire smoke on a seasonal basis, not to mention the fires themselves. Regardless of climate change’s role in this fire, it makes wildfires more likely: We should continue to try to decarbonize as fast as we can.
But as for local policy, perhaps our aims should be humbler. We now know (again) that a great cloud of wildfire smoke can blow up on the East Coast at any moment and poison our air. We don't need to know everything to protect ourselves and our neighbors from that. Air filters cost hundreds of dollars, not thousands; in new multi-family buildings, they are built into the ventilation system itself. Perhaps the right lesson from this outbreak is to change our expectations, and to think of indoor air filtering like brushing your teeth — a habit essential to our hygiene, to be practiced by all, and to be provisioned at public expense for those who cannot afford it.
Maybe that’s prudent climate adaptation. Or maybe — in the wake of COVID, Canadian smoke, and who-knows-what-comes-next — it’s just new common sense.
Plus news on cloud seeding, fission for fusion, and more of the week’s biggest money moves.
From beaming solar power down from space to shooting storm clouds full of particles to make it rain, this week featured progress across a range of seemingly sci-fi technologies that have actually been researched — and in some cases deployed — for decades. There were, however, few actual funding announcements to speak of, as earlier-stage climate tech venture funds continue to confront a tough fundraising environment.
First up, I explore Meta’s bet on space-based solar as a way to squeeze more output from existing solar arrays to power data centers. Then there’s the fusion startup Zap Energy, which is shifting its near-term attention toward the more established fission sector. Meanwhile, a weather modification company says it’s found a way to quantify the impact of cloud seeding — a space-age sounding practice that’s actually been in use for roughly 80 years. And amidst a string of disappointments for alternate battery chemistries, this week brings multiple wins for the sodium-ion battery sector.
One might presume that terrestrial solar paired with batteries would prove perfectly adequate for securing 24/7 clean energy moving forward, as global prices for panels and battery packs continue to fall. But the startup Overview Energy, which uses lasers to beam solar power from space directly onto existing solar arrays, thinks its space-based solar energy systems will prove valuable for powering large loads like data centers through the night. Now Meta is backing that premise, signing a first-of-its-kind agreement with Overview this week that secures early access for up to a gigawatt of capacity from the startup’s system.
Initial orbital demonstrations are slated for 2028, with commercial power delivery targeted for 2030. It's an ambitious timeline, and certainly not the first effort to commercialize space-based solar, though prior analyses have generally concluded that while the physics check out, the economics and logistics don't. Overview Energy thinks it's found the key unlocks, though: "geographic untethering," which allows it to direct its beam to ground-based solar arrays anywhere in the world based on demand, and high-efficiency cells capable of converting near-infrared laser light into electricity much more efficiently than pure sunlight.
The startup is targeting between $60 and $100 per megawatt-hour by 2035, at which point the goal is to be putting gigawatts of space solar on the grid. “It’s 5 o’clock somewhere,” Marc Berte, founder and CEO of Overview Energy, told me when I interviewed him last December. “You’re profitable at $100 bucks a megawatt-hour somewhere, instantaneously, all the time.”
Launch costs have also fallen sharply since the last serious wave of space-solar research, and Overview has already booked a 2028 launch with SpaceX. Solar power beamed from space also sidesteps two earthly constraints — land use and protracted grid interconnection timelines. So while this seemingly sci-fi vision remains unproven, it might be significantly more plausible than it once appeared. And Meta’s certainly not alone in taking that bet — Overview has already raised a $20 million seed round led by Lowercarbon Capital, Prime Movers Lab, and Engine Ventures.
Fusion startups are increasingly looking to nearer-term revenue opportunities as they work toward commercializing the Holy Grail of energy generation. Industry leader Commonwealth Fusion Systems is selling its high-temperature superconducting magnets to other developers, while other companies including Shine Technologies are generating income by producing nuclear isotopes for medical imaging. Now one startup, Zap Energy, is pushing that playbook a step further, announcing this week that it plans to develop fission reactors before putting its first fusion electrons on the grid.
Specifically, the startup is now attempting to develop small modular reactors — hardly a novel idea, as companies like Oklo, Kairos, and TerraPower have already secured significant public and private funding and struck major data center deals. Zap, however, thinks it can catch up to these new competitors in part by leveraging design commonalities between fission and fusion systems, including the use of liquid metals, engineered neutron environments, and high-power-density systems. "Fission and fusion are two expressions of the same underlying physics," Zap's co-founder Benj Conway said in the press release. "This isn't a pivot — by integrating them into a single platform, we can move faster, reduce risk, and build a more enduring company."
As the company outlines on its website, pursuing both pathways could eventually manifest in the development of a hybrid fusion-fission system, while also giving Zap practical experience interfacing with regulators and securing approvals. As The New York Times reports, the company is targeting an early 2030s timeline for its fission reactors, although Zap has yet to specify a timeline for fusion commercialization. Like so many of its peers, the company is eyeing data centers as a promising initial market, though bringing its first units online will likely require a significant influx of additional capital.
For all the concern surrounding geoengineering fixes for climate change such as solar radiation management, there’s one form of weather modification that’s been in use since the 1940s — cloud seeding. This practice typically involves flying planes into the center of storms and releasing flares that disperse a chemical called silver iodide into the clouds. This causes the water droplets within the clouds to freeze, increasing the amount of precipitation that falls as either rain or snow.
Alarming as it may sound for the uninitiated, there’s no evidence that silver iodide causes harm at current usage levels. But what has been far more difficult to pin down is efficacy — specifically, how much additional precipitation cloud seeding actually creates. That’s where the startup Rainmaker comes in. The company, which deploys unmanned drones to inject the silver iodide, says that its advanced radar and satellite systems indicate that its operations generated over 143 million gallons of additional freshwater in Oregon and Utah this year — roughly equivalent to the annual water usage of about 1,750 U.S. households. The findings have not yet been peer reviewed, but if accurate, they would make Rainmaker the first private company to quantify the impact of its cloud seeding operations.
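That 143-million-gallon figure is easy to put in context with some back-of-the-envelope arithmetic. A minimal sketch, assuming a typical U.S. household uses roughly 225 gallons of water per day (my assumption, in line with common published estimates of 200 to 300 gallons; not a number from Rainmaker):

```python
# Rough sanity check on the household comparison for Rainmaker's
# reported 143 million gallons of additional freshwater.
GALLONS_GENERATED = 143_000_000   # reported additional freshwater, gallons
GALLONS_PER_HOUSEHOLD_DAY = 225   # assumed U.S. household usage, gallons/day

annual_household_use = GALLONS_PER_HOUSEHOLD_DAY * 365
households = GALLONS_GENERATED / annual_household_use
print(round(households))  # on these assumptions, roughly 1,700 households
```

Under that assumption the math lands close to the roughly 1,750-household equivalence cited above; a higher per-household usage figure would shrink the number accordingly.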
Cloud seeding is already a well-oiled commercial business, with dozens of states, utility companies and ski resorts alike using it to increase snowfall in the drought-stricken American West and worldwide — China in particular spends tens of millions of dollars per year on the technology. Rainmaker has a particular aspiration: to help restore Utah’s Great Salt Lake, which has been shrinking since the 1980s amid rising water demand and increased evaporation driven by warmer temperatures.
In a press release, the company’s 26-year-old founder and CEO Augustus Doricko said, “With the newfound capability to measure our yields and quantify our results, Rainmaker will go forward and continue our mission to refill the Great Salt Lake, end drought in the American West and deliver water abundance wherever it is needed most around the world."
Sodium-ion batteries have long been touted as an enticing alternative — or at least complement — to lithium-ion systems for energy storage. They don’t rely on scarce and costly critical minerals like lithium, nickel, or cobalt, and have the potential to be far less flammable. The relatively nascent market also offers an opening for the U.S. to gain a foothold in this segment of the battery supply chain. But especially domestically, the industry has struggled to gain traction. Two sodium-ion startups, Natron and Bedrock Materials, both closed up shop last year as prices for lithium-iron-phosphate batteries cratered, eroding sodium-ion’s cost advantage, while the cost of manufacturing batteries in the U.S. constrained their ability to scale.
But one notable bright spot is the startup Alsym Energy, which announced this week that it has signed a letter-of-intent with long-duration energy storage company ESS Inc. for 8.5 gigawatt-hours of sodium-ion cells and modules, marking ESS’s expansion into the short and medium-duration storage market. Alsym’s CEO, Mukesh Chatter, told me this represents the largest deal for sodium-ion batteries in the U.S. to date — although it’s not yet a binding contract. Notably, it came just a day after the world’s largest-ever order for these batteries, as CATL disclosed a 60 gigawatt-hour sodium-ion agreement with energy storage integrator HyperStrong. Taken together, these partnerships suggest the sector is finally picking up durable traction both domestically and abroad.
ESS, however, is facing its own operational headwinds, nearly shuttering its Oregon manufacturing plant last year before securing an unexpected cash infusion and pivoting to a new, longer-duration storage product. Still, Chatter remains exuberant about Alsym's deal with the storage provider, telling me it represents a major proof point in terms of broader industry acceptance and an acknowledgement that "the benefits [sodium-ion] brings to the table are significant enough to overcome any stickiness" and hesitation around adopting new battery chemistries.
Chatter said that interest is now pouring in from all sides, citing inquiries from lithium-ion battery manufacturers, utilities, and defense companies and highlighting use cases ranging from data centers to apartment buildings and mining operations as likely early deployment targets.
A handful of startups are promising better, cheaper, safer water purification tech.
The need for desalination has long been clear in water-scarce regions of the planet. But with roughly a quarter of the global population now facing extreme water stress and drought conditions only projected to intensify, the technology is becoming an increasingly necessary tool for survival in a wider array of geographies.
Typically, scaling up desalination infrastructure has meant building costly, energy-intensive coastal plants that rely on a process called reverse osmosis, which involves pushing seawater through semi-permeable membranes that block salt and other contaminants, leaving only fresh water behind. Now, however, a number of startups are attempting to rework that model, with solutions that range from subsea facilities to portable desalination devices for individuals and families.
They could find potential customers across the globe. Many countries in the Middle East — including Saudi Arabia, Israel, Bahrain, Kuwait, and Qatar — rely on desalination for the bulk of their municipal water. Meanwhile, drought-prone regions from Australia to the Caribbean and California have also turned to the technology to shore up supply. But as the Iran war has underscored, this vital infrastructure is increasingly being treated as a military target, exposing a significant vulnerability in a resource relied upon by hundreds of millions.
One more resilient alternative is to move the plants underwater — making them more difficult to target while also harnessing subsurface pressure to do some of the energy-intensive work of desalination.
“I came up with the idea of using natural pressure to run the process,” Robert Bergstrom, a veteran of the water industry and CEO of the desalination startup OceanWell, told me. That meant “putting the membranes in a place where it’s already 800 pounds [of pressure] per square inch” — e.g. inside pods on the ocean floor, each capable of producing 1 million gallons of freshwater daily. By using the natural pressure of the ocean to drive the reverse osmosis process, this approach cuts energy use by about 40%, he said, thus slashing the system’s largest operating cost: electricity.
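The 800-psi figure maps onto a depth you can estimate from the hydrostatic pressure relation P = ρgh. The sketch below is my own back-of-the-envelope check, not OceanWell's engineering numbers:

```python
# Estimate the seawater depth at which ambient pressure reaches ~800 psi.
# Hydrostatic pressure: P = rho * g * h  =>  h = P / (rho * g)
PSI_TO_PA = 6894.76      # pascals per psi
RHO_SEAWATER = 1025.0    # kg/m^3, typical seawater density
G = 9.81                 # m/s^2

pressure_pa = 800 * PSI_TO_PA
depth_m = pressure_pa / (RHO_SEAWATER * G)
print(f"{depth_m:.0f} m")  # roughly 550 meters down
```

For reference, conventional seawater reverse osmosis plants run their pumps at roughly 55 to 80 bar — about 800 to 1,200 psi — which is why ambient pressure at that depth can stand in for much of the pumping work.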
OceanWell’s design maintains a lower internal pressure within each pod than the surrounding environment, causing seawater to flow passively inside and push through membranes — just like on land, but without the high-pressure pumps. Compact pumps inside the pods then push the freshwater up a pipeline to the shore, while the resulting brine dissipates in the deep ocean.
The method also helps solve another problem with conventional desalination: environmental impact. Today’s facilities typically produce a more concentrated brine that they discharge at the ocean’s surface, which is more disruptive to marine ecosystems. The plants also frequently cause damage to organisms large and small by either trapping them against water intake screens or pulling them into the plant itself. That’s been a big sticking point when it comes to permitting these facilities, especially in California where the startup is based. OceanWell’s system, Bergstrom said, is able to filter out larger organisms while allowing microscopic ones to pass through the pods and return to the ocean.
The company began a trial last year in partnership with Las Virgenes Municipal Water District in southern California, testing its system in a freshwater reservoir full of marine life to verify its safety. Next it will test its pods in the ocean before undertaking a pilot in a to-be-determined location — California, Hawaii, and Nice in southern France are all contenders. If all goes according to plan, OceanWell will follow that up with a full-fledged commercial system targeted for 2030.
But it’s not the only startup pursuing underwater desalination — or even the one with the most aggressive timeline. Two years ago, Norwegian startup Flocean spun out of the subsea pump specialist FSubsea with a similar technical approach and a plan to deploy its first commercial system off Norway’s western coast this year. Flocean has already logged over a year of testing in the deep ocean, a stage OceanWell has yet to reach.
OceanWell thinks it can differentiate itself by meeting the unusually stringent permitting required in California. "If we can get it done in California, then the rest of the world will follow," Bergstrom told me, meaning more resilient, more energy-efficient freshwater infrastructure for all. But it's a high bar. The last major effort to build a desalination facility in the state led to a long-running fight that ended in 2022 with a rejection. Over 100 groups opposed the facility proposed for Orange County, citing risks to marine life, as well as high energy requirements and costs, with many arguing that alternatives — such as conservation and wastewater treatment — would be superior options.
Megan Mauter, an associate professor of civil engineering at Stanford, thinks the groups may have a point, especially when it comes to overall system costs. The high capex of desalination can be hard to justify in California, she told me, since the state doesn’t need it 100% of the time, only in bad drought years. For example, just a few weeks ago, The Wall Street Journal reported that San Diego County’s desalination plant, by far the largest in California, now has a surplus of desalinated water that it’s looking to sell to drought-ridden Western states such as Nevada and Arizona.
And while desalination startups purport to cut overall system costs, she has her doubts about that. “The energy savings that they’re going to get are offset by some pretty high increased costs of the other elements of their plant designs,” Mauter told me. “In a subsea system, you’ve got these unproven and not mass-manufactured skids. You’ve got subsea installation, and then mooring it, and putting in pipelines that you’ve got to maintain all the way to land. You’ve got to convey water back to shore, which takes energy, and you are going to have significantly higher maintenance burdens in an open ocean environment.”
Despite her reservations, she certainly sees the appeal of non-traditional water sources, "even at costs that would have been totally infeasible a decade ago." Municipal planners are staring down a future of worsening drought at the same time that states in the Colorado River basin remain locked in contentious negotiations over water rights, debating how to allocate cuts as river flows have declined nearly 20% since 2000. California's narrow continental shelf also makes it an ideal environment for subsea desalination, as having deep water close to shore allows the system to harness pressure at depth while minimizing the length of the pipeline needed to transport freshwater to land. Norway is similarly favored.
“I don’t know whether the cost gaps can be solved, but I bet that the technology gaps could be solved,” Mauter told me.
Ultimately, she thinks the binding constraint is likely to be regulatory rather than technical. “Permitting is going to be a nightmare unless something fundamentally changes,” she said. Bergstrom told me that OceanWell is currently working with the California State Water Resources Control Board to revise its rules that govern desalination facilities in order to account for new technologies, though how long that process will take is anyone’s guess.
There’s one idea emerging in this ecosystem that largely sidesteps the regulatory constraints that control our land and seas. The startup Vital Lyfe has developed a portable desalination unit roughly the size of a small cooler that allows individuals and households to produce freshwater on demand with reverse osmosis — effectively decentralizing the desalination industry in the same way that the startup’s founders, former SpaceX engineers, helped decentralize internet infrastructure with Starlink.
“We’ve seen this paradigm shift coming out of Starlink that traditional, large, centralized, systems are very expensive,” Vital Lyfe CEO Jon Criss told me. “They’re hard to deploy and hard to scale up when you really need them.”
After raising a $24 million seed round in December, the startup launched its first product a few weeks ago, which retails for $750. At that price point, it’s a great deal for sailors spending days or weeks at sea, but likely too expensive for the individuals in remote communities far from water infrastructure that might need it most. Criss’s goal is to quickly iterate on this first product to bring more affordable models to the market in short order.
Portable desalination devices aren’t anything new in and of themselves — they’ve been used in military, maritime, and humanitarian scenarios for decades. The startup’s breakthrough, Criss explained, is more about manufacturing efficiency than technology. “We went all the way back, looked at why every component was designed and how to redesign it for high rate manufacturing. So we were able to substantially drop the cost of ownership and operation of these things.”
You’ll soon find Vital Lyfe’s product in big box retail stores, Criss said, though he also aims to partner with large-scale desalination facilities and utilities to help boost their output. Either way, the startup is already generating buzz — it’s seen significant inbound interest as of late, as the inherent resilience of its small system stands in sharp contrast to the vulnerability of conventional desalination infrastructure now being targeted in the Middle East.
The company is scaling up to meet the moment, building out a facility in Los Angeles county that Criss said will eventually produce 120 portable units per hour. He’s aiming to start production by summer’s end, ramping to full capacity by October. “Within the next three years we plan to account for about 10% of total membrane production at Vital Lyfe alone,” he told me, referring specifically to the production for the desalination industry.
The future of the industry, of course, could look like any combination of all of these approaches — portable devices, conventional plants on land, and modular systems at sea. What seems certain is that as the globe continues to heat up, so will desalination tech.
Why local governments are getting an earful about “infrasound”
As the data center boom pressures counties, cities, and towns into fights over noise, the trickiest tone local officials are starting to hear complaints about is one they can’t even hear – a low-frequency rumble known as infrasound.
Infrasound is sound at frequencies too low for humans to hear, generally below about 20 hertz. These are the sorts of vibrations and pressure waves at the heart of earthquakes and volcanic activity. Infrasound can be anything from the waves shot out by a sonic boom or an explosion to very minute changes in air pressure around HVAC systems or refrigerators.
Knowing that some of these facilities also have the capacity to produce significant audible noise, the more tech-skeptical and health-anxious corners of the public are fretting that some data centers could be making a lot of infrasound, too. The whizzing of so many large computational machines combines with cooling fans and other large devices creating new columns of airflow. Add to that any rotational onsite power generation — think natural gas turbines, for example — and you get quite a lot of movement that could, they say, potentially produce infrasound.
Some of the virality of this chatter about infrasound and data centers comes from a video about infrasound created by audio engineer and researcher Benn Jordan. Currently sitting at more than 1 million views, this short YouTube film documents claims that some data centers are operating like “acoustic weapons” through infrasound and harming people. Andy Masley, an “effective altruist” writer, has become the chief critic of the Jordan video, getting into a back-and-forth that’s raised the issue to Internet discourse territory.
The Jordan-Masley infrasound debate is honestly a bit of a mess. So I want to be clear: I'm not going to get into the science of whether or not infrasound poses any kind of public health risk in this article. We can get to that later. It's worth saying that this subject may need more study and that work is ongoing. Also, talking about infrasound at all can honestly make you sound a little wacky (see: this study blaming people seeing ghosts on infrasound). It might also remind you of another panic of the Electric Age: electromagnetic fields, also known as EMFs. Developers of transmission lines and solar projects have long had to deal with people worried that such lines and large electrical equipment could be glowing with invisible, unhealthy radiation.
In late 2024, I wrote about how an RFK Jr. supporter worried about this form of electrical emission was helping lead the fight against a transmission line in New Jersey for offshore wind. Maybe that’s why it didn’t surprise me one bit when the Health and Human Services secretary himself told a U.S. Senate Committee last week that he was asking the Surgeon General’s office to “do either meta reviews” or “base studies” on noise pollution and EMF radiation from data centers “so we can better inform the American public.”
“There’s a range of injuries that are very, very well documented. They’re neurological – very, very grave neurological injuries, cancer risk,” Kennedy told the Senate Health, Education, Labor and Pensions Committee on April 22 in response to a request from Sen. Josh Hawley of Missouri to study the issue. “The risks, to me, are tremendous.”
There’s also the unfortunate reality that infrasound claims have previously been used as a cudgel to slow down renewable energy deployment. Wind turbines can create infrasound through the subharmonic frequencies generated when one turbine rotates at a slightly different pace than another, producing a faintly dissonant low-frequency noise. Groups like the Heartland Institute proudly list this infrasound as one of the reasons wind energy “menaces man and nature.”
But regardless of merit, this concern is already impacting local government decisions around data center projects, much like how one Michigan county sought to restrict solar energy on the same basis.
In February, Adrian Shelley, the Texas director for the environmental group Public Citizen, implored the city of Red Rock to study changing its noise ordinance to account for infrasound. “It has effects on sleep patterns, on stress, on cardiovascular health, and it is potentially a very serious concern,” Shelley said at a February 11 city council discussion on data center rules. “It will not be covered by the city’s noise ordinance, which only deals with audible sound.”
Earlier this month in Calvert County, Maryland, a volunteer on the county’s environmental commission told the county government that infrasound needs to be factored into its future data center planning. “It will have significant impacts on our region and the Chesapeake and the Patuxent because infrasound isn’t stopped by walls,” commission member Janette Wysocki, a proud land conservationist, said at an April 15 hearing. “It will keep going, it will move through anything. It’s a very long wavelength. So we need to protect our ecosystem.” Wysocki implored the county to consider whether to adjust its noise regulations.
Around the same time, similar concerns were raised in Lebanon, a small city in east-central Pennsylvania. “It permeates through concrete walls, it permeates through the ground,” Thomas Dompier, an associate professor at Lebanon Valley College, said at an April 16 Lebanon County commission hearing on data centers.
And last week, I explained how Loudoun County wants to rethink its noise ordinance to deal with low-frequency “hums” from data centers – a concern echoing those who fret about infrasound.
Ethan Bourdeau, executive director of standards at Quiet Parks International and a career acoustician and building standards writer, told me that what makes data centers unique is the “constant drone” of noise, which could carry subharmonic frequencies. Bourdeau said cities or counties could factor infrasound concerns into their noise ordinances to address those who are most worried. One way to do it would be to change how decibels are weighted in the government’s measurements. A-weighting, the most common scheme for decibel meters, is geared toward noise humans can perceive and filters out frequencies below the range of hearing. Alternatives such as C-weighting or G-weighting, the latter designed specifically for infrasound, retain those sub-hearing frequencies that A-weighting discards.
“These are reporting and weighting systems where a sound level meter taking background noise receives all the unweighted sound and then you apply all these filters afterwards, like an EQ curve,” Bourdeau said.
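To see why the choice of weighting matters so much, here’s a minimal sketch, using the standard IEC 61672 formulas for the A- and C-weighting curves (G-weighting, defined separately in ISO 7196, is omitted for brevity). The function names are mine, not from any source in this story; the point is just that at 10 hertz, squarely in infrasound territory, an A-weighted meter knocks roughly 70 decibels off a tone that a C-weighted meter reduces by only about 14.

```python
import math

def a_weight(f):
    # IEC 61672 A-weighting gain in dB at frequency f (Hz).
    # Mimics human hearing, so it steeply attenuates infrasound.
    f2 = f * f
    ra = (12194.0**2 * f2**2) / (
        (f2 + 20.6**2)
        * math.sqrt((f2 + 107.7**2) * (f2 + 737.9**2))
        * (f2 + 12194.0**2)
    )
    return 20 * math.log10(ra) + 2.00  # offset normalizes to 0 dB at 1 kHz

def c_weight(f):
    # IEC 61672 C-weighting gain in dB: much flatter at low frequencies.
    f2 = f * f
    rc = (12194.0**2 * f2) / ((f2 + 20.6**2) * (f2 + 12194.0**2))
    return 20 * math.log10(rc) + 0.06  # offset normalizes to 0 dB at 1 kHz

for f in (10, 20, 100, 1000):
    print(f"{f:>5} Hz   A: {a_weight(f):7.1f} dB   C: {c_weight(f):7.1f} dB")
```

In other words, a loud infrasonic drone can all but vanish from an A-weighted reading while still registering on a C- or G-weighted one, which is exactly the measurement gap an updated ordinance would have to address.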
So I guess if those most concerned about infrasound have their way, a lot of county commissioners and local elected leaders will be heading to the mixing booth.