A desire to please the Court may have rendered the EPA’s new power plant rule a little too ineffectual.

If nothing else, give the Environmental Protection Agency credit for this: They seem to understand the assignment.
Last year, the Supreme Court struck down the Clean Power Plan, President Barack Obama’s ambitious attempt to restrict carbon pollution from power plants. That proposal never carried the force of law, and it had been held in suspended animation by the Court — and later the Trump administration — since 2016. But after President Joe Biden took office, Chief Justice John Roberts and the Court’s conservative majority revived it seemingly entirely for the sake of deeming it illegal.
The proposal went far beyond what was allowed by Congress, Roberts ruled. Normally, an EPA standard would require that power plants or factories install some kind of equipment on their smoke stacks to meet a pollution cap. “By contrast, and by design,” the Obama proposal could only be satisfied by burning less coal, the chief justice wrote. It required “generation shifting,” forcing states to get more of their power from renewable, nuclear, or natural-gas plants.
That overreached the EPA’s authority under the Clean Air Act, Roberts declared. If the EPA wanted to regulate greenhouse gases, then it needed to treat them like a normal air pollutant — and it needed to act like a normal technocratic agency. Above all, it had to keep its regulations to those that could be accomplished “inside the fenceline” of each power plant.
So last week, when the Biden administration finally unveiled its own draft attempt at regulating carbon pollution from power plants, it knew it was playing on the Court’s, well, court. And it behaved accordingly. The best thing you can say about the EPA’s new power-plant proposal — which will be one of the Biden era’s most important climate regulations — is that it was meticulously, painstakingly tailored to the Court’s demands. If Chief Justice Roberts asked for a normal rule, then the EPA has delivered one so awkwardly, self-consciously normal that it seems a little like a narc. The worst thing about the new rule is that this desire to please the Court may have rendered the rule a little too ineffectual.
If America wants to fight climate change, it must clean up its power plants. Generating abundant, cheap, zero-carbon electricity is the key to the country’s decarbonization strategy.
“If you clean up the power sector, it enables you to clean up other sectors of the economy too, through electrification,” Leah Stokes, an environmental-science professor at the University of California, Santa Barbara, told me. “Electric cars, heat pumps, induction stoves — all these machines can be fueled with clean power.”
Biden’s climate law, the Inflation Reduction Act, will slash emissions from the sector over the next decade, according to federal and independent modeling efforts. But it won’t get the sector all the way there. That’s where the new proposal is supposed to step in.
As per the Supreme Court’s request, the proposal details how every kind of power plant — even those that burn coal or natural gas — can meet their climate requirements for decades to come. It mandates a buildout of carbon capture and storage infrastructure, or CCS, for most coal and some natural-gas plants that plan to stay open long-term.
“The EPA rule makes sure everyone is on the same level-playing field. If the Inflation Reduction Act is enough to incentivize CCS in some places, the EPA is gonna make sure everyone is gonna do it,” Nick Bryner, a law professor at Louisiana State University, told me. “I think it’s designed very, very well to work in tandem with the IRA tax credits.”
If the IRA is the regulatory-friendly angel on its shoulder, then the Supreme Court’s decision last year — called West Virginia v. EPA — is the devil. The EPA’s desire to stay on the Court’s good side is even visible in the proposal’s name. Previous administrations have tried to give their power-plant rules a memorable name — Obama had the Clean Power Plan, of course, and the Trump administration christened its effort the “Affordable Clean Energy Rule,” or ACE. The Biden administration, by comparison, named the new proposal:
New Source Performance Standards for Greenhouse Gas Emissions from New, Modified, and Reconstructed Fossil Fuel-Fired Electric Generating Units; Emission Guidelines for Greenhouse Gas Emissions from Existing Fossil Fuel-Fired Electric Generating Units; and Repeal of the Affordable Clean Energy Rule
That’s the NSPSGHGNMRFFFEGU; EGGGEEFFFGU; RACE Rule for short.
I would say that the agency couldn’t have given it a more technocratic name if it tried, except that it obviously tried very hard. “Traditional approach, traditional name,” the EPA’s press office chirped when the Politico reporter Alex Guillén first noted the name. Just what the Supreme Court asked for!, they all but added. The agency is so desperate to look obedient and demure that even its social-media team has been briefed on current federal doctrine.
At the same time, the rule does “a tremendous amount to make the rule as flexible as possible given the constraints they’re working with in West Virginia v. EPA,” Bryner said. Under the proposal, some natural-gas plants can choose between installing carbon-capture equipment or burning low-carbon hydrogen.
But the rules may have erred on the side of too much flexibility, says Charles Harper, a policy analyst at Evergreen, a climate advocacy group and think tank. Evergreen and other environmental groups are worried that the rules might be too generous to fossil fuel companies. They’re focusing their criticism on two elements of the draft: its handling of natural-gas plants and coal retirements.
First, the EPA rule as proposed would not apply to an overwhelming majority of the country’s natural-gas plants.
A large share of carbon emissions from natural-gas plants come from so-called “baseload” plants that generate many hundreds of megawatts of electricity at all hours of the day. The rule focuses on these facilities, and it requires them either to install CCS equipment or to burn hydrogen fuel.
But the rule is not nearly so strict about small or medium-sized natural-gas plants. Natural-gas plants that generate less than 300 megawatts of electricity — or that run less than half the time — are essentially exempt from the rule. That exclusion covers 77% of the country’s natural-gas plants, which would face no new requirements through 2040.
It is unclear what share of carbon emissions these natural-gas plants represent. The EPA did not provide an estimate of their carbon emissions before the deadline for this story.
As a whole, natural-gas power plants emit 43% of the U.S. electricity sector’s carbon pollution, despite producing nearly twice as much power as coal.
Environmental groups say the proposal’s coal problem is simpler to fix. In the draft, the EPA puts coal-fired power plants in different categories depending on when they’re slated to retire. Plants that have no retirement date — or that will remain open after 2040 — must install equipment to capture 90% of their emissions by the year 2030. Plants shutting down after 2035 must make a cheaper set of changes. And plants due to close by 2032 don’t have to make any changes at all, so long as they don’t increase their emissions over the next decade.
Those deadlines are too far off, and the EPA should move them up when it issues a final version of the rule, Harper said. “2040 is pretty far out and would entail a lot of unabated emissions hitting the climate and human health,” he told me.
The EPA still has time to edit this proposal; it will hear public comment over the next few months and probably issue a final version of the rule next year. With the procedural issues resolved, the Supreme Court’s ability to object to that rule is limited to whether carbon capture is feasible and affordable enough to be used under the Clean Air Act.
If there is a bright spot for climate advocates in the new rule, it’s that the Biden administration — and last year’s Democratic majority in Congress — seem to have anticipated that move.
As the House was voting on the IRA last year, Representative Frank Pallone, the chair of the House Energy and Commerce Committee, put a statement in the congressional record saying that the EPA should take the IRA’s generous tax credits into account when proposing power-plant rules. The subsidies should be considered when the agency is deciding whether CCS is feasible and affordable, he said. The EPA cites Pallone’s statement in its new draft.
But ultimately it is Chief Justice John Roberts who will get to decide. Almost a decade ago, a set of conservative states sued the EPA to block it from requiring CCS. That issue has since been held in its own state of suspended animation. It may soon breathe again.
Plus news on cloud seeding, fission for fusion, and more of the week’s biggest money moves.
From beaming solar power down from space to shooting storm clouds full of particles to make it rain, this week featured progress across a range of seemingly sci-fi technologies that have actually been researched — and in some cases deployed — for decades. There were, however, few actual funding announcements to speak of, as earlier-stage climate tech venture funds continue to confront a tough fundraising environment.
First up, I explore Meta’s bet on space-based solar as a way to squeeze more output from existing solar arrays to power data centers. Then there’s the fusion startup Zap Energy, which is shifting its near-term attention toward the more established fission sector. Meanwhile, a weather modification company says it’s found a way to quantify the impact of cloud seeding — a space-age sounding practice that’s actually been in use for roughly 80 years. And amidst a string of disappointments for alternate battery chemistries, this week brings multiple wins for the sodium-ion battery sector.
One might presume that terrestrial solar paired with batteries would prove perfectly adequate for securing 24/7 clean energy moving forward, as global prices for panels and battery packs continue to fall. But the startup Overview Energy, which uses lasers to beam solar power from space directly onto existing solar arrays, thinks its space-based solar energy systems will prove valuable for powering large loads like data centers through the night. Now Meta is backing that premise, signing a first-of-its-kind agreement with Overview this week that secures early access for up to a gigawatt of capacity from the startup’s system.
Initial orbital demonstrations are slated for 2028, with commercial power delivery targeted for 2030. It’s an ambitious timeline, and certainly not the first effort to commercialize space-based solar; prior analyses have generally concluded that while the physics check out, the economics and logistics don’t. Overview Energy, however, thinks it has found the key unlocks: “geographic untethering,” which lets it direct its beam to ground-based solar arrays anywhere in the world based on demand, and high-efficiency lasers whose near-infrared light solar cells can convert into electricity far more efficiently than they can sunlight.
The startup is targeting between $60 and $100 per megawatt-hour by 2035, at which point the goal is to be putting gigawatts of space solar on the grid. “It’s 5 o’clock somewhere,” Marc Berte, founder and CEO of Overview Energy, told me when I interviewed him last December. “You’re profitable at $100 bucks a megawatt-hour somewhere, instantaneously, all the time.”
Launch costs have also fallen sharply since the last serious wave of space-solar research, and Overview has already booked a 2028 launch with SpaceX. Solar power beamed from space also sidesteps two earthly constraints — land use and protracted grid interconnection timelines. So while this seemingly sci-fi vision remains unproven, it might be significantly more plausible than it once appeared. And Meta’s certainly not alone in taking that bet — Overview has already raised a $20 million seed round led by Lowercarbon Capital, Prime Movers Lab, and Engine Ventures.
Fusion startups are increasingly looking to nearer-term revenue opportunities as they work toward commercializing the Holy Grail of energy generation. Industry leader Commonwealth Fusion Systems is selling its high-temperature superconducting magnets to other developers, while other companies including Shine Technologies are generating income by producing nuclear isotopes for medical imaging. Now one startup, Zap Energy, is pushing that playbook a step further, announcing this week that it plans to develop fission reactors before putting its first fusion electrons on the grid.
Specifically, the startup is now attempting to develop small modular reactors — hardly a novel idea, as companies like Oklo, Kairos, and TerraPower have already secured significant public and private funding and struck major data center deals. Zap, however, thinks it can catch up to these new competitors in part by leveraging design commonalities between fission and fusion systems, including the use of liquid metals, engineered neutron environments, and high-power-density systems. “Fission and fusion are two expressions of the same underlying physics,” Zap’s co-founder Benj Conway said in the press release. “This isn’t a pivot — by integrating them into a single platform, we can move faster, reduce risk, and build a more enduring company.”
As the company outlines on its website, pursuing both pathways could eventually manifest in the development of a hybrid fusion-fission system, while also giving Zap practical experience interfacing with regulators and securing approvals. As The New York Times reports, the company is targeting an early 2030s timeline for its fission reactors, although Zap has yet to specify a timeline for fusion commercialization. Like so many of its peers, the company is eyeing data centers as a promising initial market, though bringing its first units online will likely require a significant influx of additional capital.
For all the concern surrounding geoengineering fixes for climate change such as solar radiation management, there’s one form of weather modification that’s been in use since the 1940s — cloud seeding. This practice typically involves flying planes into the center of storms and releasing flares that disperse a chemical called silver iodide into the clouds. This causes the water droplets within the clouds to freeze, increasing the amount of precipitation that falls as either rain or snow.
Alarming as it may sound to the uninitiated, there’s no evidence that silver iodide causes harm at current usage levels. But what has been far more difficult to pin down is efficacy — specifically, how much additional precipitation cloud seeding actually creates. That’s where the startup Rainmaker comes in. The company, which deploys unmanned drones to inject the silver iodide, says that its advanced radar and satellite systems indicate that its operations generated over 143 million gallons of additional freshwater in Oregon and Utah this year — roughly equivalent to the annual water usage of about 1,750 U.S. households. The findings have not yet been peer reviewed, but if accurate, they would make Rainmaker the first private company to quantify the impact of its cloud seeding operations.
Cloud seeding is already a well-oiled commercial business, with dozens of states, utility companies and ski resorts alike using it to increase snowfall in the drought-stricken American West and worldwide — China in particular spends tens of millions of dollars per year on the technology. Rainmaker has a particular aspiration: to help restore Utah’s Great Salt Lake, which has been shrinking since the 1980s amid rising water demand and increased evaporation driven by warmer temperatures.
In a press release, the company’s 26-year-old founder and CEO Augustus Doricko said, “With the newfound capability to measure our yields and quantify our results, Rainmaker will go forward and continue our mission to refill the Great Salt Lake, end drought in the American West and deliver water abundance wherever it is needed most around the world.”
Sodium-ion batteries have long been touted as an enticing alternative — or at least complement — to lithium-ion systems for energy storage. They don’t rely on scarce and costly critical minerals like lithium, nickel, or cobalt, and have the potential to be far less flammable. The relatively nascent market also offers an opening for the U.S. to gain a foothold in this segment of the battery supply chain. But especially domestically, the industry has struggled to gain traction. Two sodium-ion startups, Natron and Bedrock Materials, both closed up shop last year as prices for lithium-iron-phosphate batteries cratered, eroding sodium-ion’s cost advantage, while the cost of manufacturing batteries in the U.S. constrained their ability to scale.
But one notable bright spot is the startup Alsym Energy, which announced this week that it has signed a letter-of-intent with long-duration energy storage company ESS Inc. for 8.5 gigawatt-hours of sodium-ion cells and modules, marking ESS’s expansion into the short and medium-duration storage market. Alsym’s CEO, Mukesh Chatter, told me this represents the largest deal for sodium-ion batteries in the U.S. to date — although it’s not yet a binding contract. Notably, it came just a day after the world’s largest-ever order for these batteries, as CATL disclosed a 60 gigawatt-hour sodium-ion agreement with energy storage integrator HyperStrong. Taken together, these partnerships suggest the sector is finally picking up durable traction both domestically and abroad.
ESS, however, is facing its own operational headwinds, nearly shuttering its Oregon manufacturing plant last year before securing an unexpected cash infusion and pivoting to a new, longer-duration storage product. Chatter remains exuberant about Alsym’s deal with the storage provider, however, telling me it represents a major proof point in terms of broader industry acceptance and an acknowledgement that “the benefits [sodium-ion] brings to the table are significant enough to overcome any stickiness” and hesitation around adopting new battery chemistries.
Chatter said that interest is now pouring in from all sides, citing inquiries from lithium-ion battery manufacturers, utilities, and defense companies and highlighting use cases ranging from data centers to apartment buildings and mining operations as likely early deployment targets.
A handful of startups are promising better, cheaper, safer water purification tech.
The need for desalination has long been clear in water-scarce regions of the planet. But with roughly a quarter of the global population now facing extreme water stress and drought conditions only projected to intensify, the technology is becoming an increasingly necessary tool for survival in a wider array of geographies.
Typically, scaling up desalination infrastructure has meant building costly, energy-intensive coastal plants that rely on a process called reverse osmosis, which involves pushing seawater through semi-permeable membranes that block salt and other contaminants, leaving only fresh water behind. Now, however, a number of startups are attempting to rework that model, with solutions that range from subsea facilities to portable desalination devices for individuals and families.
They could find potential customers across the globe. Many countries in the Middle East — including Saudi Arabia, Israel, Bahrain, Kuwait, and Qatar — rely on desalination for the bulk of their municipal water. Meanwhile, drought-prone regions from Australia to the Caribbean and California have also turned to the technology to shore up supply. But as the Iran war has underscored, this vital infrastructure is increasingly being treated as a military target, exposing a significant vulnerability in a resource relied upon by hundreds of millions.
One more resilient alternative is to move the plants underwater — making them more difficult to target while also harnessing subsurface pressure to do some of the energy-intensive work of desalination.
“I came up with the idea of using natural pressure to run the process,” Robert Bergstrom, a veteran of the water industry and CEO of the desalination startup OceanWell, told me. That meant “putting the membranes in a place where it’s already 800 pounds [of pressure] per square inch” — e.g. inside pods on the ocean floor, each capable of producing 1 million gallons of freshwater daily. By using the natural pressure of the ocean to drive the reverse osmosis process, this approach cuts energy use by about 40%, he said, thus slashing the system’s largest operating cost: electricity.
OceanWell’s design maintains a lower internal pressure within each pod than the surrounding environment, causing seawater to flow passively inside and push through membranes — just like on land, but without the high-pressure pumps. Compact pumps inside the pods then push the freshwater up a pipeline to the shore, while the resulting brine dissipates in the deep ocean.
The method also helps solve another problem with conventional desalination: environmental impact. Today’s facilities typically produce a more concentrated brine that they discharge at the ocean’s surface, which is more disruptive to marine ecosystems. The plants also frequently cause damage to organisms large and small by either trapping them against water intake screens or pulling them into the plant itself. That’s been a big sticking point when it comes to permitting these facilities, especially in California where the startup is based. OceanWell’s system, Bergstrom said, is able to filter out larger organisms while allowing microscopic ones to pass through the pods and return to the ocean.
The company began a trial last year in partnership with Las Virgenes Municipal Water District in southern California, testing its system in a freshwater reservoir full of marine life to verify its safety. Next it will test its pods in the ocean before undertaking a pilot in a to-be-determined location — California, Hawaii, and Nice in southern France are all contenders. If all goes according to plan, OceanWell will follow that up with a full-fledged commercial system targeted for 2030.
But it’s not the only startup pursuing underwater desalination — or even the one with the most aggressive timeline. Two years ago, Norwegian startup Flocean spun out of the subsea pump specialist FSubsea with a similar technical approach and a plan to deploy its first commercial system off Norway’s western coast this year. Flocean has already logged over a year of testing in the deep ocean, a stage OceanWell has yet to reach.
OceanWell thinks it can differentiate itself by meeting the unusually stringent permitting requirements in California. “If we can get it done in California, then the rest of the world will follow,” Bergstrom told me, meaning more resilient, more energy-efficient freshwater infrastructure for all. But it’s a high bar. The last major effort to build a desalination facility in the state led to a long-running fight that ended in 2022 with a rejection. Over 100 groups opposed the facility proposed for Orange County, citing risks to marine life, as well as high energy requirements and costs, with many arguing that alternatives — such as conservation and wastewater treatment — would be superior options.
Meagan Mauter, an associate professor of civil engineering at Stanford, thinks the groups may have a point, especially when it comes to overall system costs. The high capex of desalination can be hard to justify in California, she told me, since the state doesn’t need it 100% of the time, only in bad drought years. For example, just a few weeks ago, The Wall Street Journal reported that San Diego County’s desalination plant, by far the largest in California, now has a surplus of desalinated water that it’s looking to sell to drought-ridden Western states such as Nevada and Arizona.
And while desalination startups purport to cut overall system costs, she has her doubts about that. “The energy savings that they’re going to get are offset by some pretty high increased costs of the other elements of their plant designs,” Mauter told me. “In a subsea system, you’ve got these unproven and not mass-manufactured skids. You’ve got subsea installation, and then mooring it, and putting in pipelines that you’ve got to maintain all the way to land. You’ve got to convey water back to shore, which takes energy, and you are going to have significantly higher maintenance burdens in an open ocean environment.”
Despite her reservations, she certainly sees the appeal of non-traditional water sources, “even at costs that would have been totally infeasible a decade ago.” Municipal planners are staring down a future of worsening drought at the same time that states in the Colorado River basin remain locked in contentious negotiations over water rights, debating how to allocate cuts as river flows have declined nearly 20% since 2000. California’s narrow continental shelf also makes it an ideal environment for subsea desalination: deep water close to shore allows the system to harness the pressure at depth while minimizing the length of the pipeline needed to transport freshwater to land. Norway enjoys a similar geographic advantage.
“I don’t know whether the cost gaps can be solved, but I bet that the technology gaps could be solved,” Mauter told me.
Ultimately, she thinks the binding constraint is likely to be regulatory rather than technical. “Permitting is going to be a nightmare unless something fundamentally changes,” she said. Bergstrom told me that OceanWell is currently working with the California State Water Resources Control Board to revise its rules that govern desalination facilities in order to account for new technologies, though how long that process will take is anyone’s guess.
There’s one idea emerging in this ecosystem that largely sidesteps the regulatory constraints that control our land and seas. The startup Vital Lyfe has developed a portable desalination unit roughly the size of a small cooler that allows individuals and households to produce freshwater on demand with reverse osmosis — effectively decentralizing the desalination industry in the same way that the startup’s founders, former SpaceX engineers, helped decentralize internet infrastructure with Starlink.
“We’ve seen this paradigm shift coming out of Starlink that traditional, large, centralized, systems are very expensive,” Vital Lyfe CEO Jon Criss told me. “They’re hard to deploy and hard to scale up when you really need them.”
After raising a $24 million seed round in December, the startup launched its first product a few weeks ago, which retails for $750. At that price point, it’s a great deal for sailors spending days or weeks at sea, but likely too expensive for the individuals in remote communities far from water infrastructure that might need it most. Criss’s goal is to quickly iterate on this first product to bring more affordable models to the market in short order.
Portable desalination devices aren’t anything new in and of themselves — they’ve been used in military, maritime, and humanitarian scenarios for decades. The startup’s breakthrough, Criss explained, is more about manufacturing efficiency than technology. “We went all the way back, looked at why every component was designed and how to redesign it for high rate manufacturing. So we were able to substantially drop the cost of ownership and operation of these things.”
You’ll soon find Vital Lyfe’s product in big box retail stores, Criss said, though he also aims to partner with large-scale desalination facilities and utilities to help boost their output. Either way, the startup is already generating buzz — it’s seen significant inbound interest as of late, as the inherent resilience of its small system stands in sharp contrast to the vulnerability of conventional desalination infrastructure now being targeted in the Middle East.
The company is scaling up to meet the moment, building out a facility in Los Angeles County that Criss said will eventually produce 120 portable units per hour. He’s aiming to start production by summer’s end, ramping to full capacity by October. “Within the next three years we plan to account for about 10% of total membrane production at Vital Lyfe alone,” he told me, referring specifically to production for the desalination industry.
The future of the industry, of course, could look like any combination of all of these approaches — portable devices, conventional plants on land, and modular systems at sea. What seems certain is that as the globe continues to heat up, so will desalination tech.
Why local governments are getting an earful about “infrasound”
As the data center boom pressures counties, cities, and towns into fights over noise, the trickiest tone local officials are starting to hear complaints about is one they can’t even hear – a low-frequency rumble known as infrasound.
Infrasound is sound at frequencies too low for humans to hear, typically below 20 hertz. These are the sorts of vibrations and pressure changes at the heart of earthquakes and volcanic activity. Infrasound can be anything from the waves shot out by a sonic boom or an explosion to minute fluctuations in air pressure around HVAC systems or refrigerators.
Since some of these facilities are already known to produce significant audible noise, the more tech-skeptical and health-anxious corners of the population are fretting that some data centers could be generating a lot of infrasound, too: the whirring of so many large computational machines, combined with cooling fans and other equipment pushing new columns of air. Add any rotating onsite power generation (think natural gas turbines, for example) and you get quite a lot of movement that, they say, could produce infrasound.
Much of the virality of this chatter about infrasound and data centers traces back to a video created by audio engineer and researcher Benn Jordan. Currently sitting at more than 1 million views, the short YouTube film documents claims that some data centers are operating like “acoustic weapons,” harming people through infrasound. Andy Masley, an “effective altruist” writer, has become the video’s chief critic, and the resulting back-and-forth has elevated the issue into full-blown internet discourse.
The Jordan-Masley infrasound debate is, honestly, a bit of a mess. So I want to be clear: I’m not going to get into the science of whether infrasound poses any kind of public health risk in this article. We can get to that later. It’s worth saying that this subject may need more study and that work is ongoing. It’s also worth saying that talking about infrasound at all can honestly make you sound a little wacky (see: this study attributing ghost sightings to infrasound). It might also remind you of another panic of the Electric Age: electromagnetic fields, also known as EMFs. Developers of transmission lines and solar projects have long had to deal with people worried that large electrical equipment is glowing with invisible, unhealthy radiation.
In late 2024, I wrote about how an RFK Jr. supporter worried about this form of electrical emission was helping lead the fight against a transmission line in New Jersey for offshore wind. Maybe that’s why it didn’t surprise me one bit when the Health and Human Services secretary himself told a U.S. Senate Committee last week that he was asking the Surgeon General’s office to “do either meta reviews” or “base studies” on noise pollution and EMF radiation from data centers “so we can better inform the American public.”
“There’s a range of injuries that are very, very well documented. They’re neurological – very, very grave neurological injuries, cancer risk,” Kennedy told the Senate Health, Education, Labor and Pensions Committee on April 22 in response to a request from Sen. Josh Hawley of Missouri to study the issue. “The risks, to me, are tremendous.”
There’s also the unfortunate reality that infrasound claims have previously been used as a cudgel to slow down renewable energy deployment. Wind turbines produce infrasound through subharmonic frequencies: When one turbine rotates at a slightly different pace than another, the result is a slightly dissonant low-frequency noise. Groups like the Heartland Institute proudly list this infrasound among the reasons wind energy “menaces man and nature.”
But regardless of merit, this concern is already impacting local government decisions around data center projects, much like how one Michigan county sought to restrict solar energy on the same basis.
In February, Adrian Shelley, the Texas director for the environmental group Public Citizen, implored the city of Red Rock to study changing its noise ordinance to account for infrasound. “It has effects on sleep patterns, on stress, on cardiovascular health, and it is potentially a very serious concern,” Shelley said at a February 11 city council discussion on data center rules. “It will not be covered by the city’s noise ordinance, which only deals with audible sound.”
Earlier this month, a volunteer on the environmental commission in Calvert County, Maryland, told the county government that infrasound needs to be factored into its future data center planning. “It will have significant impacts on our region and the Chesapeake and the Patuxent because infrasound isn’t stopped by walls,” commission member Janette Wysocki, a proud land conservationist, said at an April 15 hearing. “It will keep going, it will move through anything. It’s a very long wavelength. So we need to protect our ecosystem.” Wysocki implored the county to consider adjusting its noise regulations.
Around the same time, similar concerns were raised in Lebanon, a small city in east-central Pennsylvania. “It permeates through concrete walls, it permeates through the ground,” Thomas Dompier, an associate professor at Lebanon Valley College, said at an April 16 Lebanon County commission hearing on data centers.
Lastly, last week I explained how Loudoun County wants to rethink its noise ordinance to deal with low-frequency “hums” from data centers – a concern that echoes the fretting over infrasound.
Ethan Bourdeau, executive director of standards at Quiet Parks International and a career acoustician and building standards writer, told me that what makes data centers unique is the “constant drone” of noise that could potentially carry subharmonic frequencies. Bourdeau said cities or counties could possibly factor infrasound into their noise ordinances to address those who are most concerned. One way to do that is to change how decibels are weighted in the government’s measurements: The A-weighting used by most sound level meters is geared toward humanly perceptible noise and filters out frequencies below the range of hearing, while alternative systems, like C-weighting or G-weighting, retain them.
“These are reporting and weighting systems where a sound level meter taking background noise receives all the unweighted sound and then you apply all these filters afterwards, like an EQ curve,” Bourdeau said.
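To make that concrete, the A- and C-weighting curves are defined by standard formulas (IEC 61672), and computing them shows just how much low-frequency energy an A-weighted meter throws away. Here’s a minimal Python sketch — the function names are mine, and this illustrates the standard curves, not any specific meter Bourdeau uses:

```python
import math

def a_weight(f):
    """A-weighting gain in dB at frequency f (Hz), per the IEC 61672 formula.
    Mimics human hearing, so it heavily attenuates very low frequencies."""
    ra = (12194.0**2 * f**4) / (
        (f**2 + 20.6**2)
        * math.sqrt((f**2 + 107.7**2) * (f**2 + 737.9**2))
        * (f**2 + 12194.0**2)
    )
    return 20 * math.log10(ra) + 2.00  # normalized so 1 kHz sits at 0 dB

def c_weight(f):
    """C-weighting gain in dB at frequency f (Hz), per the IEC 61672 formula.
    Much flatter at low frequencies, so infrasound-adjacent energy survives."""
    rc = (12194.0**2 * f**2) / ((f**2 + 20.6**2) * (f**2 + 12194.0**2))
    return 20 * math.log10(rc) + 0.06  # normalized so 1 kHz sits at 0 dB

# Compare how each curve treats low-frequency rumble vs. midrange sound.
for f in (10, 20, 100, 1000):
    print(f"{f:>5} Hz   A: {a_weight(f):7.1f} dB   C: {c_weight(f):6.1f} dB")
```

At 10 hertz — squarely in infrasound territory — the A-weighting curve knocks roughly 70 decibels off the reading, while C-weighting subtracts only about 14, which is why a city measuring only A-weighted decibels would never register the rumble residents are complaining about.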
So I guess if those most concerned about infrasound have their way, a lot of county commissioners and local elected leaders will be heading to the mixing booth.