Utilities in the Southeast, especially, may have to rethink their plans.

Utilities all over the country have proposed to build a slew of new natural gas-fired power plants in recent months, citing an anticipated surge in electricity demand from data centers, manufacturing, and electric vehicles. But on Thursday, the Environmental Protection Agency finalized new emissions limits on power plants that throw many of those plans into question.
The rules require newly built natural gas plants that are designed to help meet the grid’s daily minimum needs to slash their carbon emissions by 90% by 2032, a cut that can only be achieved with the use of carbon capture equipment. But carbon capture will be cost-prohibitive in many cases — especially in the Southeast, where much of that expected demand growth is concentrated, but which lacks the geology necessary to store captured carbon underground.
“With this rule, it’s kind of back to square one,” Tyler Norris, an electric power systems researcher, told me. “I think most likely, you're gonna see the regulators really push back and call upon them to redo all their modeling.”
This is the first federal mandate to curb carbon from the electricity sector since President Obama’s 2015 Clean Power Plan, which never went into effect. Despite growing investment in renewable energy, power generation is responsible for about a quarter of the country’s greenhouse gas emissions.
The Biden administration is guaranteed to face legal challenges from Republican attorneys general and electric utilities. The Edison Electric Institute, the largest trade group for electric utilities, asserted that carbon capture “is not yet ready for full-scale, economy-wide deployment” and expressed worry over the timelines for permitting and financing. Duke Energy, one of the Southeast’s largest utilities, issued a statement after the rule came out saying that it “presents significant challenges to customer reliability and affordability – as well as limits the potential of our ability to be a global leader in chips, artificial intelligence and advanced manufacturing,” echoing concerns from the National Rural Electric Cooperative Association. The EPA, however, maintains that recent federal investments in carbon capture — including an $85 tax credit for every ton of CO2 captured and stored — render it both “technically feasible and cost-reasonable.”
As part of the same announcement on Thursday, the Environmental Protection Agency finalized several additional regulations to rein in air and water pollution from coal-fired power plants, including mercury and toxic metals, wastewater, and coal ash, in addition to carbon emissions. During a call with reporters on Wednesday, EPA administrator Michael Regan argued that by finalizing all of these rules at once, the agency was providing the highest degree of regulatory certainty for the power industry. “This approach is both strategic and innovative,” he said. “We are ensuring that the power sector has the information needed to prepare for the future with confidence, enabling strong investment and planning decisions.”
Initially the EPA was going to require emissions cuts at existing natural gas plants, too, but the agency announced in February that it was delaying that rule in order to develop a “stronger, more durable approach.” EPA officials offered no new details on the timeline on Wednesday.
The two other biggest changes the agency made between the proposed and final rules were to push forward and shorten the timeline for coal plant compliance, and to lower the threshold determining how many natural gas plants have to meet the toughest standard — which means more plants will have to control their emissions.
The agency projects the new standards will prevent a total of nearly 1.4 billion metric tons of carbon emissions through 2047, which is about equal to the amount the power sector emits in a year. That’s significant, but it’s far less than the clean car rules the EPA finalized in March, which are expected to avoid 7.2 billion metric tons of carbon between 2027 and 2055. The EPA also estimates that the power plant rules will produce $370 billion in climate and health benefits over the next two decades, in terms of avoided deaths, hospital visits, and asthma cases.
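Spread evenly across each rule’s window (2024 through 2047 for the power plant rules, 2027 through 2055 for the vehicle rules), those headline totals imply rough annual averages (a back-of-the-envelope calculation, not the EPA’s year-by-year modeling):

\[
\frac{1.4\ \text{Gt CO}_2}{23\ \text{yr}} \approx 0.06\ \text{Gt/yr}
\qquad\text{vs.}\qquad
\frac{7.2\ \text{Gt CO}_2}{28\ \text{yr}} \approx 0.26\ \text{Gt/yr}
\]

In other words, the vehicle rules are expected to avert roughly four times as much carbon per year as the power plant rules.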
The new emissions limits for coal plants are tied to how much longer a given coal plant is slated to operate. Those that plan to shut down before 2032 are exempt altogether. Those that plan to retire by 2039 have to reduce the amount of CO2 they emit per megawatt hour by replacing some of the coal they burn with natural gas beginning in 2030. Coal plants with no plans to retire before 2039 are subject to the highest standard, requiring a 90% drop in emissions by 2032 — which would require capturing the emissions and storing them underground.
These standards are certain to lead to more plant closures, but coal plants are already shutting down at a rapid pace purely based on economics and the fact that so many of them are so old. Getting the rules in place is less about tackling coal emissions, per se, and more about “getting utilities thinking more proactive about how they are going to replace these coal plants,” Michelle Solomon, a senior policy advisor at the nonprofit think tank Energy Innovation, told me.
Gas, however, is another story. Utilities have been sounding the alarm about a coming surge in electricity demand. Electric companies throughout the Southeast, as well as Texas, Wisconsin, and elsewhere, have proposed building dozens of new natural gas plants, arguing that renewables and batteries aren’t up to the task of providing a reliable, dispatchable source of power.
Whether that coming demand is real or inflated is a matter of debate. But regardless, clean energy researchers and advocates dispute the idea that gas plants are needed for reliability.
“Utilities are seeing an additional need for peak capacity, not an additional need for capacity throughout the day,” Solomon told me, asserting it was possible to meet those peaks with solar and storage, or even by improving efficiency so that the peaks aren’t as high. The trick is making sure we can bring those resources online fast enough. To that end, the Department of Energy also announced a number of initiatives to boost transmission infrastructure on Thursday.
The EPA’s regulations for new gas plants are tied to how frequently they are intended to operate. Plants that are designed to switch on during times of peak demand — a variety called a “simple cycle” combustion turbine plant — won’t have to do anything differently. Plants that run a bit more often — so-called “intermediate” resources that might run daily from mid-morning till the evening, at 20% to 40% of their annual capacity — will be required to install the most efficient equipment available on the market. Any that operate more frequently than that will be subject to the 90% emissions reduction standard by 2032. This primarily affects “combined cycle” plants, which are more efficient than simple cycle but can’t ramp up and down as quickly or easily.
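Put schematically, the tiering works like a simple threshold function on a plant’s expected capacity factor. Here’s a minimal sketch using the cutoffs described above (treating everything under the 20% mark as a peaker is my reading of the article; the final rule’s precise definitions and subcategories differ in detail):

```python
def compliance_tier(capacity_factor: float) -> str:
    """Map a new gas plant's expected annual capacity factor to the
    EPA standard as described in this article."""
    if capacity_factor < 0.20:
        # Peaking plants (typically simple cycle turbines): no new
        # carbon requirement under the rule as described here.
        return "peaker: no change"
    elif capacity_factor <= 0.40:
        # "Intermediate" plants: must install the most efficient
        # equipment available on the market.
        return "intermediate: best available efficiency"
    else:
        # Plants running more than 40% of the year (typically combined
        # cycle): 90% CO2 reduction by 2032, i.e. carbon capture.
        return "baseload: 90% CO2 cut by 2032"

for cf in (0.10, 0.30, 0.60):
    print(f"{cf:.0%} -> {compliance_tier(cf)}")
```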
Utilities with recently hatched plans to build simple cycle plants, including Georgia Power, are unlikely to be affected by the rule at all. “I do think that makes sense, given the focus of these rules, which are on carbon emissions,” Amanda Levin, a director of policy analysis at the Natural Resources Defense Council, told me. “Given the frequency and type of operation for [simple cycle], they’re not as significant as sources of CO2.”
But those utilities that are planning to build combined cycle projects — and many of them are — could be forced to go back to the drawing board. Norris noted that Duke Energy, which serves customers in North and South Carolina and has proposed building more than 6 gigawatts of combined cycle capacity, will be especially exposed.
For combined cycle plants, there are essentially two options to comply: Install carbon capture, or plan to run your plant a lot less frequently. In either case, it “dramatically increases the levelized cost of those units,” Norris told me. “So I think any reasonable regulator would say we've got to go back and do a much more rigorous comparative analysis to other least-cost solutions.”
Solomon has a more cynical view of the recent panic over electricity demand and rush to build new gas plants. “We’ve known that demand is growing, is going to grow, for a long time,” she told me. “The fact that there’s quite a lot of news about this just as the rules are coming out is unlikely to be a total coincidence.”
Editor’s note: This story has been updated to reflect statements from Duke Energy and trade groups.
Plus news on cloud seeding, fission for fusion, and more of the week’s biggest money moves.
From beaming solar power down from space to shooting storm clouds full of particles to make it rain, this week featured progress across a range of seemingly sci-fi technologies that have actually been researched — and in some cases deployed — for decades. There were, however, few actual funding announcements to speak of, as earlier-stage climate tech venture funds continue to confront a tough fundraising environment.
First up, I explore Meta’s bet on space-based solar as a way to squeeze more output from existing solar arrays to power data centers. Then there’s the fusion startup Zap Energy, which is shifting its near-term attention toward the more established fission sector. Meanwhile, a weather modification company says it’s found a way to quantify the impact of cloud seeding — a space-age sounding practice that’s actually been in use for roughly 80 years. And amidst a string of disappointments for alternate battery chemistries, this week brings multiple wins for the sodium-ion battery sector.
One might presume that terrestrial solar paired with batteries would prove perfectly adequate for securing 24/7 clean energy moving forward, as global prices for panels and battery packs continue to fall. But the startup Overview Energy, which uses lasers to beam solar power from space directly onto existing solar arrays, thinks its space-based solar energy systems will prove valuable for powering large loads like data centers through the night. Now Meta is backing that premise, signing a first-of-its-kind agreement with Overview this week that secures early access for up to a gigawatt of capacity from the startup’s system.
Initial orbital demonstrations are slated for 2028, with commercial power delivery targeted for 2030. It’s an ambitious timeline, and certainly not the first effort to commercialize space-based solar, though prior analyses have generally concluded that while the physics check out, the economics and logistics don’t. Overview Energy, though, thinks it has found the key unlocks: “geographic untethering,” which allows it to direct its beam to ground-based solar arrays anywhere in the world based on demand, and high-efficiency lasers whose near-infrared light photovoltaic panels can convert into electricity much more efficiently than they convert sunlight.
The startup is targeting between $60 and $100 per megawatt-hour by 2035, at which point the goal is to be putting gigawatts of space solar on the grid. “It’s 5 o’clock somewhere,” Marc Berte, founder and CEO of Overview Energy, told me when I interviewed him last December. “You’re profitable at $100 bucks a megawatt-hour somewhere, instantaneously, all the time.”
Launch costs have also fallen sharply since the last serious wave of space-solar research, and Overview has already booked a 2028 launch with SpaceX. Solar power beamed from space also sidesteps two earthly constraints — land use and protracted grid interconnection timelines. So while this seemingly sci-fi vision remains unproven, it might be significantly more plausible than it once appeared. And Meta’s certainly not alone in taking that bet — Overview has already raised a $20 million seed round led by Lowercarbon Capital, Prime Movers Lab, and Engine Ventures.
Fusion startups are increasingly looking to nearer-term revenue opportunities as they work toward commercializing the Holy Grail of energy generation. Industry leader Commonwealth Fusion Systems is selling its high-temperature superconducting magnets to other developers, while other companies including Shine Technologies are generating income by producing nuclear isotopes for medical imaging. Now one startup, Zap Energy, is pushing that playbook a step further, announcing this week that it plans to develop fission reactors before putting its first fusion electrons on the grid.
Specifically, the startup is now attempting to develop small modular reactors — hardly a novel idea, as companies like Oklo, Kairos, and TerraPower have already secured significant public and private funding and struck major data center deals. Zap, however, thinks it can catch up to these new competitors in part by leveraging design commonalities between fission and fusion systems, including the use of liquid metals, engineered neutron environments, and high-power-density systems. “Fission and fusion are two expressions of the same underlying physics,” Zap’s co-founder Benj Conway said in the press release. “This isn’t a pivot — by integrating them into a single platform, we can move faster, reduce risk, and build a more enduring company.”
As the company outlines on its website, pursuing both pathways could eventually manifest in the development of a hybrid fusion-fission system, while also giving Zap practical experience interfacing with regulators and securing approvals. As The New York Times reports, the company is targeting an early 2030s timeline for its fission reactors, although Zap has yet to specify a timeline for fusion commercialization. Like so many of its peers, the company is eyeing data centers as a promising initial market, though bringing its first units online will likely require a significant influx of additional capital.
For all the concern surrounding geoengineering fixes for climate change such as solar radiation management, there’s one form of weather modification that’s been in use since the 1940s — cloud seeding. This practice typically involves flying planes into the center of storms and releasing flares that disperse a chemical called silver iodide into the clouds. This causes the water droplets within the clouds to freeze, increasing the amount of precipitation that falls as either rain or snow.
Alarming as it may sound to the uninitiated, there’s no evidence that silver iodide causes harm at current usage levels. But what has been far more difficult to pin down is efficacy — specifically, how much additional precipitation cloud seeding actually creates. That’s where the startup Rainmaker comes in. The company, which deploys unmanned drones to inject the silver iodide, says that its advanced radar and satellite systems indicate that its operations generated over 143 million gallons of additional freshwater in Oregon and Utah this year — roughly equivalent to the annual water usage of about 1,750 U.S. households. The findings have not yet been peer reviewed, but if accurate, they would make Rainmaker the first private company to quantify the impact of its cloud seeding operations.
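For a rough sense of that household equivalence (a back-of-the-envelope check, assuming a typical U.S. household uses on the order of 82,000 gallons of water per year, or about 225 gallons a day):

\[
\frac{1.43\times 10^{8}\ \text{gal}}{8.2\times 10^{4}\ \text{gal per household per year}} \approx 1{,}740\ \text{households}
\]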
Cloud seeding is already a well-oiled commercial business: dozens of states, utility companies, and ski resorts alike use it to increase snowfall in the drought-stricken American West, and it’s deployed worldwide — China in particular spends tens of millions of dollars per year on the technology. Rainmaker has a particular aspiration: to help restore Utah’s Great Salt Lake, which has been shrinking since the 1980s amid rising water demand and increased evaporation driven by warmer temperatures.
In a press release, the company’s 26-year-old founder and CEO Augustus Doricko said, “With the newfound capability to measure our yields and quantify our results, Rainmaker will go forward and continue our mission to refill the Great Salt Lake, end drought in the American West and deliver water abundance wherever it is needed most around the world."
Sodium-ion batteries have long been touted as an enticing alternative — or at least complement — to lithium-ion systems for energy storage. They don’t rely on scarce and costly critical minerals like lithium, nickel, or cobalt, and have the potential to be far less flammable. The relatively nascent market also offers an opening for the U.S. to gain a foothold in this segment of the battery supply chain. But the industry has struggled to gain traction, especially domestically. Two sodium-ion startups, Natron and Bedrock Materials, closed up shop last year as prices for lithium-iron-phosphate batteries cratered, eroding sodium-ion’s cost advantage, while the cost of manufacturing batteries in the U.S. constrained their ability to scale.
But one notable bright spot is the startup Alsym Energy, which announced this week that it has signed a letter-of-intent with long-duration energy storage company ESS Inc. for 8.5 gigawatt-hours of sodium-ion cells and modules, marking ESS’s expansion into the short and medium-duration storage market. Alsym’s CEO, Mukesh Chatter, told me this represents the largest deal for sodium-ion batteries in the U.S. to date — although it’s not yet a binding contract. Notably, it came just a day after the world’s largest-ever order for these batteries, as CATL disclosed a 60 gigawatt-hour sodium-ion agreement with energy storage integrator HyperStrong. Taken together, these partnerships suggest the sector is finally picking up durable traction both domestically and abroad.
ESS, however, is facing its own operational headwinds, nearly shuttering its Oregon manufacturing plant last year before securing an unexpected cash infusion and pivoting to a new, longer-duration storage product. Still, Chatter remains exuberant about Alsym’s deal with the storage provider, telling me it represents a major proof point in terms of broader industry acceptance and an acknowledgement that “the benefits [sodium-ion] brings to the table are significant enough to overcome any stickiness” and hesitation around adopting new battery chemistries.
Chatter said that interest is now pouring in from all sides, citing inquiries from lithium-ion battery manufacturers, utilities, and defense companies and highlighting use cases ranging from data centers to apartment buildings and mining operations as likely early deployment targets.
A handful of startups are promising better, cheaper, safer water purification tech.
The need for desalination has long been clear in water-scarce regions of the planet. But with roughly a quarter of the global population now facing extreme water stress and drought conditions only projected to intensify, the technology is becoming an increasingly necessary tool for survival in a wider array of geographies.
Typically, scaling up desalination infrastructure has meant building costly, energy-intensive coastal plants that rely on a process called reverse osmosis, which involves pushing seawater through semi-permeable membranes that block salt and other contaminants, leaving only fresh water behind. Now, however, a number of startups are attempting to rework that model, with solutions that range from subsea facilities to portable desalination devices for individuals and families.
They could find potential customers across the globe. Many countries in the Middle East — including Saudi Arabia, Israel, Bahrain, Kuwait, and Qatar — rely on desalination for the bulk of their municipal water. Meanwhile, drought-prone regions from Australia to the Caribbean and California have also turned to the technology to shore up supply. But as the Iran war has underscored, this vital infrastructure is increasingly being treated as a military target, exposing a significant vulnerability in a resource relied upon by hundreds of millions.
A more resilient alternative is to move the plants underwater — making them more difficult to target while also harnessing subsurface pressure to do some of the energy-intensive work of desalination.
“I came up with the idea of using natural pressure to run the process,” Robert Bergstrom, a veteran of the water industry and CEO of the desalination startup OceanWell, told me. That meant “putting the membranes in a place where it’s already 800 pounds [of pressure] per square inch” — that is, inside pods on the ocean floor, each capable of producing 1 million gallons of freshwater daily. By using the natural pressure of the ocean to drive the reverse osmosis process, this approach cuts energy use by about 40%, he said, thus slashing the system’s largest operating cost: electricity.
OceanWell’s design maintains a lower internal pressure within each pod than the surrounding environment, causing seawater to flow passively inside and push through membranes — just like on land, but without the high-pressure pumps. Compact pumps inside the pods then push the freshwater up a pipeline to the shore, while the resulting brine dissipates in the deep ocean.
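For a sense of the depths involved, hydrostatic pressure grows roughly linearly with depth, so the 800 psi (about 5.5 megapascals) Bergstrom cites corresponds to roughly half a kilometer of seawater (a back-of-the-envelope figure assuming a density of about 1,025 kg/m³ and ignoring the smaller atmospheric contribution):

\[
h = \frac{P}{\rho g} = \frac{5.5\times 10^{6}\ \text{Pa}}{1025\ \text{kg/m}^{3}\times 9.81\ \text{m/s}^{2}} \approx 550\ \text{m}
\]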
The method also helps solve another problem with conventional desalination: environmental impact. Today’s facilities typically produce a more concentrated brine that they discharge at the ocean’s surface, which is more disruptive to marine ecosystems. The plants also frequently cause damage to organisms large and small by either trapping them against water intake screens or pulling them into the plant itself. That’s been a big sticking point when it comes to permitting these facilities, especially in California where the startup is based. OceanWell’s system, Bergstrom said, is able to filter out larger organisms while allowing microscopic ones to pass through the pods and return to the ocean.
The company began a trial last year in partnership with Las Virgenes Municipal Water District in southern California, testing its system in a freshwater reservoir full of marine life to verify its safety. Next it will test its pods in the ocean before undertaking a pilot in a to-be-determined location — California, Hawaii, and Nice in southern France are all contenders. If all goes according to plan, OceanWell will follow that up with a full-fledged commercial system targeted for 2030.
But it’s not the only startup pursuing underwater desalination — or even the one with the most aggressive timeline. Two years ago, Norwegian startup Flocean spun out of the subsea pump specialist FSubsea with a similar technical approach and a plan to deploy its first commercial system off Norway’s western coast this year. Flocean has already logged over a year of testing in the deep ocean, a stage OceanWell has yet to reach.
OceanWell thinks it can differentiate itself by meeting the unusually stringent permitting requirements in California. “If we can get it done in California, then the rest of the world will follow,” Bergstrom told me, meaning more resilient, more energy-efficient freshwater infrastructure for all. But it’s a high bar. The last major effort to build a desalination facility in the state led to a long-running fight that ended in 2022 with a rejection. Over 100 groups opposed the facility proposed for Orange County, citing risks to marine life, as well as high energy requirements and costs, with many arguing that alternatives — such as conservation and wastewater treatment — would be superior options.
Meagan Mauter, an associate professor of civil engineering at Stanford, thinks the groups may have a point, especially when it comes to overall system costs. The high capex of desalination can be hard to justify in California, she told me, since the state doesn’t need it 100% of the time, only in bad drought years. For example, just a few weeks ago, The Wall Street Journal reported that San Diego County’s desalination plant, by far the largest in California, now has a surplus of desalinated water that it’s looking to sell to drought-ridden Western states such as Nevada and Arizona.
And while desalination startups purport to cut overall system costs, she has her doubts about that. “The energy savings that they’re going to get are offset by some pretty high increased costs of the other elements of their plant designs,” Mauter told me. “In a subsea system, you’ve got these unproven and not mass-manufactured skids. You’ve got subsea installation, and then mooring it, and putting in pipelines that you’ve got to maintain all the way to land. You’ve got to convey water back to shore, which takes energy, and you are going to have significantly higher maintenance burdens in an open ocean environment.”
Despite her reservations, she certainly sees the appeal of non-traditional water sources, “even at costs that would have been totally infeasible a decade ago.” Municipal planners are staring down a future of worsening drought at the same time that states in the Colorado River basin remain locked in contentious negotiations over water rights, debating how to allocate cuts as river flows have declined nearly 20% since 2000. California’s narrow continental shelf also makes it an ideal environment for subsea desalination, as having deep water close to shore allows the system to harness the pressure at depth while minimizing the length of the pipeline needed to transport freshwater to land. Norway’s coast is similarly favored.
“I don’t know whether the cost gaps can be solved, but I bet that the technology gaps could be solved,” Mauter told me.
Ultimately, she thinks the binding constraint is likely to be regulatory rather than technical. “Permitting is going to be a nightmare unless something fundamentally changes,” she said. Bergstrom told me that OceanWell is currently working with the California State Water Resources Control Board to revise its rules that govern desalination facilities in order to account for new technologies, though how long that process will take is anyone’s guess.
There’s one idea emerging in this ecosystem that largely sidesteps the regulatory constraints that control our land and seas. The startup Vital Lyfe has developed a portable desalination unit roughly the size of a small cooler that allows individuals and households to produce freshwater on demand with reverse osmosis — effectively decentralizing the desalination industry in the same way that the startup’s founders, former SpaceX engineers, helped decentralize internet infrastructure with Starlink.
“We’ve seen this paradigm shift coming out of Starlink that traditional, large, centralized systems are very expensive,” Vital Lyfe CEO Jon Criss told me. “They’re hard to deploy and hard to scale up when you really need them.”
After raising a $24 million seed round in December, the startup launched its first product a few weeks ago, which retails for $750. At that price point, it’s a great deal for sailors spending days or weeks at sea, but likely too expensive for the individuals in remote communities far from water infrastructure who might need it most. Criss’s goal is to quickly iterate on this first product and bring more affordable models to market in short order.
Portable desalination devices aren’t anything new in and of themselves — they’ve been used in military, maritime, and humanitarian scenarios for decades. The startup’s breakthrough, Criss explained, is more about manufacturing efficiency than technology. “We went all the way back, looked at why every component was designed and how to redesign it for high rate manufacturing. So we were able to substantially drop the cost of ownership and operation of these things.”
You’ll soon find Vital Lyfe’s product in big box retail stores, Criss said, though he also aims to partner with large-scale desalination facilities and utilities to help boost their output. Either way, the startup is already generating buzz — it’s seen significant inbound interest as of late, as the inherent resilience of its small system stands in sharp contrast to the vulnerability of conventional desalination infrastructure now being targeted in the Middle East.
The company is scaling up to meet the moment, building out a facility in Los Angeles County that Criss said will eventually produce 120 portable units per hour. He’s aiming to start production by summer’s end, ramping to full capacity by October. “Within the next three years we plan to account for about 10% of total membrane production at Vital Lyfe alone,” he told me, referring specifically to production for the desalination industry.
The future of the industry, of course, could look like any combination of all of these approaches — portable devices, conventional plants on land, and modular systems at sea. What seems certain is that as the globe continues to heat up, so will desalination tech.
Why local governments are getting an earful about “infrasound”
As the data center boom pressures counties, cities, and towns into fights over noise, the trickiest tone local officials are starting to hear complaints about is one they can’t even hear – a low-frequency rumble known as infrasound.
Infrasound is sound at frequencies too low for human ears to perceive, generally below about 20 hertz. These are the sorts of vibrations and pressure waves at the heart of earthquakes and volcanic activity. Infrasound can be anything from the waves shot out by a sonic boom or an explosion to very minute changes in air pressure around HVAC systems or refrigerators.
Knowing some of these facilities also have the capacity to produce significant audible noise, growing segments of the tech-skeptical and health-anxious public are fretting that some data centers could be making a lot of infrasound, too. Picture the whizzing of so many large computational machines combined with cooling fans and other large devices creating so many new columns of airflow. Add onto that any rotational onsite power generation – think natural gas turbines, for example – and you get quite a lot of movement that could potentially produce what they say is infrasound.
Some of the virality of this chatter about infrasound and data centers comes from a video about infrasound created by audio engineer and researcher Benn Jordan. Currently sitting at more than 1 million views, this short YouTube film documents claims that some data centers are operating like “acoustic weapons” through infrasound and harming people. Andy Masley, an “effective altruist” writer, has become the chief critic of the Jordan video, getting into a back-and-forth that’s raised the issue to Internet discourse territory.
The Jordan-Masley infrasound debate is honestly a bit of a mess. So I want to be clear: I’m not going to get into the science of whether or not infrasound poses any kind of public health risk in this article. We can get to that later. It’s worth saying that this subject may need more study and that work is ongoing. Also, talking about infrasound at all can make you sound a little wacky (see: this study blaming people seeing ghosts on infrasound). It might also remind you of another panic in the Electric Age: electromagnetic fields, also known as EMFs. Developers of transmission lines and solar projects have long had to deal with people worried that large electrical equipment is glowing with invisible, unhealthy radiation.
In late 2024, I wrote about how an RFK Jr. supporter worried about this form of electrical emission was helping lead the fight against an offshore wind transmission line in New Jersey. Maybe that’s why it didn’t surprise me one bit when the Health and Human Services secretary himself told a U.S. Senate committee last week that he was asking the Surgeon General’s office to “do either meta reviews” or “base studies” on noise pollution and EMF radiation from data centers “so we can better inform the American public.”
“There’s a range of injuries that are very, very well documented. They’re neurological – very, very grave neurological injuries, cancer risk,” Kennedy Jr. told the Senate Health, Education, Labor and Pensions Committee on April 22 in response to a request from Sen. Josh Hawley of Missouri to study the issue. “The risks, to me, are tremendous.”
There’s also the unfortunate reality that infrasound impacts have previously been used as a cudgel to slow down renewable energy deployment. Wind turbines generate infrasound through the subharmonic frequencies produced when one turbine rotates at a slightly different pace than another, creating a slightly dissonant low-frequency noise. Groups like the Heartland Institute proudly list this infrasound as one of the reasons wind energy “menaces man and nature.”
But regardless of merit, this concern is already impacting local government decisions around data center projects, much like how one Michigan county sought to restrict solar energy on the same basis.
In February Adrian Shelley, the Texas director for the environmental group Public Citizen, implored the city of Red Rock to study changing its noise ordinance to take infrasound into account. “It has effects on sleep patterns, on stress, on cardiovascular health, and it is potentially a very serious concern,” Shelley said at a February 11 city council discussion on data center rules. “It will not be covered by the city’s noise ordinance, which only deals with audible sound.”
Earlier this month in Calvert County, Maryland, a volunteer on the county’s environmental commission told the county government that infrasound needs to be factored into its future data center planning. “It will have significant impacts on our region and the Chesapeake and the Patuxent because infrasound isn’t stopped by walls,” commission member Janette Wysocki, a proud land conservationist, said at an April 15 hearing. “It will keep going, it will move through anything. It’s a very long wavelength. So we need to protect our ecosystem.” Wysocki implored the county to consider adjusting its noise regulations.
Around the same time, similar concerns were raised in Lebanon, a small city in east-central Pennsylvania. “It permeates through concrete walls, it permeates through the ground,” Thomas Dompier, an associate professor at Lebanon Valley College, said at an April 16 Lebanon County commission hearing on data centers.
Lastly, I explained last week how Loudoun County wants to rethink its noise ordinance to deal with low-frequency “hums” from data centers – a concern echoing those who fret about infrasound.
Ethan Bourdeau, executive director of standards at Quiet Parks International and a career acoustician and building standards writer, told me that what makes data centers unique is the “constant drone” of noise that could potentially carry subharmonic frequencies. Bourdeau said cities or counties could factor infrasound into their noise ordinances to address residents who are most concerned. One way to do it is by changing how decibels are weighted in the government’s measurements. A-weighted decibel meters are a common form of sound measurement geared toward perceptible noise. Using different systems, like C-weighting or G-weighting, would avoid the way A-weighting filters out sub-hearing frequencies.
“These are reporting and weighting systems where a sound level meter taking background noise receives all the unweighted sound and then you apply all these filters afterwards, like an EQ curve,” Bourdeau said.
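Bourdeau’s EQ-curve analogy maps onto published weighting formulas. As a minimal illustration (this sketch is mine, not Bourdeau’s, and uses the standard IEC 61672 A-weighting curve), here’s how sharply an A-weighted meter attenuates low frequencies before they ever register:

```python
import math

def a_weight_db(f: float) -> float:
    """Gain of the standard A-weighting curve at frequency f (in Hz),
    in decibels, per the IEC 61672 formula."""
    ra = (12194.0**2 * f**4) / (
        (f**2 + 20.6**2)
        * math.sqrt((f**2 + 107.7**2) * (f**2 + 737.9**2))
        * (f**2 + 12194.0**2)
    )
    return 20.0 * math.log10(ra) + 2.0

# A 10 Hz infrasound tone is attenuated by roughly 70 dB, while a
# 1 kHz tone passes through unchanged -- which is why A-weighted
# meters barely register sub-hearing frequencies.
for f in (10, 100, 1000):
    print(f"{f:>5} Hz: {a_weight_db(f):+6.1f} dB")
```

At 10 hertz the curve reads roughly -70 decibels, which is why a meter weighted this way can sit next to a source of infrasound and report near silence; C- and G-weighting roll off far less aggressively at low frequencies.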
So I guess if those most concerned about infrasound have their way, a lot of county commissioners and local elected leaders will be heading to the mixing booth.