Skiers, snowboarders, and cross-country athletes are in mourning for snow.

On January 15, as the first major winter storm of the season screeched across the U.S., Minneapolis’ Theodore Wirth Regional Park remained cold, hard, and — most stubbornly — brown. “We continue to be denied any measurable amount of snow,” read the park’s trail report for the day. “Frozen dandruff-covered dirt is our destiny for the time being.”
In a few weeks, over Presidents Day weekend, the park is scheduled to host the United States’ first cross-country skiing World Cup in more than 20 years. For an event like that, “dandruff-covered dirt” simply will not cut it. “We’re really excited to have a great event there with tons of friends and family,” Gus Schumacher, a 2022 Winter Olympian in skiathlon, told me. He still has hope, but the Twin Cities’ snow deficit for the season stands at around 18 inches. “We have to cross our fingers for some winter in the next month,” he said.
For the 30 million Americans who enjoy snow sports every year, this sort of finger-crossing has become as much of a pre-season ritual as tightening bindings and waxing skis. While scientists have long taken note of dwindling snowpack — the Fifth National Climate Assessment, released last year, specifically cited winter recreation as a pending cultural and economic victim of climate change — data had only shakily linked snow level to human-driven warming until recently. This month, a study published in Nature confirmed that it’s not all in our heads: Some parts of the U.S. are losing 10% to 20% of their snowpack per decade because of anthropogenic climate change.
Perhaps even more concerning, the study’s authors found that snow loss has a tipping point: Once the average winter temperature in a region warms beyond 17 degrees Fahrenheit (-8 degrees Celsius), snow loss rapidly accelerates, even with small temperature rises.
In spite of headlines about arctic blasts and photos of buried football fields, snow levels in many parts of the country have remained worryingly low at the midpoint of this year’s meteorological winter — and temperatures, on average, remain high. In early January, most ski areas in the U.S. were only operating half of their lifts, “which is unusual for this time of year,” Chance Keso, a senior news producer for On the Snow, which tracks ski conditions, told me. “Typically,” he explained, “we would see most resorts almost all completely open by this time of year.”
The recent storm systems have helped somewhat, Keso said — Alyeska, a ski area in Alaska, “passed the 400 inches mark a few weeks ago.” But even Buffalo, which received record snow in January, is tracking behind average when the whole season is considered. In California, where the ski industry is a $1.6 billion business, snowpack is only 57% of normal.
Likewise, meteorologist Sven Sundgaard wrote for Minneapolis’ Bring Me the News that this winter has been “pretty weak” in Minnesota. It has been cold, no doubt, and yet “nowhere in the state reached 25 [degrees Fahrenheit] below zero, which should EASILY happen in a January cold snap in northern Minnesota, even in our much warmer climate,” he said. (This week, temperatures are expected to be 10 to 15 degrees above normal across the state.) On the Snow reported that, as of Monday, “snowpack levels across Minnesota are currently 73% of normal.”
Counterintuitive as it may be, researchers expect climate change to bring more snow to certain places, as extremely cold parts of the world warm to more snow-friendly temperatures and increased precipitation from a warmer atmosphere results in more flurries. Parts of Siberia and the northern Great Plains appear to be seeing their snowpack deepen by more than 20% per decade, Justin Mankin and Alexander Gottlieb, the co-authors of the Nature paper, found in their research. But just because snow loss hasn’t hit an area yet doesn’t mean it won’t soon; “basins that are hovering right at the edge of that cliff, for whom major snow losses have not yet emerged, are about to see the snow losses emerge,” Mankin said.
Despite the worries about Minnesota’s upcoming World Cup, Susanna Sieff — the sustainability director for the Switzerland-based International Ski and Snowboard Federation (known by its French initials, FIS) — told me that event cancellations for the six Olympic snow sport disciplines this season have so far “been on par with previous seasons.” A spate of foiled World Cups in late 2023 in Zermatt-Cervinia, on the Swiss-Italian border; Beaver Creek, Colorado; and the French Alps, she said, was “due to inclement weather and not lack of snowfall.”
Still, Sieff admitted that “for those that needed a wake-up call, the last few years have certainly provided it.” 2022 was especially bad for competitive skiing and snowboarding — the organization canceled seven of its eight early-season World Cups for lack of snow. This month, FIS released an updated sustainability action plan that runs through the 2026 season and includes a particular focus on mitigation, environmental justice, and responsible stewardship. (Protect Our Winters, an environmental advocacy group that put me in touch with Schumacher, who serves as one of its ambassadors, has pressured FIS to be more transparent given the existential crisis facing competitive snow sports. My father is a longtime FIS event volunteer.)
Resort operators are increasingly using machine-made snow as a fallback plan — as Schumacher told me, in cross-country, “we ski on warm, manmade snow far more than was the case 10 years ago.” It’s also common for cross-country events to move to alternate venues where snow can be stretched further. Lillehammer, Norway, for example, has hosted a World Cup race in nine of the past 10 years. But “since I came on the World Cup in 2020, we haven’t been able to use the marquee trails built for the 1994 Olympics,” Schumacher said.
Even this “fake” snow is imperiled. “Snowmaking is not a climate solution,” the National Ski Areas Association, an industry group, has made clear. “It is an operational tool.”
It’s also expensive. Snowmaking can eat up to 15% of a ski area’s operating budget, draining the pockets of small and independent resorts. The consequence is yet another illustration of how climate change hits “the most vulnerable system and the most vulnerable people in that system,” Mankin said. “The ski industry is a really clear example of where you’re going to see consolidation onto better resourced, higher, more exclusive mountains that have the ability to produce human-made snow — and which are more difficult for the general population to access.”
Since the 1970s, ski areas in the U.S. have dwindled from roughly 1,000 locations to only about 470, according to SnowBrains, a ski and snowboard publication. It’s a trend climate change is helping to accelerate. That, of course, means fewer places for athletes to compete and practice, as well as fewer local hills and trails where would-be athletes can fall in love with the sport.
For those in the snow sports world, this is nothing short of heartbreaking. The average American already doesn’t watch snow sports and “shouldn’t really care” whether cross-country or downhill skiing competitions survive, Schumacher told me. But the consequences are bigger than just competitive and recreational snow sports having shorter seasons of poorer quality or becoming more exclusive. A lack of snow is also about critical watersheds that are strained when snow doesn’t fall in the mountains, leaving ecosystems damaged and agriculture unirrigated. Heck, it’s about hardy, stoic Minnesotans losing what it means to be hardy, stoic Minnesotans. “What they should care about,” Schumacher said of his fellow Americans, “is the effects of climate change that come after the death of snow sport as we know it.”
Mankin told me something similar. “What happens in winter,” he warned, “doesn’t stay in winter.”
The data center industry is aware of the problem. That doesn’t make it any easier to solve.
The data center backlash has metastasized into a full-blown PR crisis, one the tech sector is trying to get out in front of. But it is unclear whether companies are responding effectively enough to avoid a cascading series of local bans and restrictions nationwide.
Our numbers don’t lie: At least 25 data center projects were canceled last year, and nearly 100 projects faced at least some form of opposition, according to Heatmap Pro data. We’ve also recorded more than 60 towns, cities, and counties that have enacted some form of moratorium or restrictive ordinance against data center development. We expect these numbers to rise throughout the year, and it won’t be long before the data on data center opposition rivals the figures on total wind or solar projects fought in the United States.
I spent this week reviewing the primary motivations for conflict in these numerous data center fights and speaking with representatives of the data center sector and adjacent industries, such as electrical manufacturing. I am now convinced that the industry knows it has a profound challenge on its hands. Folks are doing a lot to address it, from good-neighbor promises to lobbying efforts at the state and federal level. But much more work will need to be done to avoid repeating the mistakes that have bedeviled other industries facing similar land-use backlash cycles, such as fossil fuel extraction, mining, and renewable energy infrastructure development.
Two primary issues undergird the data center mega-backlash we’re seeing today: fears about energy use and confusion over water consumption.
Starting with energy, it’s important to say that data center development currently correlates with higher electricity rates in areas where projects are being built, but the industry challenges the presumption that it is solely responsible for that phenomenon. In the eyes of opponents, utilities are scrambling to construct new power supplies to meet projected increases in energy demand, and this in turn is sending bills higher.
That’s because, as I’ve previously explained, data centers are getting power in two ways: off the existing regional electric grid or from on-site generation, whether larger new facilities (like new gas plants or solar farms) or diesel generators for backup purposes. But building new power infrastructure on site takes time, and speed is the name of the game right now in the AI race, so many simply attach to the existing grid.
Areas with rising electricity bills are more likely to ban or restrict data center development. Take just one example: Aurora, Illinois, a suburb of Chicago and the second most populous city in the state. Aurora instituted a 180-day moratorium on data center development last fall after receiving numerous complaints about data centers from residents, including a litany related to electricity bills. More than 1.5 gigawatts of data center capacity already operates in surrounding Kane County, where residential electricity rates are at a three-year high and are expected to increase over the near term – contributing to a high risk of opposition to new projects.
The second trouble spot is water, which data centers need to cool their servers. Project developers face a huge hurdle in the form of viral stories about households near data centers that suddenly lack a drop to drink. Prominent examples activists bring up include the tale of a family living next to a Meta facility in Newton County, Georgia, and the account of people living around an Amazon Web Services center in St. Joseph County, Indiana. Unsurprisingly, the St. Joseph County Council rejected a new data center in response to, among other things, very vocal water concerns. (It’s worth noting that the actual harm caused to water systems by data centers is at times both over- and understated, depending on the facility and location.)
“I think it’s very important for the industry as a whole to be honest that living next to [a data center] is not an ideal situation,” said Caleb Max, CEO of the National Artificial Intelligence Association, a new D.C.-based trade group launched last year that represents Oracle and myriad AI companies.
Polling shows that data centers are less popular than the use of artificial intelligence overall, Max told me, so more needs to be done to communicate the benefits that come from their development – including empowering AI. “The best thing the industry could start to do is, for the people in these zip codes with the data centers, those people need to more tangibly feel the benefits of it.”
Many in the data center development space are responding quickly to these concerns. Companies are clearly trying to get out ahead on energy, with the biggest example arriving this week from Microsoft, which pledged to pay more for the electricity it uses to power its data centers. “It’s about balancing that demand and market with these concerns. That’s why you’re seeing the industry lean in on these issues and more proactively communicating with communities,” said Dan Diorio, state policy director for the Data Center Coalition.
There’s also an effort underway to develop national guidance for data centers, led by the National Electrical Manufacturers Association; the American Society of Heating, Refrigerating and Air-Conditioning Engineers; and the Pacific Northwest National Laboratory, and expected to surface publicly by this summer. Some of the guidance has already been published, such as this document on energy storage best practices, which is intended to help data centers properly use storage solutions that can reduce reliance on diesel generators, an environmental concern in many communities. But the guidance will ultimately include discussions of cooling, too, which can be a water-intensive practice.
“It’s a great example of an instance where industry is coming together and realizing there’s a need for guidance. There’s a very rapidly developing sector here that uses electricity in a fundamentally different way, that’s almost unprecedented,” Patrick Hughes, senior vice president of strategy, technical, and industry affairs for NEMA, told me in an interview Monday.
Personally, I’m unsure whether these voluntary efforts will be enough to assuage the concerns of local officials. They certainly aren’t convincing folks like Jon Green, a member of the Board of Supervisors in Johnson County, Iowa. Johnson County is a populous area, home to the University of Iowa campus, and Green told me that to date it hasn’t really gotten any interest from data center developers. But that didn’t stop the county from instituting a one-year moratorium in 2025 to block projects and give itself time to develop regulations.
I asked Green if there’s a form of responsible data center development. “I don’t know if there is, at least where they’re going to be economically feasible,” he told me. “If we say they’ve got to erect 40 wind turbines and 160 acres of solar in order to power a data center, I don’t know if when they do their cost analysis that it’ll pencil out.”
Plus a storage success near Springfield, Massachusetts, and more of the week’s biggest renewables fights.
1. Sacramento County, California – A large solar farm might go belly-up thanks to a fickle utility and fears of damage to old-growth trees.
2. Hampden County, Massachusetts – The small Commonwealth city of Agawam, just outside of Springfield, is the latest site of a Massachusetts uproar over battery storage…
3. Washtenaw County, Michigan – The city of Saline, southwest of Detroit, is now banning data centers for at least a year – and also drafting regulations around renewable energy.
4. Dane County, Wisconsin – Another city with a fresh data center moratorium this week: Madison, home of the Wisconsin Badgers.
5. Hood County, Texas – Last but not least, I bring you one final stop on the apparent data center damnation tour: Hood County, south of Fort Worth.
A conversation with San Jose State University researcher Ivano Aiello, who’s been studying the aftermath of the catastrophe at Moss Landing.
This week’s conversation is with Ivano Aiello, a geoscientist at San Jose State University in California. I interviewed Aiello a year ago, when I began investigating the potential harm caused by the battery fire at Vistra’s Moss Landing facility, perhaps the largest battery storage fire of all time. The now-closed battery plant is located near the university, and Aiello happened to be studying a nearby estuary and wildlife habitat when the fire took place. He was therefore able to closely track metals contamination from the site. When we last spoke, he told me that he was working on a comprehensive, peer-reviewed study of the impacts of the fire.
That research was recently published and has a crucial lesson: We might not be tracking the environmental impacts of battery storage fires properly.
The following conversation was lightly edited for clarity.
Alright, let’s start from the top – please tell my readers what your study ultimately found.
The bottom line is that we detected deposition of fine airborne particles of cathode material – nickel, manganese, and cobalt – in the area surrounding the battery storage facility. We detected those particles in the field right after the fire, sampled the soils, and found the visible presence of those particles using different techniques. We kept measuring the location in the field over several months after the fire.
The critical thing is, we had baseline data. We had been surveying those areas long before the fire. Those metals were in much higher concentrations than they were before, and they were clearly related to the batteries. You can see that. And we were able to see changes in surface concentrations in the soils over time, including from weather – once the rains started, there was a significant decrease in concentrations of the metals, potentially related to runoff. Some of them migrated into the soil.
What we also noticed is that the protocols that have been used to look at soil contamination call for a surface sample of 3 inches. If your sample is that thick and the layer of metal deposit is only 1 or 5 millimeters, the deposit gets diluted across the whole sample. If you use standard protocols, you’re not going to find anything.
What does that mean for testing areas around big battery storage fires?
That’s exactly what I hope this work helps with. Procedures designed in the past are for different types of disasters and incidents, ones that are more like landslides than ash fallout from a fire. These metal particles are a few microns thick, so they slide away easily.
It means we have to rethink how we go about measuring contamination after industrial fires and, yes, battery fires. Because otherwise it’s just completely useless – you’re diluting everything.
The other thing we learned is that ashfall deposits are very patchy. You can take samples a few feet apart and find huge differences. You can’t just go out there and take three samples in three places; you have to sample at a much higher resolution, because otherwise you’ll miss the whole story.
When it comes to the takeaways from this study, what exactly do you think the lessons should be for the battery companies and regulators involved?
There are a lot of lessons we learned from this fire. The first is that having baseline data around a potential fire site is important, because then you can better understand what comes after.
Then, the main way to assess the potential hazards during and after the fire is air quality measurements. But that doesn’t tell you what’s in the air. You could have a high concentration of pollen, and then you know the quality of the air, but if you replace that with metal it is different. It’s not just how much you’re breathing, but what you are breathing.
Also, fast response. [Vistra] just released a report on soil saying there was nothing … but the sampling was done eight months after the fire. Our study shows after the fire you have this pulse of dust, and then it moves. Stuff moves to soil, across habitat. So if you don’t go out there right away, you might miss the whole thing.
Finally, what we found was that the fallout from the fire was not a bullseye pattern centered at the facility but rather offset kilometers away because of the wind.
We didn’t know much about this before because we didn’t have a real case study. This is the first real live event in which we can actually see the effects of a large battery burning.