What happens when you can’t run and you can’t hide?
You did everything right.
You had your go-bag ready and you knew your evacuation route. You monitored the wildfire as it moved closer and closer to your home, and you kept the volume turned up on your phone so you could heed a “LEAVE NOW” notice if one came. When it finally does, jolting you awake in the middle of the night, you realize that you can smell the smoke inside. When did the fire get so close?
The power is out, so you make your way downstairs using your phone’s flashlight. You have to Google how to manually open the garage door since the electronic clicker doesn’t work (oh, so that’s what the red cord is for). Your heart is thumping, but you’ve made it, you’re in your car; you even remembered to keep it filled to half a tank in preparation. You pull out of your driveway and onto the dirt road that leads out of your rural neighborhood. The night sky ahead of you is a weird neon orange.
You have to hit your brakes when you reach the intersection at the main road. It’s completely backed up with other evacuees, their red taillights stretching ahead through the thickening smoke as far as your eye can see. Some of your neighbors are pulling their boats on trailers; there is an RV up ahead. And you can see the fire burning down the side of the hill now — toward you, toward the gridlocked traffic that isn’t moving.
[Video: Harrowing Fort McMurray wildfire escape (via youtu.be)]
Leaving your home is only the beginning of a wildfire evacuation. But the next step — the drive to a safe location — is usually given no more attention in preparedness guides than the reminder to “follow the directions of emergency officials.” In the best-case scenarios, where communication is clear and early and residents are prepared, that might be enough. But when communication breaks down, or fires move fast and unpredictably, traffic can reach a dangerous standstill and familiar roads can transform into death traps.
In 2015, some 20 vehicles were overcome by a fire while stuck in a traffic jam on Interstate 15 between Los Angeles and Las Vegas; on the same interstate in Utah five years later, a backup nearly became deadly as a fire burned up to the road’s shoulder and panicked travelers abandoned their cars. Fire evacuations in New South Wales, Australia, in 2020 resulted in a 10-hour backup, and Canada’s Highway 3 had bumper-to-bumper traffic earlier this month because it was the only road out of imperiled Yellowknife. In 2020, some 200 people had to be evacuated by helicopter from California’s Sierra National Forest after a fire cut off their only exit route.
And when people die in wildfires, they are often found in their vehicles. In Portugal, 47 of the 64 people killed during a 2017 forest fire were in their cars, trying to escape. At least 10 people were found dead in or near their cars after the 2018 Camp fire, the deadliest blaze in California’s history. And in Lahaina, Hawaii, this month, in what the Los Angeles Times has called “surely … the deadliest traffic jam in U.S. history,” the lack of advance warning combined with inexplicably blocked roads led an untold number of people to perish in their cars while trying to evacuate, including a 7-year-old boy who was fleeing with his family; a man who used his last moments attempting to shield a beloved golden retriever in his hatchback; and a couple who were reportedly found in each other’s arms.
In a best-case scenario, emergency managers are able to phase evacuations in such a way that the roads don’t get backed up and residents have plenty of time to make it to safety. But wildfire is anything but predictable, and officials who call for an evacuation too soon can risk skeptical residents deciding to take a “wait and see” approach, where they only get in their car once things start to look dicey. In one 2017 study, only a quarter of people in wildfire-prone neighborhoods actually left as soon as they received an evacuation notice (other studies have found higher levels of compliance). This is the worst nightmare from an emergency management standpoint, since “evacuating at the last minute is probably the most dangerous thing you can do,” Sarah McCaffrey, one of the 2017 study’s authors, told The New Yorker.
Further complicating matters is the fact that many wildfire-prone areas are isolated or rural regions with a limited number of egresses to work with. One 2019 investigation found that in California alone, 350,000 people live in areas “that have both the highest wildfire risk designation, and either the same number or fewer exit routes per person as Paradise” — the site of the 2018 Camp fire, where backups on roads prevented many from escaping.
Evacuation traffic also doesn’t behave like the rush hour traffic we’re more familiar with. It’s “a peak of a peak,” with the congestion caused by “the sheer amount of people trying to leave and load onto the roadway at the same time in the same direction,” Stephen Wong, a wildfire evacuation researcher and an assistant professor of transportation engineering at the University of Alberta, told me. Burnovers and hazards like downed powerlines or trees can further reduce exit options, funneling all evacuees onto the same low-capacity roads. Worse, once that congestion starts to form, “you actually reduce the number of vehicles being able to go through that section,” Wong added. “So you go from 2,000 vehicles per hour [per lane], and it drops to, like, 500 vehicles per hour.”
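Wong’s flow figures make it easy to see why clearance times balloon once congestion forms. As a rough back-of-the-envelope sketch (only the per-lane flow rates come from Wong’s example; the community size and lane count below are hypothetical):

```python
# Back-of-the-envelope evacuation clearance time.
# Only the per-lane flow rates (~2,000 veh/hour free-flow vs. ~500 congested)
# echo Wong's example; the vehicle count and lane count are hypothetical.

def clearance_hours(vehicles: int, lanes: int, flow_per_lane: float) -> float:
    """Hours needed to push `vehicles` through `lanes` at a given per-lane flow."""
    return vehicles / (lanes * flow_per_lane)

EVACUATING_VEHICLES = 6_000  # e.g., ~3,000 households leaving with two cars each
LANES_OUT = 2                # a single two-lane highway out of town

print(clearance_hours(EVACUATING_VEHICLES, LANES_OUT, 2_000))  # free flow: 1.5
print(clearance_hours(EVACUATING_VEHICLES, LANES_OUT, 500))    # congested: 6.0
```

The same number of cars takes four times as long to clear once congestion cuts per-lane throughput, which is why late, compressed departures are so dangerous.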
Households will also frequently evacuate with multiple cars — rather than leave a valuable asset behind to burn — and tow trailers, boats, and RVs. As a result, the average vehicle length increases by 3% during wildfire evacuations, one recent study that looked at the 2019 Kincade fire in California found — leading, of course, to even worse congestion. (Agonizingly, Wong’s research further uncovered that over half of evacuating households “had at least two or more spare seats available”). The Kincade study also discovered that drivers significantly slow down during wildfire evacuations — contrary to the common misconception of careening, panicked escapees — likely due to a combination of factors such as lowered visibility and more cautious driving.
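The capacity effect of longer vehicles can be seen with a toy car-following model, in which lane flow equals speed divided by the road space each vehicle occupies. The speed, base vehicle length, and following gap below are illustrative assumptions; only the 3% length increase echoes the Kincade finding:

```python
# Toy car-following model: lane flow = speed / (vehicle length + following gap).
# Speed, base length, and gap are illustrative assumptions; only the 3%
# average-length increase comes from the Kincade study's finding.

def flow_veh_per_hour(speed_mps: float, veh_len_m: float, gap_m: float) -> float:
    """Vehicles per hour passing a point in one lane."""
    spacing_m = veh_len_m + gap_m  # road length each vehicle occupies
    return 3600 * speed_mps / spacing_m

base = flow_veh_per_hour(speed_mps=8.0, veh_len_m=4.8, gap_m=10.0)
towing = flow_veh_per_hour(speed_mps=8.0, veh_len_m=4.8 * 1.03, gap_m=10.0)
print(base > towing)  # longer average vehicles -> lower lane throughput
```

A few percent of extra length per vehicle shaves throughput on every lane; combined with extra household vehicles and slower speeds, the losses compound.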
Because “most [evacuation] research focuses on hurricanes and then tornadoes,” Salman Ahmad, a traffic engineer at the civil engineering firm Fleis & VandenBrink, told me, “traffic simulations — how traffic moves during a wildfire — are still lacking.” When emergency planners use computer models to calculate minimum evacuation times for their jurisdictions, for example, their assumptions can be deadly. “If you plan for an allocation considering normal traffic as a benchmark, you’re basically not making the right assumption because you need to put in that extra safety margin” to account for “the fact that people slow down,” Enrico Ronchi, a fire researcher at Lund University in Sweden and the author of the Kincade study, told me.
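Ronchi’s point about safety margins reduces to one line of arithmetic: an evacuation-time estimate benchmarked on everyday speeds must be inflated to account for slower evacuation driving. The route length, normal speed, and slowdown factor here are hypothetical:

```python
# Sketch of Ronchi's safety-margin point: benchmark travel time on normal
# speeds, then account for evacuees driving slower. All inputs hypothetical.

def evacuation_time_hours(route_miles: float, normal_mph: float,
                          slowdown_factor: float) -> float:
    """Travel time when evacuees move at `slowdown_factor` x normal speed."""
    return route_miles / (normal_mph * slowdown_factor)

ROUTE_MILES = 12.0  # distance to a lower-hazard area
NORMAL_MPH = 30.0   # everyday driving speed on the route

print(evacuation_time_hours(ROUTE_MILES, NORMAL_MPH, 1.0))  # naive benchmark: 0.4
print(evacuation_time_hours(ROUTE_MILES, NORMAL_MPH, 0.6))  # with a 40% slowdown
```

A 40% slowdown turns a 24-minute drive into 40 minutes before any queueing at all; congestion then compounds on top of that.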
Wong agreed, stressing that the number of variables fire managers need to juggle is dizzying. “Evacuations are really complex events that involve human behavior, risk perceptions, communication, emergency management, operations, the transportation system itself, psychology, the built environment, and biophysical fire,” Wong said. “So we have a long way to go for evidence-based and sufficient planning that can actually operationalize and prepare communities for these types of events.”
And that’s the scary thing: A person or a community might do everything right and still be at grave risk because of all the unknowns. Evacuation alerts might not get sent or arrive too late; exit routes might become unexpectedly blocked; fires might leapfrog, via flying embers, to create new spot fires that cut off egresses. Paradise, California, famously had a phased evacuation plan in place and had even run community wildfire drills, but even the best-laid plans can unravel.
Tom Cova, a geography professor at the University of Utah who has been studying wildfire evacuations for 30 years, told me that “too many communities may be planning for the roads to be open, the wireless emergency alert systems to work, there not to be tons of kids at home that day — you can just go down the list of things that [could go] wrong and think, What’s the backup plan?” The uncomfortable truth is that we need plans B, C, and D for when evacuations fail. Because they will fail.
Take Lahaina, where a closed bypass road concentrated outbound traffic onto a single, jam-packed street. When people started to panic and abandon their cars, it ultimately further obstructed the road for everyone behind them. “It’s like a chain reaction, where each car is seeing the [people in the] car in front of them run,” Cova said. “And then you look behind you, you can’t back up. If you look to the sides, you’re stuck. And then you say, ‘We’re going into the ocean, too.’”
That improvisation ultimately saved some lives. But “it’s hard for emergency managers to order this kind of thing because what if people drowned?” Cova went on. “So you’re trading one risk for another risk.”
But the need for creative improvisation is also a conclusion that’s been reached by the National Institute of Standards and Technology (NIST), the federal agency that develops standards and guidance for engineers and emergency responders. In new guidance released last week, NIST used the Camp fire as its case study and found “evacuation is not a universal solution,” explaining there are times when “it may be better for residents to shelter in their community at a designated safety zone” rather than attempt to drive out of town.
This is a somewhat radical position for a U.S. agency since evacuations have long been the foundation of American wildfire preparations. But the thinking now appears to be turning toward asking “what shelters do we have?” if and when a worst-case scenario arises, as Cova further explained to me. “Temporary refuge areas, high schools, churches, large parking lots, large sports fields, golf courses, swimming pools — I wouldn’t recommend using any of these things, and I wouldn’t recommend people being told to use them,” he said, “but [people] have to know what to do when they can’t get out.”
In the case of Paradise, for example, NIST reports that there were 31 such “temporary refuge areas” that ultimately saved 1,200 lives during the fire, including 14 parking lots, seven roadways, six structures, and a handful of defensible natural areas, like a pre-established wildfire assembly area in a meadow that had already burned and ended up serving as a refuge for as many as 85 people. Once established, these concentrated refuge areas can be defended by firefighters, as was the case for 150 people who memorably hunkered down to wait out the blaze in a strip mall parking lot. It’s far from a best-case scenario, but that’s still 150 people who would’ve otherwise been stuck in potentially deadly traffic jams trying to get out of town.
Temporary refuges are unplanned areas of last resort, but establishing a larger safety zone network and preemptively hardening gathering places like schools and community centers could also potentially reduce exposure on roads by shortening the distance evacuees need to travel to get to lower-hazard areas. So-called WUI fire shelters — essentially, personal fire bunkers that NIST warns against because they aren’t standardized in the U.S. but are popular in Australia — could also be explored. “That’s the direction we’re heading in with wildfire communities,” Cova told me grimly, “because we don’t seem to be able to stop the development in these areas. That means we’re forcing people into a corner where shelter is their only backup plan.”
Maybe this is difficult for you to imagine: Your community is different; a wildfire couldn’t happen here. You’d evacuate as soon as you got the notice; there’s no way you’d get stuck. You’re a good driver; you could get out without help. But as Lahaina and other “unprecedented” fires show, it’s the limits of our lived experiences that we’re up against now.
“We should think about possible scenarios that we have not seen before in our communities,” Ronchi, the Swedish fire researcher, said. “I understand that it’s a bit of a challenge for everyone because often you have to invest money for something that you have not experienced directly. But we are [living] in scenarios now in which we cannot anchor ourselves on our past experiences only.”
Amarillo-area residents successfully beat back a $600 million project from Xcel Energy that would have provided useful tax revenue.
Power giant Xcel Energy just suffered a major public relations flap in the Texas Panhandle, scrubbing plans for a solar project amidst harsh backlash from local residents.
On Friday, Xcel Energy withdrew plans to build a $600 million solar project right outside of Rolling Hills, a small, relatively isolated residential neighborhood just north of the city of Amarillo, Texas. The project was part of several solar farms it had proposed to the Texas Public Utilities Commission to meet the load growth created by the state’s AI data center boom. As we’ve covered in The Fight, Texas should’ve been an easier place to do this, and there were few if any legal obstacles standing in the way of the project, dubbed Oneida 2. It was sited on private lands, and Texas counties lack the sort of authority to veto projects you’re used to seeing in, say, Ohio or California.
But a full-on revolt from homeowners and realtors apparently created a public relations crisis.
Mere weeks ago, shortly after word of the project made its way through the small community that is Rolling Hills, more than 60 complaints were filed with the Texas Public Utilities Commission in protest. When Xcel organized a public forum to try to educate the public about the project’s potential benefits, at least 150 residents turned out, overwhelmingly to oppose its construction. This led the Minnesota-based power company to say it would scrap the project entirely.
Xcel has tried to put a happy face on the situation. “We are grateful that so many people from the Rolling Hills neighborhood shared their concerns about this project because it gives us an opportunity to better serve our communities,” the company said in a statement to me. “Moving forward, we will ask for regulatory approval to build more generation sources to meet the needs of our growing economy, but we are taking the lessons from this project seriously.”
But what lessons, exactly, could Xcel have learned? What seems to have happened is that it simply tried to put a solar project in the wrong place, prizing convenience and proximity to an existing electrical grid over the risk of backlash in an area with a conservative, older population that is resistant to change.
Just ask John Coffee, one of the commissioners for Potter County, which includes Amarillo, Rolling Hills, and a lot of characteristically barren Texas landscape. As he told me over the phone this week, this solar farm would’ve been the first utility-scale project in the county. For years, he said, renewable energy developers have explored potentially building a project in the area. He’s entertained those conversations for two big reasons: the potential tax revenue benefits he’s seen elsewhere in Texas, and the fact that ordinarily, a project like Oneida 2 would’ve been welcomed in any of the pockets of brush and plain where people don’t actually live.
“We’re struggling with tax rates and increases and stuff. In the proper location, it would be well-received,” he told me. “The issue is, it’s right next to a residential area.”
Indeed, Oneida 2 would’ve been smack dab up against Rolling Hills, occupying what project maps show would be the land surrounding the neighborhood’s southeast perimeter – truly the sort of encompassing adjacency that anti-solar advocates like to describe as a bogeyman.
Coffee also told me he wasn’t notified about the project’s existence until a few weeks ago, at the same time resident complaints began to reach a fever pitch. He recalled hearing from homeowners who were worried that they’d no longer be able to sell their properties. When I asked him if there was any data backing up the solar farm’s potential damage to home prices, he said he didn’t have hard numbers, but that the concerns he heard directly from the head of Amarillo’s Realtors Association should be evidence enough.
Many of the complaints against Oneida 2 were the sort of stuff we’re used to at The Fight, including fears of fires and stormwater runoff. But Coffee said it really boiled down to property values – and the likelihood that the solar farm would change the cultural fabric in Rolling Hills.
“This is a rural area. There are about 300 homes out there. Everybody sitting out there has half an acre, an acre, two acres, and they like to enjoy the quiet, look out their windows and doors, and see some distance,” he said.
Ironically, Coffee opposed the project at the urging of his constituents, but is now publicly asking Xcel to continue to develop solar in the county. “Hopefully they’ll look at other areas in Potter County,” he told me, adding that at least one resident has already come to him with potential properties the company could acquire. “We could really use the tax money from it. But you just can’t harm a community for tax dollars. That’s not what I’m about.”
I asked Xcel how all this happened and what their plans are next. A spokesperson repeatedly denied my requests to discuss Oneida 2 in any capacity. In a statement, the company told me it “will provide updates if the project is moved to another site,” and that “the company will continue to evaluate whether there is another location within Potter County, or elsewhere, to locate the solar project.”
Meanwhile, Amarillo may be about to welcome data center development anyway, and there’s speculation that the first AI Stargate facility may be sited near the city as well.
City officials will decide in the coming weeks whether to finalize a key water agreement with Fermi America, a new company cofounded by former Texas Governor Rick Perry, for a 5,600-acre private “hypergrid” project that the company says will provide upwards of 11 gigawatts to help fuel artificial intelligence services. Fermi claims that at least 1 gigawatt of power will be available by the end of next year – a lot of power.
The company promises that its “hypergrid” AI campus will use on-site gas and nuclear generation, as well as contracted gas and solar capacity. One thing’s for sure – it definitely won’t be benefiting from a large solar farm nearby anytime soon.
Plus more of the most important news about renewable energy project fights this week.
1. Racine County, Wisconsin – Microsoft is scrapping plans for a data center after fierce opposition from a host community in Wisconsin.
2. Rockingham County, Virginia – Another day, another chokepoint in Dominion Energy’s effort to build more solar energy to power surging load growth in the state, this time in the quaint town of Timberville.
3. Clark County, Ohio – This county is one step closer to its first utility-scale solar project, despite the local government restricting development of new projects.
4. Coles County, Illinois – Speaking of good news, this county reaffirmed the special use permit for Earthrise Energy’s Glacier Moraine solar project, rebuffing loud criticisms from surrounding households.
5. Lee County, Mississippi – It’s full steam ahead for the Jugfork solar project in Mississippi, a Competitive Power Ventures proposal that is expected to feed electricity to the Tennessee Valley Authority.
A conversation with Enchanted Rock’s Joel Yu.
This week’s chat was with Joel Yu, senior vice president for policy and external affairs at the data center micro-grid services company Enchanted Rock. Now, Enchanted Rock does work I usually don’t elevate in The Fight – gas-power tracking – but I wanted to talk to him about how conflicts over renewable energy are affecting his business, too. You see, when you talk to solar or wind developers about the potential downsides in this difficult economic environment, they’re willing to be candid … but only to a certain extent. As I expected, someone like Yu who is separated enough from the heartburn that is the Trump administration’s anti-renewables agenda was able to give me a sober truth: Land use and conflicts over siting are going to advantage fossil fuels in at least some cases.
The following conversation was lightly edited for clarity.
Help me understand where, from your perspective, the generation for new data centers is going to come from. I know there are gas turbine shortages, but also that solar and wind are dealing with headwinds in the United States given cuts to the Inflation Reduction Act.
There are a lot of stories out there about certain technologies coming out to the forefront to solve the problem, whether it’s gas generation or something else. But the scale and the scope of this stuff … I don’t think there is a silver bullet where it’s all going to come from one place.
The Energy Department put out a request for information looking for ways to get to 3 gigawatts quickly, but I don’t think there is any way to do that quickly in the United States. It’s going to take work from generation developers, batteries, thermal generation, emerging storage technologies, and transmission. Reality is, whether it is supply chain issues or technology readiness or the grid’s readiness to accept that load generation profile, none of it is ready. We need investment and innovation on all fronts.
How do conflicts over siting play into solving the data center power problem? Like, how much of the generation that we need for data center development is being held back by those fights?
I do have an intuitive sense that the local siting and permitting concerns around data centers are expanding in scope from the normal noise and water considerations to include impacts to energy affordability and reliability, as well as the selection of certain generation technologies. We’ve seen diesel generation, for example, come into the spotlight. It’s had to do with data center permitting in certain jurisdictions, in places like Maryland and Minnesota. Folks are realizing that a data center comes with a big power plant – their diesel generation. When other power sources fall short, they’ll rely on their diesel more frequently, so folks are raising red flags there. Then, with respect to gas turbines or large cycle units, there’s concerns about viewsheds, noise and cooling requirements, on top of water usage.
How many data center projects are getting their generation on-site versus through the grid today?
Very few are using on-site generation today. There’s a lot of talk about it and interest, but in order to serve our traditional cloud services data center or AI-type loads, they’re looking for really high availability rates. That’s really costly and really difficult to do if you’re off the grid and being serviced by on-site generation.
In the context of policy discussions, co-location has primarily meant baseload resources on sites that are serving the data centers 24/7 – the big stories behind Three Mile Island and the Susquehanna nuclear plant. But to be fair, most data centers operational today have on-site generation. That’s their diesel backup, which backstops grid reliability.
I think where you’re seeing innovation is modular gas storage technologies and battery storage technologies that try to come in and take the space of the diesel generation that is the standard today, increasing the capability of data centers in terms of on-site power relative to status quo. Renewable power for data centers at scale – talking about hundreds of megawatts at a time – I think land is constraining.
If a data center is looking to scale up and play a balancing act of computing capacity versus land for energy production, the computing capacity is extremely valuable. They’re going to prioritize that first and pack as much as they can into whatever land they have to develop. Data centers trying to procure zero-carbon energy are primarily focused on getting that energy over wires. Grid connection, transmission service for large-scale renewables that can match the scale of natural gas – there’s still very strong demand to stay connected to the grid for reliability and sustainability.
Have you seen the state of conflict around renewable energy development impact data center development?
Not necessarily. There is an opportunity for data center development to coincide with renewable project development from a siting perspective, if they’re going to be co-located or near to each other in remote areas. For some of these multi-gigawatt data centers, the reason they’re out in the middle of nowhere is a combination of favorable permitting and siting conditions for thousands of acres of data center building, substations and transmission –
Sorry, but even for projects not siting generation, if megawatts – if not gigawatts – are held up from coming to the grid over local conflicts, do you think that’s going to impact data center development at all? The affordability conversations? The environmental ones?
Oh yeah, I think so. In the big picture, the concern is if you can integrate large loads reliably and affordably. Governors, state lawmakers are thinking about this, and it’s bubbling up to the federal level. You need a broad set of resources on the grid to provide that adequacy. To the extent you hold up any grid resources, renewable or otherwise, you’re going to be staring down some serious challenges in serving the load. Virginia’s a good example, where local groups have held up large-scale renewable projects in the state, and Dominion’s trying to build a gas peaker plant that’s being debated, too. But in the meantime, it is Data Center Alley, and there are gigawatts of data centers that continue to want to get in and get online as quickly as possible. But the resources to serve that load are not coming online in time.
The push toward co-location probably does favor thermal generation and battery storage technologies over straight renewable energy resources. But a battery can’t cover 24/7 use cases for a data center, and neither will our unit. We’re positioned to be a bridge resource for 24/7 use for a few years until they can get more power to the market, and then we can be a flexible backup resource – not a replacement for the large-scale and transmission-connected baseload power resources, like solar and wind. Texas has benefited from huge deployments of solar and wind. That has trickled down to lower electricity costs. Those resources can’t do it alone, and there’s thermal to balance the system, but you need it all to meet the load growth.