It’s flawed, but not worthless. Here’s how you should think about it.
Starting this month, the tens of millions of Americans who browse the real-estate listings website Zillow will encounter a new type of information.
In addition to disclosing a home’s square footage, school district, and walkability score, Zillow will begin to tell users about its climate risk — the chance that a major weather or climate event will strike in the next 30 years. It will focus on the risk from five types of dangers: floods, wildfires, high winds, heat, and air quality.
The data has the potential to transform how Americans think about buying a home, especially because climate change will likely worsen many of those dangers. About 70% of Americans look at Zillow at some point during the process of buying a home, according to the company.
“Climate risks are now a critical factor in home-buying decisions,” Skylar Olsen, Zillow’s chief economist, said in a statement. “Healthy markets are ones where buyers and sellers have access to all relevant data for their decisions.”
That’s true — if the information is accurate. But can homebuyers actually trust Zillow’s climate risk data? When climate experts have looked closely at the underlying data Zillow uses to assess climate risk, they have walked away unconvinced.
Zillow’s climate risk data comes from First Street Technology, a New York-based company that uses computer models to estimate the risk that weather and climate change pose to homes and buildings. It is far and away the most prominent company focused on modeling the physical risks of climate change. (Although it was initially established as a nonprofit foundation, First Street reorganized as a for-profit company and accepted $46 million in investment earlier this year.)
But few experts believe that tools like First Street’s are capable of actually modeling the dangers of climate change at a property-by-property level. A report from a team of White House scientific advisors concluded last year that these models are of “questionable quality,” and a Bloomberg investigation found that different climate risk models could return wildly different catastrophe estimates for the same property.
Not all of First Street’s data is seen as equally suspect. Its estimates of heat and air pollution risk have generally attracted less criticism from experts. But its estimates of flood and wildfire risk — the two most catastrophic dangers for homeowners — are generally thought to be inadequate at best.
So while Zillow will soon tell you with seeming precision that a certain home has a 1.1% chance of facing a wildfire in the next 30 years, potential homebuyers should take that kind of estimate with “a lot of grains of salt,” Michael Wara, a senior research scholar at the Stanford Woods Institute for the Environment, told me.
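For a sense of how much precision that figure implies, it helps to unwind a 30-year cumulative probability into an annual one. As a back-of-the-envelope illustration — assuming, purely for simplicity, a constant and independent chance each year, which is not how the underlying models actually work:

\[
1 - (1 - p)^{30} = 0.011 \quad\Longrightarrow\quad p = 1 - 0.989^{1/30} \approx 0.00037
\]

In other words, a stated 1.1% chance over 30 years works out to roughly a 0.037% chance in any single year — a degree of precision the models arguably can’t support.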
Here’s a short guide for how to think through Zillow’s estimates of climate risk.
Neither First Street nor Zillow immediately responded to requests for comment.
Zillow has said that, when the data is available, it will tell users whether a given home has flooded or burned in a wildfire recently. (It will also say whether a home is near a source of air pollution.)
Homebuyers should take that information seriously, Madison Condon, a Boston University School of Law professor who studies climate change and financial markets, told me.
“If the house flooded in the recent past, then that should be a major red flag to you,” she said. Houses that have flooded recently are very likely to flood again, she said. Only 10 states require a home seller to disclose a flood to a potential buyer.
First Street claims that its physics-based models can identify the risk that any individual property will flood. But the ability to determine whether a given house will flood depends on having an intricate knowledge of local infrastructure, including stormwater drains and what exists on other properties, and that data does not seem to exist in anyone’s model at the moment, Condon said.
When Bloomberg compared the output of three different flooding models, including First Street’s, they agreed on results for only 5% of properties.
If you’re worried about a home’s flood risk, then contact the local government and see if you can look at a flood map or even talk to a flood manager, Condon said. Many towns and cities keep flood maps in their records or on their website that are more granular than what First Street is capable of, she said.
“The local flood manager who has walked the property will almost always have a better grasp of flood risk than the big, top-down national model,” she said.
In some cases, Zillow will recommend that a home buyer purchase federal flood insurance. That’s generally not a bad idea, Condon said, even if Zillow reaches that conclusion using national model data that has errors or mistakes.
“It simply is true that way more people should be buying flood insurance than generally think they should,” she said. “So a general overcorrection on that would be good.”
If you’re looking at buying a home in a wildfire-prone area, especially in the American West, then you should generally assume that Zillow is underestimating the home’s wildfire risk, Wara, the Stanford researcher, told me.
That’s because computer models that estimate wildfire risk are in a fairly early stage of development and improving rapidly. Even the best academic simulations lack the kind of granular, structure-level data that would allow them to predict a property’s forward-looking wildfire risk.
That is actually a bigger problem for homebuyers than for insurance companies, he said. A home insurance company gets to decide whether to insure a property every year. If it looks at new science and concludes that a given town or structure is too risky, then it can raise its premiums or simply decline to cover the property at all. (State Farm stopped selling new home insurance policies in California last year, partly because of wildfire risk.)
But when homeowners buy a house, their lives and their wealth get locked into that property for 30 years. “Maybe your kids are going to the school district,” he said. It’s much harder to sell a home when you can’t get it covered. “You have an illiquid asset, and it’s a lot harder to move.”
That means First Street’s wildfire risk data should be taken as an "absolute minimum estimate," Wara said. In a wildfire-prone area, "the real risk is most likely much higher" than its models say.
Over the past several years, runaway wildland fires have killed dozens of people and destroyed tens of thousands of homes in Lahaina, Hawaii; Paradise, California; and Marshall, Colorado.
But in those cases, once the fire began incinerating homes, it ceased to be a wildland fire and became a structure-to-structure fire. The fire began to leap from house to house like a book of matches, condemning entire neighborhoods to burn within minutes.
Modern computer models do an especially poor job of simulating that transition — the moment when a wildland fire becomes an urban conflagration, Wara said. Perhaps only 0.5% of the most intense fires make that jump, but those fires are responsible for destroying the most homes.
But “how that happens and how to prevent that is not well understood yet,” he said. “And if they’re not well understood yet from a scientific perspective, that means it’s not in the [First Street] model.”
Nor do the best university wildfire models have good data on every individual property’s structural-level details — such as what material its walls or roof are made of — that would make it susceptible to fire.
When assessing whether your home faces wildfire risk, its structure is very important. But "you have to know what your neighbors’ houses look like, too, within about a 250-yard radius. So that’s your whole neighborhood," Wara said. "I don’t think anyone has that data."
A similar principle goes for thinking about flood risk, Condon said. Your home might not flood, she said, but it also matters whether the roads to your house are still driveable or whether the power lines fail. “It’s not particularly useful to have a flood-resilient home if your whole neighborhood gets washed out,” she said.
Experts agree that the most important interventions against wildfire — or, for that matter, flooding — have to happen at the community level. Although few communities are doing prescribed burns or fuel reduction programs right now, some are, Wara said.
But because nobody is collecting data about those programs, national risk models like First Street’s would not factor those programs into an area’s wildfire risk, he said. (In the rare case that a government is clearing fuel or doing a prescribed burn around a town, wildfire risk there might actually be lower than Zillow says, Wara added.)
Going forward, figuring out a property’s climate risk — much like pushing for community-level resilience investment — shouldn’t be left up to individuals, Condon said.
The state of California is investing in a public wildfire catastrophe model so that it can figure out which homes and towns face the highest risk. She said that Fannie Mae and Freddie Mac, the federal entities that buy home mortgages, could invest in their own internal climate-risk assessments to build the public’s capacity to understand climate risk.
“I would advocate for this not to be an every-man-for-himself, every-consumer-has-to-make-a-decision situation,” Condon said.
Amarillo-area residents beat back a $600 million project from Xcel Energy that would have provided useful tax revenue.
Power giant Xcel Energy just suffered a major public relations defeat in the Texas Panhandle, scrubbing plans for a solar project amid harsh backlash from local residents.
On Friday, Xcel Energy withdrew plans to build a $600 million solar project right outside of Rolling Hills, a small, relatively isolated residential neighborhood just north of the city of Amarillo, Texas. The project was part of several solar farms it had proposed to the Public Utility Commission of Texas to meet the load growth created by the state’s AI data center boom. As we’ve covered in The Fight, Texas should’ve been an easier place to do this, and there were few if any legal obstacles standing in the way of the project, dubbed Oneida 2. It was sited on private lands, and Texas counties lack the sort of authority to veto projects you’re used to seeing in, say, Ohio or California.
But a full-on revolt from homeowners and realtors apparently created a public relations crisis.
Mere weeks ago, shortly after word of the project made its way through the small community of Rolling Hills, more than 60 complaints were filed with the Public Utility Commission of Texas in protest. When Xcel organized a public forum to try to educate the public about the project’s potential benefits, at least 150 residents turned out, overwhelmingly to oppose its construction. This led the Minnesota-based power company to say it would scrap the project entirely.
Xcel has tried to put a happy face on the situation. “We are grateful that so many people from the Rolling Hills neighborhood shared their concerns about this project because it gives us an opportunity to better serve our communities,” the company said in a statement to me. “Moving forward, we will ask for regulatory approval to build more generation sources to meet the needs of our growing economy, but we are taking the lessons from this project seriously.”
But what lessons, exactly, could Xcel have learned? What seems to have happened is that it simply tried to put a solar project in the wrong place, prizing convenience and proximity to existing grid infrastructure while underrating the risk of backlash in an area with a conservative, older population that is resistant to change.
Just ask John Cotton, one of the commissioners for Potter County, which includes Amarillo, Rolling Hills, and a lot of characteristically barren Texas landscape. As he told me over the phone this week, this solar farm would’ve been the first utility-scale project in the county. For years, he said, renewable energy developers have explored building a project in the area. He’s entertained those conversations for two big reasons: the potential tax revenue benefits he’s seen elsewhere in Texas, and the fact that ordinarily, a project like Oneida 2 would’ve been welcomed in any of the pockets of brush and plain where people don’t actually live.
“We’re struggling with tax rates and increases and stuff. In the proper location, it would be well-received,” he told me. “The issue is, it’s right next to a residential area.”
Indeed, Oneida 2 would’ve been smack dab up against Rolling Hills, occupying what project maps show would be the land surrounding the neighborhood’s southeast perimeter – truly the sort of encompassing adjacency that anti-solar advocates like to describe as a bogeyman.
Cotton also told me he wasn’t notified about the project’s existence until a few weeks ago, around the same time resident complaints began to reach a fever pitch. He recalled hearing from homeowners who were worried that they’d no longer be able to sell their properties. When I asked him if there was any data backing up the solar farm’s potential damage to home prices, he said he didn’t have hard numbers, but that the concerns he heard directly from the head of Amarillo’s Realtors Association should be evidence enough.
Many of the complaints against Oneida 2 were the sort of stuff we’re used to at The Fight, including fears of fires and stormwater runoff. But Cotton said it really boiled down to property values – and the likelihood that the solar farm would change the cultural fabric in Rolling Hills.
“This is a rural area. There are about 300 homes out there. Everybody sitting out there has half an acre, an acre, two acres, and they like to enjoy the quiet, look out their windows and doors, and see some distance,” he said.
Ironically, Cotton opposed the project at the urging of his constituents, but is now publicly asking Xcel to continue to develop solar in the county. "Hopefully they’ll look at other areas in Potter County," he told me, adding that at least one resident has already come to him with potential properties the company could acquire. "We could really use the tax money from it. But you just can’t harm a community for tax dollars. That’s not what I’m about."
I asked Xcel how all this happened and what their plans are next. A spokesperson repeatedly denied my requests to discuss Oneida 2 in any capacity. In a statement, the company told me it “will provide updates if the project is moved to another site,” and that “the company will continue to evaluate whether there is another location within Potter County, or elsewhere, to locate the solar project.”
Meanwhile, Amarillo may be about to welcome data center development because of course it is, and there’s speculation that the first AI Stargate facility may be sited nearby, as well.
City officials will decide in the coming weeks whether to finalize a key water agreement with a 5,600-acre private "hypergrid" project from Fermi America, a new company cofounded by former Texas Governor Rick Perry, which the company says will provide upwards of 11 gigawatts to help fuel artificial intelligence services. Fermi claims that at least 1 gigawatt of power will be available by the end of next year – a lot of power.
The company promises that its “hypergrid” AI campus will use on-site gas and nuclear generation, as well as contracted gas and solar capacity. One thing’s for sure – it definitely won’t be benefiting from a large solar farm nearby anytime soon.
And more of the most important news about renewable projects fighting it out this week.
1. Racine County, Wisconsin – Microsoft is scrapping plans for a data center after fierce opposition from a host community in Wisconsin.
2. Rockingham County, Virginia – Another day, another chokepoint in Dominion Energy’s effort to build more solar energy to power surging load growth in the state, this time in the quaint town of Timberville.
3. Clark County, Ohio – This county is one step closer to its first utility-scale solar project, despite the local government restricting development of new projects.
4. Coles County, Illinois – Speaking of good news, this county reaffirmed the special use permit for Earthrise Energy’s Glacier Moraine solar project, rebuffing loud criticisms from surrounding households.
5. Lee County, Mississippi – It’s full steam ahead for the Jugfork solar project in Mississippi, a Competitive Power Ventures proposal that is expected to feed electricity to the Tennessee Valley Authority.
A conversation with Enchanted Rock’s Joel Yu.
This week’s chat was with Joel Yu, senior vice president for policy and external affairs at the data center microgrid services company Enchanted Rock. Now, Enchanted Rock does work I usually don’t elevate in The Fight – gas-powered microgrids – but I wanted to talk to him about how conflicts over renewable energy are affecting his business, too. You see, when you talk to solar or wind developers about the potential downsides in this difficult economic environment, they’re willing to be candid … but only to a certain extent. As I expected, someone like Yu, who is separated enough from the heartburn that is the Trump administration’s anti-renewables agenda, was able to give me a sober truth: Land use and conflicts over siting are going to advantage fossil fuels in at least some cases.
The following conversation was lightly edited for clarity.
Help me understand where, from your perspective, the generation for new data centers is going to come from. I know there are gas turbine shortages, but also that solar and wind are dealing with headwinds in the United States given cuts to the Inflation Reduction Act.
There are a lot of stories out there about certain technologies coming out to the forefront to solve the problem, whether it’s gas generation or something else. But the scale and the scope of this stuff … I don’t think there is a silver bullet where it’s all going to come from one place.
The Energy Department put out a request for information looking for ways to get to 3 gigawatts quickly, but I don’t think there is any way to do that quickly in the United States. It’s going to take work from generation developers, batteries, thermal generation, emerging storage technologies, and transmission. Reality is, whether it is supply chain issues or technology readiness or the grid’s readiness to accept that load generation profile, none of it is ready. We need investment and innovation on all fronts.
How do conflicts over siting play into solving the data center power problem? Like, how much of the generation that we need for data center development is being held back by those fights?
I do have an intuitive sense that the local siting and permitting concerns around data centers are expanding in scope from the normal noise and water considerations to include impacts to energy affordability and reliability, as well as the selection of certain generation technologies. We’ve seen diesel generation, for example, come into the spotlight. That’s had to do with data center permitting in certain jurisdictions, in places like Maryland and Minnesota. Folks are realizing that a data center comes with a big power plant – their diesel generation. When other power sources fall short, they’ll rely on their diesel more frequently, so folks are raising red flags there. Then, with respect to gas turbines or large combined-cycle units, there are concerns about viewsheds, noise, and cooling requirements, on top of water usage.
How many data center projects are getting their generation on-site versus through the grid today?
Very few are using on-site generation today. There’s a lot of talk about it and interest, but in order to serve our traditional cloud services data center or AI-type loads, they’re looking for really high availability rates. That’s really costly and really difficult to do if you’re off the grid and being serviced by on-site generation.
In the context of policy discussions, co-location has primarily meant baseload resources on sites that are serving the data centers 24/7 – the big stories behind Three Mile Island and the Susquehanna nuclear plant. But to be fair, most data centers operational today have on-site generation. That’s their diesel backup, which backstops grid reliability.
I think where you’re seeing innovation is in modular gas generation and battery storage technologies that try to come in and take the place of the diesel generation that is the standard today, increasing data centers’ on-site power capability relative to the status quo. Renewable power for data centers at scale – we’re talking about hundreds of megawatts at a time – I think land is constraining.
If a data center is looking to scale up and play a balancing act of computing capacity versus land for energy production, the computing capacity is extremely valuable. They’re going to prioritize that first and pack as much of it as they can into whatever land they have to develop. Data centers trying to procure zero-carbon energy are primarily focused on getting that energy over wires – grid connection and transmission service for large-scale renewables that can match the scale of natural gas. There’s still very strong demand to stay connected to the grid for reliability and sustainability.
Have you seen the state of conflict around renewable energy development impact data center development?
Not necessarily. There is an opportunity for data center development to coincide with renewable project development from a siting perspective, if they’re going to be co-located or near to each other in remote areas. For some of these multi-gigawatt data centers, the reason they’re out in the middle of nowhere is a combination of favorable permitting and siting conditions for thousands of acres of data center building, substations and transmission –
Sorry, but even for projects not siting generation, if megawatts – if not gigawatts – are held up from coming to the grid over local conflicts, do you think that’s going to impact data center development at all? The affordability conversations? The environmental ones?
Oh yeah, I think so. In the big picture, the concern is whether you can integrate large loads reliably and affordably. Governors, state lawmakers are thinking about this, and it’s bubbling up to the federal level. You need a broad set of resources on the grid to provide that adequacy. To the extent you hold up any grid resources, renewable or otherwise, you’re going to be staring down some serious challenges in serving the load. Virginia’s a good example, where local groups have held up large-scale renewable projects in the state, and Dominion’s trying to build a gas peaker plant that’s being debated, too. But in the meantime, it is Data Center Alley, and there are gigawatts of data centers that continue to want to get in and get online as quickly as possible. But the resources to serve that load are not coming online in time.
The push toward co-location probably does favor thermal generation and battery storage technologies over straight renewable energy resources. But a battery can’t cover 24/7 use cases for a data center, and neither can our units. We’re positioned to be a bridge resource for 24/7 use for a few years until they can get more power to the market, and then we can be a flexible backup resource – not a replacement for large-scale, transmission-connected baseload power resources like solar and wind. Texas has benefited from huge deployments of solar and wind. That has trickled down to lower electricity costs. Those resources can’t do it alone, and there’s thermal to balance the system, but you need it all to meet the load growth.