Concentrating solar power lost the solar race long ago. But the Department of Energy still has big plans for the technology.

Hundreds of thousands of mirrors blanket the desert of the American West, strategically angled to catch the sun and bounce its intense heat back to a central point in the sky. Despite their monumental size and futuristic look, these projects are far more under the radar than the acres of solar panels cropping up in communities around the country, simply because there are so few of them.
The technology is called concentrating solar power, and it’s not particularly popular. Of the thousands of big solar projects operating in the U.S. today, fewer than a dozen use it.
Concentrating solar power lags for many reasons: It remains much more expensive than installations that use solar panels, it can take up a lot of land, and it can fry birds that fly too close (a narrative that’s shadowed the industry and an issue it says it’s working to alleviate). Yet the government still has big aspirations for the technology.
To meet its climate goals and avert the catastrophe that comes with significant warming, the world must roll out renewable energy sources with unprecedented speed. But while the construction of solar and wind energy is surging, renewables still face two disadvantages that fossil fuels don't: They produce electricity only under certain conditions, like when the wind is blowing or the sun is shining. And there’s not much research on using them to power heavy industry, like cement and steel production.
That’s where concentrating solar power has an advantage. It has two big benefits that have long kept boosters invested in its success. First, concentrating solar power is usually constructed with built-in storage that's cheaper than large-scale batteries, so it can solve the intermittency challenges faced by other kinds of solar power. Plus, CSP can get super-hot — potentially hot enough for industrial processes like making cement. Taken together, those qualities allow the projects to function more like fossil fuel plants than fields of solar panels.
A few other carbon-free technologies — like nuclear power — are capable of doing much the same thing. The question is which technologies will be able to scale.
“We have goals of decarbonizing the entire energy sector, not just electricity, but the industrial sector as well, by 2050,” said Matthew Bauer, program manager for the concentrating solar-thermal power team at the Department of Energy’s Solar Technologies Office. “We think CSP is one of the most promising technologies to do that.”
In February, the Department of Energy broke ground in New Mexico on a project they see as a focal point for the future of CSP. It’s a bet that the technology can compete, despite past skepticism.
Concentrating solar plants can be built in different ways, but they’re all engineered to bounce sunlight off mirrors and concentrate it on a device called a receiver, which heats up whatever medium is inside it. The heat can power a turbine or an engine to produce electricity. The higher the heat, the more electricity is produced and the lower the cost of producing it.
The CSP installation in New Mexico will look a lot like past projects, with a field of mirrors pointing towards a tall tower. But one element sets it apart: big boxes of sand-like particles. When it’s completed next year, it will be the first known CSP project to use solid particles like sand or ceramics to transfer heat, according to Jeremy Sment, a mechanical engineer leading the team designing the project at Sandia National Laboratories.
For years, scientists sought a material that would get hot enough to improve CSP’s efficiency and costs. Past commercial CSP projects have topped out around 550 degrees Celsius. For this new project, which the Department of Energy calls “generation three,” the team is hoping to exceed 700 degrees Celsius, and has tested the particles above 1,000 degrees Celsius, the temperature of volcanic magma.
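Why does temperature matter so much? A rough sense comes from the ideal (Carnot) limit on converting heat to electricity, which depends only on how hot the receiver gets relative to its surroundings. The snippet below is a back-of-the-envelope illustration, not a model of any particular plant; real CSP systems fall well short of these bounds, but the trend holds.

```python
# Illustrative only: the Carnot limit, eta = 1 - T_cold / T_hot (in kelvin),
# caps how much of the collected heat can become electricity.

def carnot_limit(t_hot_c: float, t_cold_c: float = 25.0) -> float:
    return 1.0 - (t_cold_c + 273.15) / (t_hot_c + 273.15)

for temp_c in (550, 700):  # past commercial plants vs. the "generation three" target
    print(f"{temp_c} C receiver: ideal efficiency ~{carnot_limit(temp_c):.0%}")
# Roughly 64% at 550 C and 69% at 700 C. These are upper bounds, not real-world output.
```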
Past projects have used oil and molten salt to absorb the sun’s heat and store it. But at blistering temperatures these materials decompose or become corrosive. In 2021, the Department of Energy decided particles were the most promising route to reach the super-hot temperatures required for efficient CSP. The team building the project considered numerous types of particles, including red and white sand from Riyadh in Saudi Arabia and a titanium-based mineral called ilmenite. They settled on a manufactured particle from a Texas-based company, Carbo Ceramics. To build the project, they need 120,000 kilograms of the stuff.
Engineers at Sandia are now working on the project’s other components. At the receiver, particles will fall like a curtain through a beam of concentrated sunlight. After they’re blasted with heat, gravity will carry them down the 175-foot tower, slowed by obstacles that create a chute similar to a children’s marble run. They’ll offload thermal energy to “supercritical carbon dioxide” — CO2 in a fluid state — which could then power a turbine. For industrial applications, the system would be designed to let the particles exchange heat with air or steam to heat a furnace or kiln. To store energy, the particles can be stowed in insulated steel bins within the tower until their heat is needed hours later.
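For a sense of scale, a back-of-the-envelope estimate of the pilot's storage capacity follows. The particle specific heat and the charge/discharge temperature swing are assumptions chosen for illustration, not figures from Sandia or the DOE.

```python
# Rough, illustrative estimate of how much heat 120,000 kilograms of particles can hold.
# The specific heat (~1.2 kJ/kg-K) and a ~400 K swing between the charged and
# discharged states are assumed values, not project specifications.

mass_kg = 120_000           # particle inventory cited for the pilot
cp_j_per_kg_k = 1_200       # assumed specific heat of a ceramic particle
delta_t_k = 400             # assumed temperature swing while charging/discharging

stored_heat_mwh_thermal = mass_kg * cp_j_per_kg_k * delta_t_k / 3.6e9  # 1 MWh = 3.6e9 J
print(f"~{stored_heat_mwh_thermal:.0f} MWh of thermal storage")

# If ~40% of that heat were converted to electricity (also an assumption),
# a 1-megawatt pilot could run for several hours on stored heat alone.
print(f"~{stored_heat_mwh_thermal * 0.4:.1f} MWh electric equivalent")
```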
The team expects construction to wrap up next year, with results for this phase of the project ready at the end of 2025. The project needs to show that it can reach super-high temperatures, produce electricity using the supercritical CO2, and store heat for hours, allowing the energy to be used when the sun isn’t shining.
By the Department of Energy’s technology pilot standards, the 1-megawatt project is big, but it's much smaller than most solar projects built to supply power to electric utilities and tiny compared to past CSP projects.
This could help tackle another of CSP's challenges: Projects have been uneconomic unless they’re huge. They require big plots of land and lots of money to get started. One of the most well-known CSP projects in the U.S., the 110-megawatt Crescent Dunes, cost $1 billion and covers more than 1,600 acres in Nevada. “Nothing short of a home run is deployable — I can’t just put a solar tower on my rooftop,” said Sment.
Projects that use solar panels can be as small as the footprint of a home. Overall, they’re much easier to finance and build. That’s led to more projects, which creates efficiencies and lowers costs. The DOE hopes its tests will show promise for smaller, easier-to-deploy CSP projects.
“That’s been one of the challenges, in my opinion, that’s faced CSP historically. The projects tended to be very large, one of a kind,” said Steve Schell, chief scientist at Heliogen, a Bill Gates-backed CSP startup that’s working on a different pilot with the Department of Energy.
Heliogen went public at the end of 2021 with a valuation of $2 billion. To overcome hesitancy about the price tags usually associated with CSP, the company is targeting modular projects focused on producing green hydrogen and industrial heat, aiming to replace the fossil fuels that usually power processes like cement-making.
For companies, the CSP business has historically been tough. Some U.S. CSP startups have gone out of business, or shifted their sights to projects abroad. Despite its splashy IPO, Heliogen’s shares are worth less than 25 cents today, down from over $15 at the end of 2021. In its most recent quarterly financial report, the company downgraded its expected 2022 revenue by $8 million to $11 million as it works to finalize deals with customers.
Bauer at the DOE thinks the government can make technologies like CSP less risky by investing in research that takes a longer view than the one afforded by markets. And as the grid needs more large-scale storage, the value of CSP may change.
Even if CSP never becomes a significant source of generation on the grid, supporters like Shannon Yee, an associate professor of mechanical engineering at the Georgia Institute of Technology who has worked with DOE on solar technologies for years, say it could still find other potential applications in manufacturing, water treatment, or sanitation.
“We always seem to be so focused on generating electricity that we don't look at these other needs where concentrated solar may actually provide greater benefit,” said Yee. “Everything really needs sources of energy and heat. How do we do that better?”
CarbonPlan has a new tool to measure climate risk that comes with full transparency.
On a warming planet, knowing whether the home you’re about to invest your life savings in is at risk of being wiped out by a wildfire or drowned in a flood becomes paramount. And yet public data is almost nonexistent. While private companies offer property-level climate risk assessments — usually for a fee — it’s hard to know which to trust or how they should be used. Companies feed different datasets into their models and make different assumptions, and often don’t share all the details. The models have been shown to predict disparate outcomes for the same locations.
For a measure of the gap between where climate risk models are and where consumers want them to be, look no further than Zillow. The real estate website added a “climate risk” section to its property listings in 2024 in response to customer demand — only to axe the feature a year later at the behest of an industry group that questioned the accuracy of its risk ratings.
Now, however, a new tool that assesses wildfire risk for every building in the United States aims to advance the field through total transparency. The nonprofit research group CarbonPlan launched the free, user-friendly app called Open Climate Risk on Tuesday. It allows anyone to enter an address and view a wildfire risk score, on a scale of zero to 10, along with an explanation of how it was calculated. The underlying methodology, data, and code are all public. It’s the first fully open platform of its kind, according to CarbonPlan.
“Right now, the way science works in the climate risk space is that every model is independently developed at different companies, and we essentially have no idea what’s happening in them. We have no idea if they’re any good,” Oriana Chegwidden, a research scientist at CarbonPlan who led the creation of the tool, told me. “Our hope is that by opening this up, people will be able to start contributing, to help us learn how we can do it better.” That might mean critiquing CarbonPlan’s methods or code, for example, or re-running the model with additional data.
The score itself doesn’t tell you much other than the relative risk between one building and another. But the platform also breaks out the two inputs behind it: burn probability, or the likelihood a building will catch fire in a given year, and “conditional risk,” an estimate of how much of the building’s value would be lost if it does burn, based on projected fire intensity.
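Multiplying those two inputs by a building's value gives a rough expected annual loss, which is the basic logic behind pairing them. In the sketch below, all three numbers are hypothetical, and the platform's own zero-to-10 score is not computed this way; the example only shows how the two published inputs can combine.

```python
# Minimal sketch of how burn probability and conditional risk relate to expected loss.
# All three inputs are hypothetical, chosen only to make the arithmetic concrete.

burn_probability = 0.004       # chance the building burns in a given year
conditional_risk = 0.60        # share of the building's value lost if it does burn
building_value_usd = 450_000   # hypothetical replacement value

expected_annual_loss = burn_probability * conditional_risk * building_value_usd
print(f"Expected annual wildfire loss: ~${expected_annual_loss:,.0f}")
# ~$1,080 per year for this illustrative combination of inputs.
```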
The projections are largely based on a U.S. Forest Service dataset that models fire frequency on wildlands throughout the country. CarbonPlan uses additional data on wind speed and direction to predict how a given fire might spread into an urban area.
Users can toggle between risk under the “current” climate and a “future” climate, which jumps about 20 years out. They can also see the distribution of buildings across the spectrum of risk scores at various geographic scales — by state, county, census tract, or census block.
One of CarbonPlan’s hopes is to help people become more informed consumers of climate risk data by helping them understand how it’s put together and what questions they might want to ask. While its model is cruder than others on the market, the tool is explicit about the factors that are not accounted for in the results. The loss estimates are based on a generic building, for example, and do not recognize specific traits, like fire-resistant construction materials or landscaping, that could make a particular home less likely to burn. They also don’t consider building-to-building spread. The underlying U.S. Forest Service data is also limited in that it maps vegetation across the country as it existed at the end of 2020 — any changes since then that could have reduced available fuels, such as prescribed burns, are not incorporated.
Right now, there’s no industry standard for calculating or communicating climate risk. The Global Association of Risk Professionals recently asked 13 climate risk companies for data on floods, tropical storms, wildfires, and heat at 100 addresses to compare the outputs. The authors found “significant disparities” between estimates of vulnerability and damages at the same locations. When it came to wildfires specifically, they were unable to even compare the data, because the companies all conveyed the risk using different benchmarks.
The implications of having so many diverging methods and results extend beyond individual homebuying decisions. Insurance companies use climate risk data to set rates; publicly traded companies use it to make disclosures to investors; policymakers use it to guide community planning and investments in adaptation. Some products might be better suited to one task or another.
Katherine Mach, an environmental science and policy professor at the University of Miami, told me the next step for the field is to have more systematic reporting requirements that help people understand how accurate the data are and what types of decisions they can be used for.
“It’s almost like we need the equivalent of industry standards,” she said. “You’re going to release a climate product? Here’s what you need to clearly communicate.”
CarbonPlan collected feedback from various likely users of the tool throughout the development process, including municipal planners, climate scientists, and consumer advocates. The group also hopes to foster an “iterative cycle of community-driven model development,” spurring other researchers to inspect the data, critique it, add to it, and spin out new versions. This is common practice in other areas of climate science, like Earth system modeling and economic modeling, and has been instrumental in advancing those fields. “There’s nothing like that for climate risk right now,” Chegwidden said.
The first step will be raising more money to support further work, but the goal is to partner with outside researchers on comparative analyses and case studies. Tracy Aquino Anderson, CarbonPlan’s interim executive director, told me they have already heard from one researcher who has a fire risk dataset that could be added to the platform. The group has also been invited to present the platform to two academic climate research groups later this spring.
The problem of black-box models isn’t limited to private companies that don’t want to share their code. A study published earlier this month found that only 4% of the most-cited peer-reviewed climate risk studies have made their data and code public, despite journal standards that require transparency.
“When you’re working with climate data, you’re dealing with all of these uncertainties,” Adam Pollack, an assistant professor at the University of Iowa who researches flood risk and the lead author of the paper, told me. “Researchers don’t always understand all of the assumptions that are implicit in choices that they make. That’s fine — we have methods for dealing with that. We do model intercomparisons, we do these synthesis studies as a field. The foundation of that is openness and reusability.”
Though he was not involved in the CarbonPlan project, he said it was exactly what his paper was calling for. For example, CarbonPlan’s “future” calculations are based on an extreme warming scenario that has become controversial among climate scientists. CarbonPlan didn’t choose this scenario — it’s what the Forest Service’s dataset used, and that was the only off-the-shelf data available for the entire United States. But because the underlying code is open-source, critics are free to swap it out for other data they may have access to.
“That’s what’s so great about this,” Pollack said. “People who have different values, assumptions, and expertise, can get new estimates and build a shared understanding.”
On BYD’s lawsuit, Fervo’s hottest well, and China’s geologic hydrogen
Current conditions: A midweek clipper storm is poised to bring as much as six more inches of snow to parts of the Great Lakes and Northeast • American Samoa is halfway through three days of fierce thunderstorms and temperatures above 80 degrees Fahrenheit • Northern Portugal is bracing for up to four more inches of rain after three deadly storms in just two weeks.

The Environmental Protection Agency is preparing this week to repeal the Obama-era scientific finding that provides the legal basis for virtually all federal regulations of planet-heating emissions, marking what The Wall Street Journal called “the most far-reaching rollback of U.S. climate policy to date.” The 2009 “endangerment finding” concluded that greenhouse gases pose a threat to public health and welfare, calling for cuts to emissions from power plants and vehicle tailpipes. EPA Administrator Lee Zeldin told the newspaper the move “amounts to the largest act of deregulation in the history of the United States.” In an interview with my colleague Emily Pontecorvo last year, Harvard Law School’s Jody Freeman said rescinding the endangerment finding would do “more serious and more long term damage” and “could knock out a future administration from trying to” bring back climate policy. But that, Freeman said, would depend on the Supreme Court backing the administration. “I don’t think that’s likely, but it’s possible,” she said.
At issue is the 2007 case Massachusetts v. EPA, which determined that greenhouse gases qualified as pollutants under the Clean Air Act. As Emily wrote last week, “the agency claims that its previous read of Massachusetts v. EPA was wrong, especially in light of subsequent Supreme Court decisions, such as West Virginia v. EPA and Loper Bright v. Raimondo. The former limited the EPA's toolbox for regulating power plants, and the latter ended a requirement that courts defer to agency expertise in cases where the law is vague.” An earlier report in The Washington Post questioned whether the agency would proceed with the repeal at all, amid fears that these arguments would not pass muster in the nation’s highest court.
BYD has sued the United States government over the 100% tariff on Chinese electric vehicles that serves as an effective ban on Beijing’s booming auto exports. Four U.S.-based subsidiaries of the world’s largest manufacturer of electric vehicles filed a lawsuit in the U.S. Court of International Trade challenging the legality of the Trump administration’s trade levies. The litigation marks what the state-backed tabloid Global Times called “the first instance of a Chinese automaker directly and actively challenging U.S. tariffs, setting a precedent and carrying significance for Chinese enterprises to protect their legitimate rights and interests through legal means.”
Outside the U.S., BYD is booming. China’s cheap electric cars are popular all over the world, as Heatmap’s Shift Key podcast covered in December. Canadian Prime Minister Mark Carney’s deal to increase trade with China will bring the battery-powered vehicles to North American roads. And the Chinese edition of the trade publication Automotive News just reported that BYD is planning a factory expansion in Europe and Canada.
Hot off last month’s news that it plans to go public, Fervo Energy has drilled its highest-temperature well yet. The drilling results confirm that the next-generation geothermal startup tapped into a resource with temperatures above 555 degrees Fahrenheit at a depth of approximately 11,200 feet. The company announced the findings Monday, based on an independent assessment of appraisal data from the drilling. The analysis found that the Project Blanford site in Millard County, Utah, has multiple gigawatts of heat that can be harnessed. Its completion will be a breakthrough for enhanced geothermal systems, one of the two leading approaches in the next-generation geothermal sector that Heatmap’s Matthew Zeitlin outlined here. “This latest ultra-high temperature discovery highlights our team’s ability to detect and develop EGS sweet spots using AI-enhanced geophysical techniques,” Jack Norbeck, Fervo’s co-founder and chief technology officer, said in a statement.
Chinese scientists have for the first time discovered natural hydrogen sealed in microscopic inclusions near Tibet. The finding, which the Xinhua news agency called “groundbreaking,” fills what the China Hydrogen Bulletin called “a major domestic research gap and points to a new geological pathway for identifying China’s next generation of clean energy resources.” Natural, or geological, hydrogen could provide a cheap source of the zero-carbon fuel and give oil and gas drillers a natural foothold in a new, clean industry. In the color spectrum associated with hydrogen, the rare, naturally formed stuff is called white hydrogen. But as Heatmap’s Katie Brigham wrote in December, a new color has joined the rainbow. Orange hydrogen refers to a family of technologies that naturally spur production of the gas, as the startup Vema is now attempting to do.
China’s coal-fired power generation decreased 1.9% last year, marking what the consultancy Wood Mackenzie called “a historic shift driven by new non-fossil generation that has finally outpaced demand growth.” Power demand surged 5% in China last year, but for the first time in a decade that wasn’t propelled by coal plants. Instead, that new demand was supplied by renewables, nuclear, and hydro, all of which Beijing has rapidly deployed. Over that time, the levelized cost of energy — a widely used though, as Matthew wrote last year, far-from-perfect metric — fell 77% for utility-scale solar and 73% for onshore wind. “At the heart of this transformation is the unprecedented expansion of renewable energy capacity,” Sharon Feng, a senior research analyst for Wood Mackenzie, said in a statement. “China’s wind and solar capacity had risen more than ten-fold to 1,842 gigawatts over the past decade.”
Gone are the days when the oil industry seemed to be on track for a lucrative decline. Demand for crude will take longer to peak than previously estimated as governments prioritize growth and energy security over efforts to curb consumption. That’s according to a report issued Sunday by Vitol Group, the world’s largest independent oil trader. “Over the past year, decarbonisation policies have become a less decisive driver of efforts to curb oil consumption and reduce carbon dioxide emissions,” the report stated, according to Bloomberg. “Policy priorities have increasingly been reframed around economic competitiveness and geopolitical strategy.”
The race for a long-duration energy storage solution has a new competitor. The Dutch startup Ore Energy has successfully deployed its iron-air storage technology on the grid in a technical pilot of a system that can store power for 100 hours. The pilot, the first of its kind in Europe, demonstrated that the company’s technology can store and discharge energy for up to four days. “This pilot allowed us to evaluate iron-air performance under European operating profiles and real-world grid conditions,” Aytaç Yilmaz, co-founder and CEO of Ore Energy, said in a statement.
Wildfires are moving east.
There were 77,850 wildfires in the United States in 2025, and nearly half of those — 49% — ignited east of the Mississippi River, according to statistics released last week by the National Interagency Fire Center. That might come as a surprise to some in the West, who tend to believe they hold the monopoly on conflagrations (along with earthquakes, tsunamis, and megalomaniac tech billionaires).
But lump the Central Plains and Midwest states of Minnesota, Iowa, Missouri, Arkansas, Oklahoma, and Texas in with everything to their east — the swath of the nation collectively designated as the Eastern and Southern Regions by the U.S. Forest Service — and wildfires in that area made up more than two-thirds of total ignitions last year.

Like fires in the West, wildfires in the eastern and southeastern U.S. are increasing. Over the past 40 years, the region has seen a 10-fold jump in the frequency of large burns. (Many risk factors contribute to wildfires, including but not limited to climate change.)
What’s exciting to wildfire researchers and managers, though, is the idea that they could catch changes to the Eastern fire regime early, before the situation spirals into a feedback loop or results in a major tragedy. “We have the opportunity to get ahead of the wildfire problem in the East and to learn some of the lessons that we see in the West,” Donovan said.
Now that effort has an organizing body: the Eastern Fire Network. Headed by Erica Smithwick, a professor in Penn State’s geography department, the research group formed late last year with the help of a $1.7 million, three-year grant from the Gordon and Betty Moore Foundation, a partner with the U.S. National Science Foundation, with the goal of creating an informed research agenda for studying fire in the East. “It was a very easy thing to have people buy into because the research questions are still wide open here,” Smithwick told me.
The Eastern U.S. is only now exiting a three-week block of sub-freezing temperatures, and the hot, dry days of summer are still far from most people’s minds. But the wildland-urban interface — that is, the high-fire-risk communities that abut tracts of undeveloped land — is more extensive in the East than in the West, with up to 72% of the land in some states qualifying as WUI. The region is also much more densely populated, meaning practically every wildfire that ignites has the potential to threaten human property and life.
It’s this density combined with the prevalent WUI that most significantly distinguishes Eastern fires from those in the comparatively rural West. One fire manager warned Smithwick that a worst-case-scenario wildfire could run across the entirety of New Jersey, the most densely populated state in the nation, in just 48 hours.
Generally speaking, though, wildfires in the East are much smaller than those in the West. The last megafire in the Forest Service’s Southern Region was as far west in its boundaries as you can get: the 2024 Smokehouse Creek fire in Texas and Oklahoma, which burned more than a million acres. The Eastern Region hasn’t had a megafire exceeding 100,000 acres in the modern era. For research purposes, a “large” wildfire in the East is typically defined as being 200 hectares or more in size, the equivalent of about 280 football fields; in the West, a “large” wildfire is twice that, 400 hectares or more.
But what the eastern half of the country lacks in total acres burned (for that statistic, Alaska edges out the Southern Region), it makes up for in the total number of reported ignitions. In 2025, for example, the state of Maine alone recorded 250 fires in August, more than doubling its previous record of just over 100 fires. “The East is highly fragmented,” Donovan, who is contributing to the Eastern Fire Network’s research, told me. “We have a lot of development here compared to the West, and so it’s much more challenging for fires to spread.”
Fires in the West tend to be long-duration events, burning for weeks or even months; fires in the East are often contained within 48 hours. In New Jersey, for example, “smaller, fragmented forests, which are broken up by numerous roads and the built environment, [allow] firefighters to move ahead of a wildfire to improve firebreaks and begin backfiring operations to help slow the forward progression,” a spokesperson for the New Jersey Forest Fire Service told me.
The parcelized nature of the eastern states is also reflected in who is responding to the fires. It is more common for state agencies and local departments — including many volunteer firefighting departments — to be the ones on the scene, Debbie Miley, the executive director of the National Wildfire Suppression Association, a trade group representing private wildland fire service contractors, told me by email. On the one hand, the local response makes sense; smaller fires require smaller teams to fight them. But the lack of a joint effort, even within a single state, means broader takeaways about mitigation and adaptation can be lost.
“Many eastern states have strong state forestry agencies and local departments that handle wildfire as part of an ‘all hazards’ portfolio,” Miley said. “In the West, there’s often a deeper bench of personnel and systems oriented around long-duration wildfire campaigns (though that varies by state).”
All of this feeds into why Smithwick believes the Eastern Fire Network is necessary: because of this “intermingling, at a very fine scale, of different jurisdictional boundaries,” conversations about fire management and the changing regimes in the region happen in parallel, rather than with meaningful coordination. Even within a single state, fire management might be divided between different agencies — such as the Game Commission and the Bureau of Forestry, which share fire management responsibilities in Pennsylvania. Fighting fires also often involves working with private landowners in the East; in the West, on the other hand, roughly two-thirds of wildfires burn on public land, which a single agency — e.g. the Bureau of Land Management, Forest Service, or Park Service — manages.
But “wildfire risk is going to be different than in the West, and maybe more variable,” Smithwick told me. Identifying the appropriate research questions about that risk is one of the most important objectives of the Eastern Fire Network.
Bad wildfires are the result of fuel and weather conditions aligning. “We generally know what the fuels are [in the East] and how well they burn,” Smithwick said. But weather conditions and their variability are a greater question mark.
Nationally, fire and emergency managers rely on indices to predict fire-weather risk based on humidity, temperature, and wind. But while those indices are dialed in for the Western states, they’re less well understood in the East. “We hope to look at case studies of recent fires that have occurred in the 2024 and 2025 window to look at the antecedent conditions and to use those as case studies for better understanding the mechanisms that led to that wildfire,” Smithwick said.
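As a toy illustration of what such an index does, the sketch below combines humidity, temperature, and wind into a single number. The weights and scaling are invented for illustration; operational indices, such as the Fosberg Fire Weather Index, use calibrated formulas that differ from this.

```python
# Toy fire-weather score, for illustration only. The weights and scaling are
# made up; real indices are calibrated against observed fire behavior.

def toy_fire_weather_score(temp_c: float, rel_humidity_pct: float,
                           wind_kph: float) -> float:
    dryness = max(0.0, 1.0 - rel_humidity_pct / 100.0)  # drier air, higher risk
    heat = min(1.0, max(0.0, temp_c / 40.0))             # hotter, higher risk
    wind = min(1.0, wind_kph / 60.0)                     # windier, faster spread
    return round(10.0 * dryness * (0.5 * heat + 0.5 * wind), 1)

# A humid Northeastern afternoon vs. a dry, windy day in the same place:
print(toy_fire_weather_score(temp_c=28, rel_humidity_pct=70, wind_kph=10))   # ~1.3
print(toy_fire_weather_score(temp_c=33, rel_humidity_pct=20, wind_kph=45))   # ~6.3
```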
Learning more about the climatological mechanisms driving dry spells in the region is another explicit goal. Knowing how dry spells evolve, and where, will help researchers and eventually policymakers to identify mitigation strategies for locations most at risk. Smithwick also expects to learn that some areas might not be at high risk: “We can tell you that this is not something your community needs to invest in right now,” she told me.
Different management practices, jurisdictions, terrains, and fuel types mean solutions in the East will look different from those in the West, too. As Donovan’s research has found, the unmanaged regrowth of forests in the Northeast in particular, after centuries of deforestation, has led to an increase in trees and shrubs that are prone to wildfires. Due to the smaller forest tracts in the area, mechanical thinning is a more realistic solution in eastern forests than on large, sprawling, remote western lands.
Prescribed burns tend to be more common and more readily accepted practices in the East, too. Florida leads the nation in preventative fires, and the New Jersey Forest Fire Service aims to treat 25,000 acres of forest, grasslands, and marshlands with prescribed fire annually.
The winter storms that swept across the Eastern and Southern Regions of the United States last month have the potential to queue up a bad fire season once the land starts to thaw and eventually dry out. Though the picture in the Eastern Region is still coming into focus depending on what happens this spring, in the Southern Region the storms have created “potential compaction of the abundant grasses across the Plains, in addition to ice damage in pine-dominant areas farther east,” the National Interagency Fire Center wrote in last Monday’s update to its nationwide fire outlook. (The nearly million-acre Pinelands of New Jersey are similarly a fire-adapted ecosystem and are “comparable in volatility to the chaparral shrublands found in California and southern Oregon,” the spokesperson told me.)
The compaction of the grasses is significant because, although compacted grasses will take longer to dry out and become a fuel source, they will ultimately cover the Southern Region with a dense, flammable fuel when summer is in full swing. Beyond the Plains, in the Southeast’s pine forests, the winter-damaged trees could cast off “abundant” pine needles and “other fine debris” that could dry out and become flammable as soon as a few weeks from now. “Increased debris burning will also amplify ignitions and potential escapes, enhancing significant fire potential during warmer and drier weather that will return in short order,” NIFC goes on to warn.
Though the historically wet Northeast and humid Southeast seem like unlikely places to worry about large wildfires, as conditions change, nothing is certain. “If we learned anything from fire science over the past few decades, it’s that anywhere can burn under the right conditions,” Smithwick said. “We are burning in the tundra; we are burning in Canada; we are burning in all of these places that may not have been used to extreme wildfire situations.”
“These fires could have a large economic and social cost,” Smithwick added, “and we have not prepared for them.”