A new report from the Clean Air Task Force casts shade on “levelized cost of energy.”

Forgive me, for I have cited the levelized cost of energy.
That’s what I was thinking as I spoke with Kasparas Spokas, one of the co-authors of a new paper from the Clean Air Task Force that examines this popular and widely cited cost metric — and found it wanting.
Levelized cost of energy, or LCOE, is a simple calculation: You take a generator, like a solar panel, add up its capital and operating expenditures over the life of the project (discounting future costs), and then divide by its expected energy output over that same lifetime (also discounted).
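In rough code, the calculation looks something like the sketch below — a minimal version of the standard discounted-cash-flow definition, where every specific number is an illustrative assumption rather than a figure from the CATF report or Lazard:

```python
# Minimal LCOE sketch: discounted lifetime costs divided by
# discounted lifetime output. All numbers below are illustrative
# assumptions, not figures from the CATF report or Lazard.

def lcoe(capex, annual_opex, annual_mwh, years, discount_rate):
    """Return the levelized cost of energy in $/MWh."""
    # Upfront capital plus the discounted stream of operating costs.
    costs = capex + sum(annual_opex / (1 + discount_rate) ** t
                        for t in range(1, years + 1))
    # The same discounting applied to the energy the plant produces.
    output = sum(annual_mwh / (1 + discount_rate) ** t
                 for t in range(1, years + 1))
    return costs / output

# Hypothetical 100-megawatt solar farm: $100 million upfront,
# $1.5 million a year in upkeep, ~25% capacity factor,
# 30-year life, 7% discount rate.
print(f"${lcoe(100e6, 1.5e6, 100 * 8760 * 0.25, 30, 0.07):.0f}/MWh")
# -> roughly $44/MWh
```

Notice what the formula leaves out: nothing in it says when those megawatt-hours arrive, or what else the grid needs in order to absorb them — which is precisely the complaint.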
LCOE has helped underline the economic and popular case for renewables, especially solar. And it’s cited everywhere. The investment bank Lazard produces an influential annual report comparing the LCOE of different generation sources; the latest iteration puts utility-scale solar as low as $29 per megawatt-hour, while nuclear can be as high as $222. Environmental groups cite LCOE in submissions to utilities regulators. Wall Street analysts use it to project costs. And journalists, including me, will cite it to compare the cost of, say, solar panels to natural gas.
We probably shouldn’t, according to Spokas — or at least we should be clearer about what LCOE actually means.
“We continue to see levelized cost of electricity being used in ways that we think are not ideal or not adequate to what its capabilities are,” Spokas told me.
The report argues that LCOE “is not an appropriate tool to use in the context of long-term planning and policymaking for deep decarbonization” because it doesn’t take into account factors that real-world grids and grid planners also have to consider, such as when the generator is available, whether the generator has inertia, and what supporting infrastructure (including transmission and distribution lines) a generator needs to supply power to customers.
We see these limitations and constraints on real-life grids all the time — take the infamous solar “duck curve.” During the middle of the day, when the sun is highest, non-solar generation can become essentially unnecessary on a solar-heavy grid. But that same grid can run into problems as the sun goes down while electricity demand persists. Additional solar on such a grid may be low cost, but it is also low value — it gives you electricity when you need it least.
“If you’re building a lot of solar in the Southwest, at some point you’ll get to the point where you have enough solar during the day that if you build an incremental amount of solar, it’s not going to be valuable,” Spokas said. To make additional panels useful, you’d have to add battery storage, increasing the electricity’s real-world cost.
Looking for new spots for renewables also amps up conflict over land use and provides more opportunities for political opposition, a cost that LCOE can’t capture. And a renewables-heavy grid can require investments in energy transmission capacity that other kinds of generation do not — you can put a gas-fired power plant wherever you can buy land and get permission, whereas utility-scale solar or wind has to be where it’s sunny or windy.
“The trend is, the more renewable penetration you have, the more costly meeting a firm demand with renewables and storage becomes,” Spokas said.
Those real-world pressures are now far more salient to grid planners than they were earlier this century, when LCOE became a popular metric to compare different types of generators.
“The rise of LCOE’s popularity to evaluate technology competitiveness also coincided with a period of stagnant load growth in the United States and Europe,” the report says. When there was sufficient generation capacity that could be ramped up and down as needed, “the need to consider various system needs and costs, such as additional transmission or firm capacity needs, was relatively low.”
This is not the world we’re in today.
Demand for electricity is rising again, and the question for grid planners and policymakers now is less how to replace fossil generators going offline, and more how to meet new electricity demand in a way that can also meet society’s varied goals for cost and sustainability.
This doesn’t always have to mean maxing out new generation — it can also mean making large sources of electricity load more flexible — but it does mean making more difficult, more considered choices that take the grid as a whole into account.
When I asked Spokas whether grid operators and grid planners needed to read this report, he chuckled and said no, they already know what’s in it. Electricity markets, as imperfect as they often are, recognize that not every megawatt is the same.
Electricity suppliers often get paid more for providing power when it’s most needed. In regions with what are known as capacity markets, generators get paid in advance to guarantee they’ll be available when the grid needs them, a structure that ensures big payouts to coal, gas, and nuclear generators. In markets that don’t have that kind of advance planning, like Texas’ ERCOT, dispatchable generators can get paid for providing so-called “ancillary services” — meeting short-term power needs to keep the grid in balance, a service that batteries are often ideally placed to provide.
When grid planners look at the entirety of a system, they tend — to the chagrin of many renewables advocates — to be less enthusiastic about renewables as a decarbonization tool than environmental groups and lawmakers are.
The CATF report points to Ontario, Canada, where the independent system operator concluded that building a new 300-megawatt small modular nuclear reactor — practically the definition of high-LCOE generation, not least because nothing like it has ever been deployed in North America — would actually be less risky for electricity costs than building more battery-supported wind and solar, according to The Globe and Mail. Ontario regulators recently granted a construction license to the SMR project, part of a larger scheme to install four small reactors for a total of 1.2 gigawatts of capacity. Providing the equivalent supply from renewables would require adding between 5.6 and 8.9 gigawatts of wind and solar capacity, plus new transmission infrastructure, the system operator said, which could drive prices higher than those for advanced nuclear.
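The gap between 1.2 gigawatts of nuclear and 5.6 to 8.9 gigawatts of renewables is mostly a capacity-factor story, and some rough back-of-the-envelope arithmetic shows why. The capacity factors below are generic illustrative assumptions, not the Ontario system operator’s actual modeling, which also has to account for storage and transmission:

```python
# Back-of-the-envelope: how many gigawatts of wind or solar match the
# annual energy of 1.2 GW of nuclear? The capacity factors here are
# illustrative assumptions, not the Ontario system operator's figures.
NUCLEAR_GW, NUCLEAR_CF = 1.2, 0.90

annual_gwh = NUCLEAR_GW * NUCLEAR_CF * 8760  # ~9,460 GWh per year

for source, capacity_factor in [("solar", 0.16), ("wind", 0.35)]:
    gw_needed = annual_gwh / (capacity_factor * 8760)
    print(f"{source}: ~{gw_needed:.1f} GW for the same annual energy")
# solar: ~6.8 GW, wind: ~3.1 GW
```

Even this simple energy math lands in the neighborhood of the system operator’s range — and matching annual energy still isn’t matching firmness, which is where the extra storage and transmission costs come in.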
None of this is to say that we should abandon LCOE entirely. The best use case, the report argues, is for comparing costs for the same technology over time, not comparing different technologies in the present or future. And here the familiar case for solar — that its cost has fallen dramatically over time — is borne out.
Broadly speaking, CATF calls for “decarbonization policy, industry strategy, and public debate” to take a more “holistic approach” to estimating costs for new sources of electricity generation. Policymakers “should rely on jurisdiction-specific system-level analysis where possible. Such analysis would consider all the system costs required to ensure a reliable and resilient power system and would capture infrastructure cost tradeoffs over long and uncertain time horizons,” the report says.
As Spokas told me, none of this is new. So why the focus now?
CATF is catching a wave. Many policymakers, grid planners, and electricity buyers have already learned to appreciate all kinds of megawatts, not just the marginally cheapest one. Large technology companies are signing expensive power purchase agreements to keep nuclear power plants open or even revive them, diving into the development of new nuclear power and buying next-generation geothermal in the hope of spurring further commercialization.
Google and Microsoft have embraced a form of emissions accounting that practically begs for clean firm resources, as they try to match every hour of electricity they use with a non-emitting resource.
And it’s possible that clean firm resources could get better treatment than they currently get in the reconciliation bill working its way through Congress. Secretary of Energy Chris Wright recently called for tax credits for “baseload” power sources like geothermal and nuclear to persist through 2031, according to Foundation for American Innovation infrastructure director Thomas Hochman.
“It’s not our intention to try to somehow remove incentives for renewables specifically, but to the extent that we can preserve what we can, we’re happy if it would be used in that way,” Spokas said.
When I asked Spokas who most needed to read this report, he replied frankly, “I think climate advocates would be in that bucket. I think policymakers that have a less technical background would also be in that bucket, and media that have a less technical background would also be in there.”
I’ll keep that in mind.
In some ways, fossil fuels make snowstorms like the one currently bearing down on the U.S. even more dangerous.
The relationship between fossil fuels and severe weather is often presented as cause and effect: Burning coal, oil, and gas for heat and energy combines the carbon in those fuels with oxygen in the air to form carbon dioxide, which in turn traps heat in the atmosphere and gradually warms our planet. That imbalance, in many cases, makes the weather more extreme.
But this relationship also goes the other way: We use fossil fuels to keep ourselves comfortable — and in some cases, alive — during extreme weather events. Our dependence on oil and gas creates a grim ouroboros: As those events get more extreme, we need more fuel.
This weekend, some 200 million Americans will be cranking up the thermostats in their natural-gas-heated homes, firing up their propane generators, or hitting icy roads in their combustion-engine cars as a major winter storm brings record-low temperatures to 35 states, knocks out power, and grinds air travel to a halt.
Climate change deniers love to use major winter storms as “proof” that global warming isn’t real. But in the case of this weekend’s polar vortex, there is evidence that Arctic warming is responsible for the record cold temperature projections across the United States.
“In the Arctic, in the winter, the ocean is much, much warmer than the atmosphere,” Judah Cohen, a climatologist at MIT and the author of a 2021 paper linking Arctic variability to extreme weather in the U.S., told me. Sea ice acts as an insulating layer separating the warmer ocean water from the frigid air. But as it melts — as it is doing every month of the year — “all of this heat can now be extracted out of the ocean.” The reduced temperature difference between the ocean and atmosphere creates wavy high-pressure ridges and low-pressure troughs that are favorable to the formation of polar vortices, which can funnel extreme cold air down over North America, as they seemingly did over Texas in 2021’s Winter Storm Uri, when 246 people died.
The exact mechanisms and interactions of this phenomenon are still up for debate. “I am in the minority that argues that there is a causal link between a warm Arctic and cold continents,” Cohen added to me via email. “Most others argue that it is a coincidental relationship.” Still, scientists generally agree that extreme cold events will persist in a warming world; they’ll just become rarer.
Cold kills more people in the United States than heat, but curiously, warmer winters aren’t likely to significantly reduce these seasonal deaths. That’s because about half of the cases of excess mortality in winter are from cardiovascular diseases, which are, by nature, “highly seasonal,” Kristie Ebi, a professor of global health at the University of Washington, told me. “Since people began studying these, there are more of them in the winter than there are in the summer.” Researchers still aren’t sure why that is — though since the 1940s, we’ve known that people’s blood pressure, cholesterol, and even blood viscosity go up during the colder and darker months, perhaps due to changes in diet or exercise. That also appears to be the case regardless of climate or temperature, holding true whether you’re in Yellowknife or Miami.
In other words, “if seasonal factors other than temperature are mainly responsible for winter excess mortality, then climate warming might have little benefit,” Patrick Kinney, the director of Columbia University’s Climate and Health Program, wrote in Environmental Research Letters back in 2015. Extreme heat-related deaths, by contrast, have no ceiling, meaning global warming will result in more temperature-related deaths than it will prevent.
Our anthropogenically warmer winters could even prove to be more deadly in certain ways. Dana Tobin, a researcher at the Cooperative Institute for Research in Environmental Sciences at the University of Colorado Boulder, studies how weather affects traffic accidents. She’s found that driving in freezing rain is more dangerous than driving in snow “because of the ice glaze that it can produce on surfaces, especially those that are untreated,” she told me. As winters become warmer, there will, counterintuitively, be more ice on roads in many places: Freezing rain requires a layer of warm air aloft, and the rain it produces freezes into black ice when it hits the still-cold ground.
Researchers working in Scandinavia have similarly found that as the atmosphere warms and more days hover around freezing, “there is a higher risk of icy conditions … which may lead to a predisposition to falls and road traffic accidents.” (As I’ve previously reported, milder winters might also make us even more depressed than very cold ones.)
There is something slightly karmic about the fact that cars become increasingly unsafe as the planet, warmed by their emissions, becomes more hazardous. But this connection gets even bleaker when carbon monoxide poisoning is factored in.
On Thursday, the North American Electric Reliability Corporation issued a statement warning that “much of North America is at an elevated risk of having insufficient energy supplies to meet demand in extreme operating conditions,” and recommended measures including “advancing winter weatherization of power plants and fuel acquisition to enable operations during cold temperatures.” Heavy ice can also snap tree branches onto power lines, causing local outages.
When the power goes out or the gas lines freeze, desperate people will do anything to stay warm. That includes, in tragic cases, running improperly vented generators or plugging in propane heaters indoors, which can produce odorless and colorless CO — instead of the usual water and carbon dioxide — when fossil fuels don’t burn correctly. Accidental carbon monoxide poisoning is on the rise in the United States due to the proliferation of such appliances amid increasingly frequent extreme weather events, jumping 86% between 2012 and 2022. That’s even as, worldwide, carbon monoxide poisoning is decreasing.
Snow and ice are among the most dangerous weather conditions in the U.S., and people should take warnings of “life-threatening conditions” at face value. Tobin, the traffic researcher, stressed that one of the best protections from winter weather hazards is simply knowledge. “I believe the best thing that we can do when it comes to messaging to protect drivers from hazards is to empower motorists to make educated and informed decisions for their own safety and the safety of others,” she told me.
Winter storms highlight the entangled nature of our dependence on fossil fuels. We can’t separate extreme weather events from the energy required to survive them. But the dark irony is that, as the planet becomes more volatile, the most dangerous fossil fuels might be the ones meant to keep us warm and get us back home.
The cloak-and-dagger approach is turning the business into a bogeyman.
It’s time to call it like it is: Many data center developers seem to be moving too fast to build trust in the communities where they’re siting projects.
One of the chief complaints raised by data center opponents across the country is that companies aren’t transparent about their plans — an original sin that can make winning debates over energy or water use near-impossible. In too many cases, towns and cities neighboring a proposed data center won’t know who will wind up using the project, either because a tech giant is behind it and keeping its plans secret, or because a real estate firm refuses to disclose which company the project will be sold to.
Making matters worse, developers large and small are requiring city and county officials to stay tight-lipped through non-disclosure agreements. It’s safe to say these secrecy contracts betray the basic expectation of public transparency Americans have of their elected representatives, and they become a core problem, letting activists critical of the data center boom fill in the gaps for the public. I mean, why trust facts and figures about energy and water if the corporations won’t be up front about their plans?
“When a developer comes in and there’s going to be a project that has a huge impact on a community and the environment – a place they call home – and you’re not getting any kind of answers, you can tell they’re not being transparent with you,” Ginny Marcille-Kerslake, an organizer for Food and Water Watch in Pennsylvania, told me in an interview this week. “There’s an automatic lack of trust there. And then that extends to their own government.”
Let’s break down an example Marcille-Kerslake pointed me to: Talen Energy is seeking to rezone hundreds of acres of agricultural land in Montour County, Pennsylvania, for industrial facilities. Montour County is already a high-risk area for any kind of energy or data center development, ranking in the 86th percentile nationally for withdrawn renewable energy projects (more than 10 solar facilities have been canceled here for various reasons). So it didn’t help when residents began questioning whether the land was destined for Amazon Web Services, as with other nearby Talen-powered data center projects.
Officials wouldn’t – or couldn’t – say if the project was for Amazon, in part because one of the county commissioners signed a non-disclosure agreement binding them to silence. Subsequently, a Facebook video from an activist fighting the rezoning went viral, using emails he claimed were obtained through public records requests to declare Amazon “is likely behind the scenes” of the zoning request.
Amazon did not respond to my requests for comment. But this is a very familiar pattern by now. Heatmap Pro data shows that a lack of transparency consistently ranks among the top five concerns people raise when they oppose data center projects, regardless of whether those projects are approved or canceled. Heatmap researcher Charlie Clynes explained to me that the issue routinely crops up in the myriad projects he’s tracked, down to the first data center ever logged in the platform — a $100 million proposal by a startup in Hood River County, Oregon, that was pulled after a community uproar.
“At a high level, I have seen a lack of transparency become more of an issue. It makes people angry in a very unique way that other issues don’t. Not only will they think a project is going to be bad for a community, but you’re not even telling them, the key stakeholder, what is going on,” Clynes said. “It’s not a matter of, are data centers good or bad necessarily, but whether people feel like they’re being heard and considered. And transparency issues make that much more difficult.”
My interview with Marcille-Kerslake exemplified this situation. Her organization is opposed to the current rapid pace of data center build-out and is supporting opposition in various localities. When we spoke, her arguments felt archetypal and representative of how easily those who fight projects can turn secrecy into a cudgel. After addressing the trust issues with me, she immediately pivoted to saying that those exist because “at the root of it, this lack of transparency to the community” comes from “the fact that what they have planned, people don’t want.”
“The answer isn’t for these developers to come in and be fully transparent in what they want to do, which is what you’d see with other kinds of developments in your community. That doesn’t help them because what they’re building is not wanted.”
I’m not entirely convinced by her point that the only reason data center developers are staying quiet is the likelihood of community opposition. In fairness, the tech sector has long operated with a “move fast, break things” approach, and Silicon Valley companies have long worked in secrecy to guard trade secrets in a competitive marketplace. I also know from my previous reporting that before AI, data center developers were simply focused on building projects with easy access to cheap energy.
However, in fairness to opponents, I’m also not convinced the industry is adequately addressing its trust deficit with the public. Last week, I asked Dan Diorio, the Data Center Coalition’s vice president of state policy, whether there was a set of “best practices” his trade organization points to for community relations and transparency. His answer? People are certainly trying their best as they move quickly to build out infrastructure for AI — but no, there is no such standard.
“Each developer is different. Each company is different. There’s different sizes, different structures,” he said. “There’s common themes of open and public meetings, sharing information about water use in particular, helping put it in the proper context as well.”
He added: “I wouldn’t categorize that as industry best practice, [but] I think you’re seeing common themes emerge in developments around the country.”
Plus more of the week’s biggest renewable energy fights.
Cole County, Missouri – The Show Me State may be on the precipice of enacting the first state-wide solar moratorium.
Clark County, Ohio – This county has now voted to oppose Invenergy’s Sloopy Solar facility, passing a resolution of disapproval that usually has at least some influence over state regulator decision-making.
Millard County, Utah – Here we have a case of folks upset about solar projects specifically tied to large data centers.
Orange County, California – Compass Energy’s large battery project in San Juan Capistrano has finally died after a yearslong bout with local opposition.
Hillsdale County, Michigan – Here’s a new one: Two county commissioners here are stepping back from any decision on a solar project because they have signed agreements with the developer.