We didn’t know it was coming. We didn’t know where it came from. We still can’t nail its climate change connection.

What happened?
No, seriously — what happened?
Last week, the American megalopolis, that string of jewels on the old Atlantic coast, found itself shrouded by wildfire smoke. New York City’s air turned ashen and dun, then glowed a supernatural amber. For the first time in who-knows-how-long, sightseers standing at the U.S. Capitol Building could not see the Washington Monument, a mile and change down the Mall.
You could list the sports games canceled or flights delayed, but what was oddest about the event was the sheer ubiquity of it. This was one of the few news stories I can remember where you could look up from whatever article you were reading and see the story itself, softly lapping at your window.
And then it was gone. By Friday, the haze had blown out to sea.
It was, in retrospect, a strange time — deeply strange, humblingly strange, strange before almost any other quality. More than 128 million Americans were under an air-quality alert on Wednesday night — roughly the population of Germany and Spain combined — but scarcely 36 hours earlier, nobody had known to prepare for anything worse than a moderate haze. The country’s biggest wildfire-pollution event on record arrived essentially out of nowhere.
One of the main modes of journalism today — but also, frankly, one of the main frames of our whole cultural apparatus, from TikToks that lightly gloss Wikipedia articles to rampant right-wing conspiracism — is that of the explanation. Everyone explains what’s happening to each other, as it happens, at all times, and therefore makes the world seem rational, empirical, and less frightening. Yet with the wildfire smoke, what is so striking is how little we understood. Nearly every important step in the story was misapprehended as it happened.
We didn’t know that the smoke was coming, for instance. On Tuesday morning, meteorologists predicted that the same moderate haze that had hung around all season would again hit the East Coast. The New York Department of Environmental Conservation put out an alert saying that the air quality index, or AQI, might rise to 150 across the state.
They did not forecast — nobody, as far as I can tell, did — that the worst air pollution in decades would soon wallop the state. By Tuesday night, New York City’s AQI had already reached 174, according to Environmental Protection Agency data. As I walked home in D.C., I could see tendrils of visible smoke hugging the upper stories of apartment buildings.
This prediction failure gave the ensuing response a halting, confused quality. How could such a massive event come out of nowhere? Not until Thursday afternoon — when the smoke had nearly passed — did the federal government advise its workers that they could telework or take vacation time to avoid the bad air. On Friday, New York closed in-person schools, just in time for blue sky to return.
I find it hard to blame them. This was an unprecedented event in part because the fires were so far from the places they affected. Everyone had seen the videos of smoke besieging Portland and San Francisco in 2020, but back then, the fires had been near those cities — a couple hundred miles away at most. Where was the smoke coming from now? The Adirondacks were fine. Vermont wasn’t burning.
Here, we misunderstood again. Many outlets — including this one, at first — initially reported that the smoke came from Nova Scotia, where large and destructive fires had raged the week before. But those fires had been doused over the weekend by the same weather pattern that was now ferrying smoke to us. In fact, the smoke had come from the boreal forests of northern Quebec, more than 500 miles from New York City.
Why were these fires raging? Not even Canadians could give a good answer. With fires in Alberta and Nova Scotia gobbling attention and resources, the Quebec fires had seemingly been an afterthought until their smoke blew into Toronto and Ottawa, which happened only a few hours before it arrived in New York. Suddenly, a secondary event had become the main event.
On Wednesday, I talked to a Canadian climatologist who seemed hazy about why Quebec was burning in the first place. “I think this situation is kind of similar to Nova Scotia,” he told me, blaming that province’s warm, dry spring for the blazes. But this explanation — which appeared in many outlets — was only somewhat true: While Quebec had suffered a warm May, it was not in drought.
We did not understand why these fires were burning — and honestly, we still don’t. President Joe Biden said that the smoke provided “another stark reminder of the impacts of climate change.” I am not so sure. There’s no doubt, to be clear, that climate change will make wildfires worse across North America: The Intergovernmental Panel on Climate Change says that hot, dry “fire weather” will increase throughout the 21st century. But, again, Quebec is not in drought. As for today, no climate-change signal has appeared in eastern Canadian wildfire data. These fires’ connection to climate change is far less clear-cut than it is in, say, California’s blazes.
Yet neither would I condescend to someone who does blame climate change here. When something like this happens, how can you not cite the planet’s biggest ongoing physical transformation? If climate change makes flukey weather more likely, shouldn’t we at least consider whether it’s responsible for some of the flukiest weather in decades? The thing about unprecedented events is that you lack precedent for them.
Not that we completely lack an example for this. In 1780, the sun was blotted out across New England. Nocturnal animals came out; people fretted in the streets and abandoned their work; the Connecticut state legislature considered adjourning for doomsday. Not until a decade ago did we finally learn that the “dark day” was caused by Canadian wildfire smoke drifting south.
Which suggests that this might be a once-in-250-year event. But maybe it’s not anymore. Maybe with climate change, it’s a once-a-century event. Or a once-in-a-decade event. For now, the sample size is two.
So I wonder: If it wasn’t climate change, would it matter? The pandemic has already taught us that indoor air quality matters, that unseen particles floating in the air can do serious harm. No matter what happens with the climate, Canada is too large and unpopulated to fight every wildfire; nor can it undertake the kind of labor-intensive forest management that California might attempt. East Coasters should come away from our own dark days with new compassion for people out West — and those across the world — who must deal with wildfire smoke on a seasonal basis, not to mention the fires themselves. Regardless of climate change’s role in these fires, it makes wildfires more likely: We should continue to try to decarbonize as fast as we can.
But as for local policy, perhaps our aims should be humbler. We now know (again) that a great cloud of wildfire smoke can descend on the East Coast at any moment and poison our air. We don’t need to know everything to protect ourselves and our neighbors from that. Air filters cost hundreds of dollars, but not thousands; in new multi-family buildings, they are built into the ventilation system itself. Perhaps the right lesson from this outbreak is to change our expectations, and to think of indoor air filtering like brushing your teeth — a habit essential to our hygiene, to be used by all, and to be provided at public expense for those who cannot afford one.
Maybe that’s prudent climate adaptation. Or maybe — in the wake of COVID, Canadian smoke, and who-knows-what-comes-next — it’s just new common sense.
The Trump administration has started to weaken the rules requiring cars and trucks to get more fuel-efficient every year.
In a press event on Wednesday in the Oval Office, flanked by advisors and some of the country’s top auto executives, President Trump declared that the old rules “forced automakers to build cars using expensive technologies that drove up costs, drove up prices, and made the car much worse.”
He said that the rules were part of the “green new scam” and that ditching them would save consumers some $1,000 every year. That framing cast the rollback as part of the president’s seeming pivot to affordability, which began after Democrats trounced Republicans in the November off-cycle elections.
That pivot remains belated and at least a little half-hearted: On Wednesday, Trump made no mention of dropping the auto tariffs that are raising imported car prices by perhaps $5,000 per vehicle, according to Cox Automotive. Ditching the fuel economy rules, too, could increase demand for gasoline and thus raise prices at the pump — although they remain fairly low right now, with the national average below $3 a gallon.
What’s more interesting — and worrying — is how the rollback fits into the administration’s broader war on innovation in the American car and light-duty truck sector. The United States essentially has two ways to regulate pollution from cars and light trucks: It can limit greenhouse gas emissions from new cars and trucks, and it can require the fuel economy of new vehicles to get a little better every year.
Trump is pulling screws and wires out of both of these systems. In the first category, he’s begun to unwind the Environmental Protection Agency’s limits on carbon pollution from cars and light-duty trucks, which he termed an “EV mandate.” (The Biden-era rules sought to require that about half of new car sales be electric by 2030, although hybrids could help meet that standard.) Trump is also trying to keep the EPA from ever regulating anything to do with carbon pollution again by going after the agency’s “Endangerment Finding” — a scientific assessment that greenhouse gases are dangerous to human wellbeing.
That’s only half of the president’s war on air pollution rules, though. Since the oil crises of the 1970s, the National Highway Traffic Safety Administration has regulated fuel economy for new vehicles under the Corporate Average Fuel Economy, or CAFE, standards. When these rules are binding, the agency can require new cars and trucks sold in the U.S. to get a little more fuel-efficient every year. The idea is that these rules help limit the country’s gasoline consumption, thus keeping a lid on oil prices and letting the whole economy run more efficiently.
President Trump’s signature tax law, the One Big Beautiful Bill Act, already eliminated the fines that automakers have to pay when they fail to meet the standard. That change, pushed by Senator Ted Cruz of Texas, effectively rendered the regulation toothless. But now Trump is weakening the rules just for good measure. (At the press conference on Wednesday, Cruz stood behind the president — and next to Jim Farley, the CEO of Ford.)
Under the new Trump proposal, automakers would need to achieve only an average of 34.5 miles per gallon in 2031. Under Biden’s proposal, they needed to hit 50 miles per gallon that year.
Those numbers, I should add, are somewhat deceptive — because of how CAFE standards are calculated, the headline figure runs 20% to 30% higher than real-world fuel economy. In essence, that means the new Trump-era rules will come out to a real-world miles-per-gallon figure in the mid-to-high 20s. That will give automakers ample regulatory room to sell more inefficient, gas-guzzling sport utility vehicles and pickups, which remain more profitable than electric vehicles.
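For a rough sense of the math — a back-of-the-envelope sketch, not NHTSA’s official test-cycle methodology, and assuming the 20% to 30% gap applies as a simple percentage markup — here is how the 34.5 mpg headline figure lands in the mid-to-high 20s:

```python
# Back-of-the-envelope conversion from a CAFE headline figure to real-world
# fuel economy. The 20-30% gap is the range cited above; treating it as a
# flat percentage markup is an illustrative assumption, not the regulation's
# actual test-cycle formula.
headline_mpg = 34.5  # the Trump proposal's 2031 fleet average

for gap in (0.20, 0.30):
    real_world_mpg = headline_mpg / (1 + gap)
    print(f"At a {gap:.0%} gap, real-world economy is about {real_world_mpg:.1f} mpg")

# Prints roughly 28.8 mpg and 26.5 mpg -- i.e., the mid-to-high 20s.
```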
Which is not ideal for air pollution or the energy transition. But the real risk for the American automaking industry is not that Ford might churn out a few extra Escapes over the next several years. It’s that the Trump proposal would eliminate automakers’ ability to trade compliance credits to meet the rules. These credit markets — which allow manufacturers of gas guzzlers to redeem themselves by buying credits generated by cleaner cars — have been a valuable revenue source for newer vehicle companies like Tesla, Lucid, and Rivian. The Trump proposal would cut off that revenue — and with it, one of the few remaining ways that automakers are cross-subsidizing EV innovation in the United States.
During his campaign, President Trump said that he wanted the “cleanest air.” That promise is looking as hollow as his pledge to cut electricity costs in half within a year.
How will America’s largest grid deal with the influx of electricity demand? It has until the end of the year to figure things out.
As America’s largest electricity market was deliberating over how to reform the interconnection of data centers, its independent market monitor threw a regulatory grenade into the mix. Just before the Thanksgiving holiday, the monitor filed a complaint with federal regulators saying that PJM Interconnection, which spans from Washington, D.C. to Ohio, should simply stop connecting new large data centers that it doesn’t have the capacity to serve reliably.
The complaint is just the latest development in a months-long debate involving the electricity market, power producers, utilities, elected officials, environmental activists, and consumer advocates over how to connect the deluge of data centers in PJM’s 13-state territory without further increasing consumer electricity prices.
The system has been pushed into crisis by skyrocketing prices in its capacity auctions, in which generators get paid to ensure they’re available when demand spikes. Those prices have been fueled by high-octane demand projections, with PJM’s summer peak forecast to jump from 154 gigawatts to 210 gigawatts in a decade. The 2034-35 forecast jumped 17% in just a year.
Over the past two capacity auctions, actual and forecast data center growth has been responsible for over $16.6 billion in new costs, according to PJM’s independent market monitor; by contrast, the previous year’s auction generated a mere $2.2 billion. This has translated directly into higher retail electricity prices, including 20% increases in some parts of PJM’s territory, like New Jersey. It has also generated concerns about the reliability of the whole system.
PJM wants to reform how data centers interconnect before the next capacity auction in June, but its members committee was unable to come to an agreement on a recommendation to PJM’s board during a November meeting. There were a dozen proposals, including one from the monitor; like all the others, it failed to garner the necessary two-thirds majority vote to be adopted formally.
So the monitor took its ideas straight to the top.
The market monitor’s complaint to the Federal Energy Regulatory Commission tracks closely with its plan at the November meeting. “PJM is currently proposing to allow the interconnection of large new data center loads that it cannot serve reliably and that will require load curtailments (black outs) of the data centers or of other customers at times. That result is not consistent with the basic responsibility of PJM to maintain a reliable grid and is therefore not just and reasonable,” the filing said. “Interconnecting large new data center loads when adequate capacity is not available is not providing reliable service.”
A PJM spokesperson told me, “We are still reviewing the complaint and will reserve comment at this time.”
But can its board still get a plan to FERC and avoid another blowout capacity auction?
“PJM is going to make a filing in December, no matter what. They have to get these rules in place to get to that next capacity auction in June,” Jon Gordon, policy director at Advanced Energy United, told me. “That’s what this has been about from the get-go. Nothing is going to stop PJM from filing something.”
The PJM spokesperson confirmed to me that “the board intends to act on large load additions to the system and is expected to provide an indication of its next steps over the next few weeks.” But especially after the membership’s failure to make a unified recommendation, what that proposal will be remains unclear. That has been a source of agita for the organization’s many stakeholders.
“The absence of an affirmative advisory recommendation from the Members Committee creates uncertainty as to what reforms PJM’s Board of Managers may submit to the Federal Energy Regulatory Commission (FERC), and when stakeholders can expect that submission,” analysts at ClearView Energy Partners wrote in a note to clients. In spite of PJM’s commitments, they warned that the process could “slip into January,” which would give FERC just enough time to process the submission before the next capacity auction.
One idea did attract a majority vote from PJM’s membership: Southern Maryland Electric Cooperative’s proposal, which largely echoed the PJM board’s own plan with some amendments. That suggestion called for a “Price Responsive Demand” system, in which electricity customers would agree to reduce their usage when wholesale prices spike. The system would be voluntary, unlike an earlier PJM proposal, which foresaw forcing large customers to curtail their power. “The load elects to not take on a capacity obligation, therefore does not pay for capacity, and is required to reduce demand during stressed system conditions,” PJM explained in an update. The Southern Maryland plan tweaks the PRD system to adjust its pricing mechanism, but largely aligns with what PJM’s staff put forward.
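To make the trade-off concrete, here is a minimal sketch of the Price Responsive Demand logic as PJM describes it — purely illustrative, with hypothetical function names and numbers, not actual tariff terms, since the rules are still being drafted:

```python
# Illustrative sketch of the PRD bargain described above: a large load that
# opts out of a capacity obligation pays no capacity charge, but in exchange
# must shed demand when the system is stressed. All names and numbers here
# are hypothetical, not PJM tariff language.
def allowed_draw_mw(enrolled_in_prd: bool, normal_draw_mw: float,
                    system_stressed: bool, curtailment_share: float = 0.5) -> float:
    """How much a large customer may draw under the sketched PRD rules."""
    if enrolled_in_prd and system_stressed:
        # Required to reduce demand during stressed system conditions.
        return normal_draw_mw * (1 - curtailment_share)
    return normal_draw_mw

def capacity_charge(enrolled_in_prd: bool, normal_draw_mw: float,
                    capacity_price_per_mw: float) -> float:
    """A PRD load takes on no capacity obligation, so it pays nothing here."""
    return 0.0 if enrolled_in_prd else normal_draw_mw * capacity_price_per_mw

# A hypothetical 300-megawatt data center enrolled in PRD, during a stressed hour:
print(allowed_draw_mw(True, 300.0, True))      # 150.0 -- must cut back
print(capacity_charge(True, 300.0, 100_000))   # 0.0   -- but pays no capacity charge
```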
“There’s almost no real difference between the PJM proposal and that Southern Maryland proposal,” Gordon told me.
That might please restive stakeholders — or at least give PJM’s board something it could move forward with, knowing that a majority of its voting membership had agreed to something similar.
“We maintain our view that a final proposal could resemble the proposed solution package from PJM staff,” the ClearView note said. “We also think the Board could propose reforms to PJM’s PRD program. Indeed, as noted above, SMECO’s revisions to the service gained majority support.”
The PJM plan also included relatively uncontroversial reforms to load forecasting, meant to cut down on duplicated requests and better share information, as well as an “expedited interconnection track” for new, large-scale generation that a state government had signed off on “to expedite consideration of permitting and siting.”
Gordon said that the market monitor’s complaint could be read as the organization “desperately trying to get FERC to weigh in” on its side, even if PJM is more likely to go with something like its own staff-authored submission.
“The key aspect of the market monitor’s proposal was that PJM should not allow a data center to interconnect until there was enough generation to supply them,” Gordon explained. During the meeting preceding the vote, “PJM said they didn’t think they had the authority to deny someone interconnection.”
This dispute over whether the electricity system has an obligation to serve all customers has been the existential question making the debate about how to serve data centers extra angsty.
But PJM looks to be trying to sidestep that big question and nibble around the edges of reform.
“Everybody is really conflicted here,” Gordon told me. “They’re all about protecting consumers. They don’t want to see any more increases, obviously, and they want to keep the lights on. Of course, they also want data center developers in their states. It’s really hard to have all three.”
Atomic Canyon is set to announce a deal with the International Atomic Energy Agency.
Two years ago, Trey Lauderdale asked not what nuclear power could do for artificial intelligence, but what artificial intelligence could do for nuclear power.
The value of atomic power stations in providing the constant, zero-carbon electricity many data centers demand was well understood. What large language models could do to make building and operating reactors easier was less obvious. His startup, Atomic Canyon, made a first attempt at an answer by creating a program that could make the mountains of paper documents at the Diablo Canyon nuclear plant, California’s only remaining station, searchable. But Lauderdale was thinking bigger.
In September, Atomic Canyon inked a deal with the Idaho National Laboratory to start devising industry standards for testing the capacity of AI software for nuclear projects, in much the same way each update to ChatGPT or Perplexity is benchmarked on its ability to complete bar exams or medical tests. Now, the company’s effort is going global.
On Wednesday, Atomic Canyon is set to announce a partnership with the International Atomic Energy Agency to begin cataloging the UN nuclear watchdog’s data and laying the groundwork for global standards for how AI software can be used in the industry.
“We’re going to start building proof of concepts and models together, and we’re going to build a framework of what the opportunities and use cases are for AI,” Lauderdale, Atomic Canyon’s chief executive, told me on a call from his hotel room in Vienna, Austria, where the IAEA is headquartered.
The memorandum of understanding between the company and the UN agency is at an early stage, so it’s as yet unclear what international standards or guidelines could look like.
In the U.S., Atomic Canyon began making inroads earlier this year with a project backed by the Institute of Nuclear Power Operations, the Nuclear Energy Institute, and the Electric Power Research Institute to create a virtual assistant for nuclear workers.
Atomic Canyon isn’t the only company applying AI to nuclear power. Last month, nuclear giant Westinghouse unveiled new software it’s designing with Google to calculate ways to bring down the cost of key components in reactors by millions of dollars. The Nuclear Company, a startup developer that’s aiming to build fleets of reactors based on existing designs, announced a deal with the software behemoth Palantir to craft the software equivalent of what the companies described as an “Iron Man suit,” able to swiftly pull up regulatory and blueprint details for the engineers tasked with building new atomic power stations.
Lauderdale doesn’t see that as competition.
“All of that, I view as complementary,” he said.
“There is so much wood to chop in the nuclear power space, the amount of work from an administrative perspective regarding every inch of the nuclear supply chain, from how we design reactors to how we license reactors, how we regulate to how we do environmental reviews, how we construct them to how we maintain,” he added. “Every aspect of the nuclear power life cycle is going to be transformed. There’s no way one company alone could come in and say, we have a magical approach. We’re going to need multiple players.”
That Atomic Canyon is making inroads at the IAEA has the potential to significantly broaden the company’s reach. Unlike other energy sources, nuclear power is uniquely subject to international oversight as part of global efforts to prevent civilian atomic energy from bleeding over into weapons production.
The IAEA’s bylaws award particular agenda-setting powers to whatever country has the largest fleet of nuclear reactors. In the nearly seven decades since the agency’s founding, that nation has been the U.S. As such, the 30 other countries with nuclear power have largely aligned their regulations and approaches to the ones standardized in Washington. When the U.S. artificially capped the enrichment levels of traditional reactor fuel at 5%, for example, the rest of the world followed.
That could soon change, however, as China’s breakneck deployment of new reactors looks poised to vault the country ahead of the U.S. sometime in the next decade. It wouldn’t just be a symbolic milestone. China’s emergence as the world’s preeminent nuclear-powered nation would likely come with Beijing’s increased influence over other countries’ atomic energy programs. As it is, China is preparing to start exporting its reactors overseas.
The role that electricity demand from the data centers powering the AI boom has played in spurring calls for new reactors is undeniable. But if AI turns out to have as big an impact on nuclear operations as Lauderdale predicts, an American company helping to establish the global guidelines could help cement U.S. influence over a potentially major new factor in how the industry works for years, if not decades, to come.