On a crucial — and underappreciated — phrase in the Global Stocktake.

Now it is over. Early on Wednesday morning, negotiators in Dubai reached an agreement at the 28th Conference of the Parties to the UN Framework Convention on Climate Change, the global meeting otherwise known as COP28.
Their final text for the Global Stocktake — a kind of report card on humanity’s progress toward its Paris Agreement goals — is contradictory and half-hearted. Instead of blunt language instructing countries to “phase out fossil fuels,” it provides a range of options that could let countries achieve “deep, rapid, and sustained reductions in greenhouse gas emissions.” One of these possibilities is the tripling of global renewable capacity; another is a call for “transitioning away from fossil fuels.”
So far, this language — the call to move away from fossil fuels — has attracted the most attention. Simon Stiell, the UN’s top climate official, said that it marked “the beginning of the end” of the fossil-fuel era, while the climate journalist and activist Bill McKibben has argued that the phrase can become a useful tool for activists, who can now use it to beat the Biden administration over the head.
But a separate phrase in the agreement caught my attention. Immediately after calling for transitioning away from fossil fuels, the text makes a different point: that the world must accelerate the development of “zero- and low-emission technologies, including, inter alia, renewables, nuclear, abatement and removal technologies such as carbon capture and utilization and storage, particularly in hard-to-abate sectors, and low-carbon hydrogen production.”
This language may rankle some readers because it seems to give pride of place to carbon capture and storage technology, or CCS, which would allow fossil fuel-burning plants to catch emissions before they enter the atmosphere. (It also seems to conflate CCS with carbon removal technology, even though they are different.) But I believe that the overarching demand — the call for accelerating climate-friendly technologies — represents a crucial insight, one that I could not stop thinking about at the COP itself, and one that is linked to any realistic demand to phase out fossil fuels. Here is that insight: The world will only be able to decarbonize when it develops abundant energy technologies that emit little carbon and that are price-competitive with, if not cheaper than, their fossil-fueled alternatives.
Just as COP28 began, the Rhodium Group, an energy research firm, published a new study looking at how carbon pollution will rise and fall through the end of the century. Unlike other such studies — which ask either how the planet will fare if no new climate policy passes, or what the world must do to avoid 1.5 degrees Celsius of warming — this new study tried to look at what was likely to happen. Given what we know about how countries’ emissions rise and fall with their economies, and when and how they tend to pass climate policy, how much warming can we expect by the end of the century?
As the report’s authors put it, the study was aimed not at policymakers, but at policy takers — the officials, executives, engineers, and local leaders who are starting to plan for the world of 2100.
Here’s the good news: Global greenhouse gas emissions are likely to peak this decade, the report found. Sometime during the 2020s, humanity’s emissions of carbon dioxide, methane, and other climate pollution will reach an all-time high and begin to fall. (Right now, we emit the equivalent of 50.6 billion tons of the stuff every year.) This will represent a world-historic turning point in our species’ effort to govern the global climate system, and it will probably happen before Morocco, Portugal, and Spain host the 2030 World Cup.
And that is roughly where the good news ends. Because unlike in rosy net-zero studies where humanity’s carbon emissions peak and then rapidly fall to zero, the report does not project any near-term pollution plunge. Instead, global emissions waver and plateau through the 2030s and 2040s, falling in some years, rising slightly in others, tracing an unmistakably downward trend while failing to get anywhere close to zero. By 2060, annual emissions will have fallen to 39 gigatons, only 22% below today’s levels.
And — worse news, now — that is as low as emissions will ever get this century, the report projects. Driven by explosive economic growth in Southeast Asia and sub-Saharan Africa, global emissions begin to rise — slowly but inexorably — starting in the 2060s. They keep rising in the 2070s, 2080s, and 2090s. By the year 2090, emissions will have reached 44 gigatons, only 13% below today’s levels and roughly where emissions stood in 2003.
How Greenhouse Gas Emissions Could Fall — Then Rise — in the 21st Century

In other words, after a century of work to fight climate change, humanity will find itself roughly where it began. But now, with several thousand additional gigatons of emissions in the atmosphere, the planet will be about 2.8 degrees Celsius warmer (or about 5 degrees Fahrenheit). At the high end of the report’s estimates, temperatures could rise as much as 4 degrees Celsius, or more than 7 degrees Fahrenheit.
This temperature rise will be caused by legacy emissions from polluters like the United States and China, but as the century goes on, it will increasingly come from Asian and African countries such as Vietnam, Indonesia, Nigeria, Kenya, and others. Why? It’s not like these countries, say, reject renewables or electric vehicles: In fact, Rhodium anticipates that renewables will have grown up to 22-fold by the end of the century.
Instead, emissions rise because fossil fuels are cheap and globally abundant — they remain one of the easiest ways to power an explosively growing society — and because the so-called hard-to-abate sectors in these countries are slated to grow just as quickly as the economies themselves. Indonesia, Nigeria, and Vietnam will demand many megatons of new steel, cement, and chemicals to furnish their growing societies; right now, the only economical way to make those materials requires releasing immense amounts of carbon pollution into the atmosphere.
Let’s be clear: Rhodium’s report is a projection, not a prophecy. It should not provoke despair, I think, but determination. Many of the so-called hard-to-abate activities, such as steelmaking or petrochemical production, should more aptly be called activities-that-we-haven’t-tried-very-hard-to-abate-yet; people will likely find a way to do them by the middle of the century. (When I asked Bill Gates what he thought about the Rhodium Group’s findings, he replied that predicting the carbon intensity of certain activities in 2060 was all but impossible: We might have safe, cheap, and abundant nuclear fission by then, or even nuclear fusion.)
Yet it heralds a shift in climate geopolitics that, while it has not yet happened, is not so far away. Since the modern era of global climate politics began in 1990, most carbon emissions have come from just a handful of countries: China, the United States, and the 37 other rich, developed democracies that make up the Organization for Economic Cooperation and Development, or OECD. These countries have emitted 55% of climate pollution since 1990, while the rest of the world — the remaining low- and middle-income countries — have emitted only 45%.
But from now to 2100, that relationship is set to reverse. Through the end of the century, China and the OECD countries will emit only 40% of total global emissions, according to Rhodium’s projections. The rest of the world, meanwhile, will emit 60%.
In other words, decarbonization will soon become a challenge for middle-income countries. These countries will not be able to spend extra to buy climate-friendly technologies, and they are simply too populous for rich countries to subsidize. At the same time, these countries lack an existing fleet of fossil-fuel-consuming equipment, so they will not have to dismantle one in order to move past fossil fuels. Unlike in the United States, where we will have to shut down our oil-and-gas economy as we build a new one to replace it, Kenya or Indonesia can more or less build a climate-friendly middle-class economy de novo, much in the same way that in the 2000s countries “leapfrogged” landline telephones and adopted cell phones. Yet countries will only be able to leapfrog the fossil-fuel era if the climate equivalent of cell phones exists: if climate-friendly technologies are plentiful, useful, and price-competitive.
That’s not all it will take, of course. The world will have to phase down the production and consumption of fossil fuels, because the existence of climate-friendly technologies will not guarantee their use. Humanity may also have to create and enforce a strong moral taboo around burning fossil fuels, much in the same way that it has created a taboo around, say, child labor. But none of that can happen unless climate-friendly alternatives exist: Otherwise countries will ensure that they gain access to the energy that their development requires.
The Trump administration has started to weaken the rules requiring cars and trucks to get more fuel-efficient every year.
In a press event on Wednesday in the Oval Office, flanked by advisors and some of the country’s top auto executives, President Trump declared that the old rules “forced automakers to build cars using expensive technologies that drove up costs, drove up prices, and made the car much worse.”
He said that the rules were part of the “green new scam” and that ditching them would save consumers some $1,000 every year. That framed the rollback as part of the president’s seeming pivot to affordability, which has happened since Democrats trounced Republicans in the November off-cycle elections.
That pivot remains belated and at least a little half-hearted: On Wednesday, Trump made no mention of dropping the auto tariffs that are raising imported car prices by perhaps $5,000 per vehicle, according to Cox Automotive. Ditching the fuel economy rules, too, could increase demand for gasoline and thus raise prices at the pump — although they remain fairly low right now, with the national average below $3 a gallon.
What’s more interesting — and worrying — is how the rollback fits into the administration’s broader war on innovation in the American car and light-duty truck sector.
The United States essentially has two ways to regulate pollution from cars and light trucks: It can limit greenhouse gas emissions from new cars and trucks, and it can require the fuel economy from new vehicles to get a little better every year.
Trump is pulling screws and wires out of both of these systems. In the first category, he’s begun to unwind the Environmental Protection Agency’s limits on carbon pollution from cars and light-duty trucks, which he termed an “EV mandate.” (The Biden-era rules sought to require that about half of new car sales be electric by 2030, although hybrids could help meet that standard.) Trump is also trying to keep the EPA from ever regulating anything to do with carbon pollution again by going after the agency’s “Endangerment Finding” — a scientific assessment that greenhouse gases are dangerous to human wellbeing.
That’s only half of the president’s war on air pollution rules, though. Since the oil crises of the 1970s, the National Highway Traffic Safety Administration has regulated fuel economy for new vehicles under the Corporate Average Fuel Economy, or CAFE, standards. When these rules are binding, the agency can require new cars and trucks sold in the U.S. to get a little more fuel-efficient every year. The idea is that these rules help limit the country’s gasoline consumption, thus keeping a lid on oil prices and letting the whole economy run more efficiently.
President Trump’s signature tax law, the One Big Beautiful Bill Act, already eliminated the fines that automakers have to pay when they fail to meet the standard. That change, pushed by Senator Ted Cruz of Texas, effectively rendered the regulation toothless. But now Trump is weakening the rules just for good measure. (At the press conference on Wednesday, Cruz stood behind the president — and next to Jim Farley, the CEO of Ford.)
Under the new Trump proposal, automakers would need to achieve only an average of 34.5 miles per gallon in 2031. Under Biden’s proposal, they needed to hit 50 miles per gallon that year.
Those numbers, I should add, are somewhat deceptive — because of how CAFE standards are calculated, the headline number is 20% to 30% higher than a real-world fuel economy figure. In essence, that means the new Trump-era rules will come out to a real-world miles-per-gallon number in the mid-to-high 20s. That will give automakers ample regulatory room to sell more inefficient, gas-guzzling sport utility vehicles and pickups, which remain more profitable than electric vehicles.
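The conversion above is simple arithmetic, and can be sketched as follows (a rough illustration only — the `real_world_mpg` helper and the flat 20%–30% adjustment factors are assumptions drawn from the article's description, not NHTSA's actual test-cycle methodology):

```python
def real_world_mpg(headline_mpg: float, adjustment: float) -> float:
    """Approximate real-world fuel economy from a headline CAFE figure,
    assuming the headline number overstates it by a flat percentage."""
    return headline_mpg / (1 + adjustment)

trump_target = 34.5  # proposed 2031 fleet average, per the article

low = real_world_mpg(trump_target, 0.30)   # if headline is 30% higher
high = real_world_mpg(trump_target, 0.20)  # if headline is 20% higher
print(f"{low:.1f}-{high:.1f} mpg real-world")  # mid-to-high 20s
```

Under these assumed factors, the proposed 34.5 mpg headline works out to roughly 26.5 to 28.8 mpg on the road, consistent with the "mid-to-high 20s" described above.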
Which is not ideal for air pollution or the energy transition. But the real risk for the American automaking industry is not that Ford might churn out a few extra Escapes over the next several years. It’s that the Trump proposal would eliminate the ability for automakers to trade compliance credits to meet the rules. These credit markets — which allow manufacturers of gas guzzlers to redeem themselves by buying credits generated by cleaner cars — have been a valuable revenue source for new vehicle companies like Tesla, Lucid, and Rivian. The Trump proposal would cut off that revenue — and with it, one of the few remaining ways that automakers are cross-subsidizing EV innovation in the United States.
During his campaign, President Trump said that he wanted the “cleanest air.” That promise is looking as incorrect as his pledge to cut electricity costs in half within a year.
How will America’s largest grid deal with the influx of electricity demand? It has until the end of the year to figure things out.
As America’s largest electricity market was deliberating over how to reform the interconnection of data centers, its independent market monitor threw a regulatory grenade into the mix. Just before the Thanksgiving holiday, the monitor filed a complaint with federal regulators saying that PJM Interconnection, which spans from Washington, D.C. to Ohio, should simply stop connecting new large data centers that it doesn’t have the capacity to serve reliably.
The complaint is just the latest development in a months-long debate involving the electricity market, power producers, utilities, elected officials, environmental activists, and consumer advocates over how to connect the deluge of data centers in PJM’s 13-state territory without further increasing consumer electricity prices.
The system has been pushed into crisis by skyrocketing capacity auction prices, in which generators get paid to ensure they’re available when demand spikes. Those capacity auction prices have been fueled by high-octane demand projections, with PJM’s summer peak forecast to jump from 154 gigawatts to 210 gigawatts in a decade. The 2034-35 forecast jumped 17% in just a year.
Over the past two capacity auctions, actual and forecast data center growth has been responsible for over $16.6 billion in new costs, according to PJM’s independent market monitor; by contrast, the previous year’s auction generated a mere $2.2 billion. This has translated directly to higher retail electricity prices, including 20% increases in some parts of PJM’s territory, like New Jersey. It has also generated concerns about the reliability of the whole system.
PJM wants to reform how data centers interconnect before the next capacity auction in June, but its members committee was unable to come to an agreement on a recommendation to PJM’s board during a November meeting. There were a dozen proposals, including one from the monitor; like all the others, it failed to garner the necessary two-thirds majority vote to be adopted formally.
So the monitor took its ideas straight to the top.
The market monitor’s complaint to the Federal Energy Regulatory Commission tracks closely with its plan at the November meeting. “PJM is currently proposing to allow the interconnection of large new data center loads that it cannot serve reliably and that will require load curtailments (black outs) of the data centers or of other customers at times. That result is not consistent with the basic responsibility of PJM to maintain a reliable grid and is therefore not just and reasonable,” the filing said. “Interconnecting large new data center loads when adequate capacity is not available is not providing reliable service.”
A PJM spokesperson told me, “We are still reviewing the complaint and will reserve comment at this time.”
But can its board still get a plan to FERC and avoid another blowout capacity auction?
“PJM is going to make a filing in December, no matter what. They have to get these rules in place to get to that next capacity auction in June,” Jon Gordon, policy director at Advanced Energy United, told me. “That’s what this has been about from the get-go. Nothing is going to stop PJM from filing something.”
The PJM spokesperson confirmed to me that “the board intends to act on large load additions to the system and is expected to provide an indication of its next steps over the next few weeks.” But especially after the membership’s failure to make a unified recommendation, what that proposal will be remains unclear. That has been a source of agita for the organization’s many stakeholders.
“The absence of an affirmative advisory recommendation from the Members Committee creates uncertainty as to what reforms PJM’s Board of Managers may submit to the Federal Energy Regulatory Commission (FERC), and when stakeholders can expect that submission,” analysts at ClearView Energy Partners wrote in a note to clients. In spite of PJM’s commitments, they warned that the process could “slip into January,” which would give FERC just enough time to process the submission before the next capacity auction.
One idea did attract a majority vote from PJM’s membership: Southern Maryland Electric Cooperative’s, which largely echoed the PJM board’s own plan with some amendments. That suggestion called for a “Price Responsive Demand” system, in which electricity customers would agree to reduce their usage when wholesale prices spike. The system would be voluntary, unlike an earlier PJM proposal, which foresaw forcing large customers to curtail their power. “The load elects to not take on a capacity obligation, therefore does not pay for capacity, and is required to reduce demand during stressed system conditions,” PJM explained in an update. The Southern Maryland plan tweaks the PRD system to adjust its pricing mechanism, but largely aligns with what PJM’s staff put forward.
“There’s almost no real difference between the PJM proposal and that Southern Maryland proposal,” Gordon told me.
That might please restive stakeholders — or at least give PJM’s board something it could go forward with, knowing that the balance of its voting membership agreed to something similar.
“We maintain our view that a final proposal could resemble the proposed solution package from PJM staff,” the ClearView note said. “We also think the Board could propose reforms to PJM’s PRD program. Indeed, as noted above, SMECO’s revisions to the service gained majority support.”
The PJM plan also included relatively uncontroversial reforms to load forecasting to cut down on duplicated requests and better share information, and an “expedited interconnection track” on which new, large-scale generation could be fast-tracked if it were signed off on by a state government “to expedite consideration of permitting and siting.”
Gordon said that the market monitor’s complaint could be read as the organization “desperately trying to get FERC to weigh in” on its side, even if PJM is more likely to go with something like its own staff-authored submission.
“The key aspect of the market monitor’s proposal was that PJM should not allow a data center to interconnect until there was enough generation to supply them,” Gordon explained. During the meeting preceding the vote, “PJM said they didn’t think they had the authority to deny someone interconnection.”
This dispute over whether the electricity system has an obligation to serve all customers has been the existential question making the debate about how to serve data centers extra angsty.
But PJM looks to be trying to sidestep that big question and nibble around the edges of reform.
“Everybody is really conflicted here,” Gordon told me. “They’re all about protecting consumers. They don’t want to see any more increases, obviously, and they want to keep the lights on. Of course, they also want data center developers in their states. It’s really hard to have all three.”
Atomic Canyon is set to announce the deal with the International Atomic Energy Agency.
Two years ago, Trey Lauderdale asked not what nuclear power could do for artificial intelligence, but what artificial intelligence could do for nuclear power.
The value of atomic power stations in providing the constant, zero-carbon electricity many data centers demand was well understood. What large language models could do to make building and operating reactors easier was less obvious. His startup, Atomic Canyon, made a first attempt at answering that by creating a program that could make the mountains of paper documents at the Diablo Canyon nuclear plant, California’s only remaining station, searchable. But Lauderdale was thinking bigger.
In September, Atomic Canyon inked a deal with the Idaho National Laboratory to start devising industry standards to test the capacity of AI software for nuclear projects, in much the same way each update to ChatGPT or Perplexity is benchmarked by the program’s ability to complete bar exams or medical tests. Now, the company’s effort is going global.
On Wednesday, Atomic Canyon is set to announce a partnership with the United Nations International Atomic Energy Agency to begin cataloging the United Nations nuclear watchdog’s data and laying the groundwork for global standards of how AI software can be used in the industry.
“We’re going to start building proof of concepts and models together, and we’re going to build a framework of what the opportunities and use cases are for AI,” Lauderdale, Atomic Canyon’s chief executive, told me on a call from his hotel room in Vienna, Austria, where the IAEA is headquartered.
The memorandum of understanding between the company and the UN agency is at an early stage, so it’s as yet unclear what international standards or guidelines could look like.
In the U.S., Atomic Canyon began making inroads earlier this year with a project backed by the Institute of Nuclear Power Operators, the Nuclear Energy Institute, and the Electric Power Research Institute to create a virtual assistant for nuclear workers.
Atomic Canyon isn’t the only company applying AI to nuclear power. Last month, nuclear giant Westinghouse unveiled new software it’s designing with Google to calculate ways to bring down the cost of key components in reactors by millions of dollars. The Nuclear Company, a startup developer that’s aiming to build fleets of reactors based on existing designs, announced a deal with the software behemoth Palantir to craft the software equivalent of what the companies described as an “Iron Man suit,” able to swiftly pull up regulatory and blueprint details for the engineers tasked with building new atomic power stations.
Lauderdale doesn’t see that as competition.
“All of that, I view as complementary,” he said.
“There is so much wood to chop in the nuclear power space, the amount of work from an administrative perspective regarding every inch of the nuclear supply chain, from how we design reactors to how we license reactors, how we regulate to how we do environmental reviews, how we construct them to how we maintain,” he added. “Every aspect of the nuclear power life cycle is going to be transformed. There’s no way one company alone could come in and say, we have a magical approach. We’re going to need multiple players.”
That Atomic Canyon is making inroads at the IAEA has the potential to significantly broaden the company’s reach. Unlike other energy sources, nuclear power is uniquely subject to international oversight as part of global efforts to prevent civilian atomic energy from bleeding over into weapons production.
The IAEA’s bylaws award particular agenda-setting powers to whatever country has the largest fleet of nuclear reactors. In the nearly seven decades since the agency’s founding, that nation has been the U.S. As such, the 30 other countries with nuclear power have largely aligned their regulations and approaches to the ones standardized in Washington. When the U.S. artificially capped the enrichment levels of traditional reactor fuel at 5%, for example, the rest of the world followed.
That could soon change, however, as China’s breakneck deployment of new reactors looks poised to vault the country ahead of the U.S. sometime in the next decade. It wouldn’t just be a symbolic milestone. China’s emergence as the world’s preeminent nuclear-powered nation would likely come with Beijing’s increased influence over other countries’ atomic energy programs. As it is, China is preparing to start exporting its reactors overseas.
The role electricity demand from the data centers powering the AI boom has played in spurring calls for new reactors is undeniable. But if AI turns out to have as big an impact on nuclear operations as Lauderdale predicts, an American company helping to establish the global guidelines could help cement U.S. influence over a potentially major new factor in how the industry works for years, if not decades to come.