According to IPCC author Andy Reisinger, “net zero by 2050” misses some key points.

Tackling climate change is a complex puzzle. Hitting internationally agreed upon targets to limit warming requires the world to reduce multiple types of greenhouse gases from a multiplicity of sources on diverse timelines and across varying levels of responsibility and control by individual, corporate, and state actors. It’s no surprise the catchphrase “net zero by 2050” has taken off.
Various initiatives have sprung up to distill this complexity for businesses and governments that want to do (or say they are doing) what the “science says” is necessary. The nonprofit Science Based Targets initiative, for example, develops standard roadmaps for companies to follow to act “in line with climate science.” The group also vets corporate plans and deems them either “science based” or not. Though entirely voluntary, SBTi’s approval has become a nearly mandatory mark of credibility. The group has validated the plans of more than 5,500 companies with more than $46 trillion in market capitalization — nearly half of the global economy.
But in a commentary published in the journal Nature last week, a group of Intergovernmental Panel on Climate Change experts argue that SBTi and other supposedly “science based” target-setting efforts misconstrue the science and are laden with value judgments. By striving to create straightforward, universal rules, they flatten more nuanced considerations of which emissions must be reduced, by whom and by when.
“We are arguing that those companies and countries that are best resourced, have the highest capacity to act, and have the highest responsibility for historical emissions, probably need to go a lot further than the global average,” Andy Reisinger, the lead author of the piece, told me.
In response to the paper, SBTi told me it “welcomes debate,” and that “robust debate is essential to accelerate corporate ambition and climate action.” The group is currently in the process of reviewing its Net-Zero Standard and remains “committed to refining our approaches to ensure they are effective in helping corporates to drive the urgent emissions reductions needed to combat the climate crisis.”
The commentary comes as SBTi’s reputation is already on shaky ground. In April, its board appeared to go rogue and said that the group would loosen its standards for the use of carbon offsets. The announcement was met first with surprise and later with fierce protest from the nonprofit’s staff and technical council, who had not been consulted. Environmental groups accused SBTi of taking the “science” out of its targets. The board later walked back its statement, saying that no change had yet been made to the rules.
But interestingly enough, the new Nature commentary argues that SBTi’s board was actually on the right track. I spoke to Reisinger about this, and some of the other ways he thinks science based targets “miss the mark.”
Reisinger, who’s from New Zealand, was a vice-chair of the working group that produced the United Nations Intergovernmental Panel on Climate Change’s 2022 mega-report on climate mitigation. I caught him just as he had arrived in Sofia, Bulgaria, for a plenary that will determine the timeline for the next big batch of UN science reports. Our conversation has been edited for length and clarity.
Was there something in particular that inspired you to write this? Or were you just noticing the same issues over and over again?
There were probably several things. One is a confusion that’s quite prevalent between net zero CO2 emissions and net zero greenhouse gas emissions. The IPCC makes clear that to limit warming at any level, you need to reach net zero CO2 emissions, because it’s a long-lived greenhouse gas and the warming effect accumulates in the atmosphere over time. You need deep reductions of shorter-lived greenhouse gases like methane, but they don’t necessarily have to reach zero. And yet, a lot of people claim that the IPCC tells us that we have to reach net zero greenhouse gas emissions by 2050, which is simply not the case.
Of course, you can claim that there’s nothing wrong, surely, with going to net zero greenhouse gas emissions because that’s more ambitious. But there are two problems with that. One is, if you want to use science, you have to get the science correct. You can’t just make it up and still claim to be science-based. Secondly, it creates a very uneven playing field between those who mainly have CO2 emissions and those who have non-CO2 emissions as a significant part of their emissions portfolio — which often are much harder to reduce.
Can you give an example of what you mean by that?
You can rapidly decarbonize and actually approach close to zero emissions in your energy generation, if that’s your dominant source of emissions. There are viable solutions to generate energy with very low or no emissions — renewables, predominantly. Nuclear in some circumstances.
But to give you another example, in Australia, the Meat and Livestock Association, they set a net zero target, but they subsequently realized it’s much harder to achieve it because methane emissions from livestock are very, very difficult to reduce entirely. Of course you can say, we’ll no longer produce beef. But if you’re the Cattle Association, you’re not going to rapidly morph into producing a different type of meat product. And so in that case, achieving net zero is much more challenging. Of course, you can’t lean back and say, Oh, it’s too difficult for us, therefore we shouldn’t try.
I want to walk through the three main points of your argument for why science-based targets “miss the mark.” I think we’ve just covered the first. The second is that these initiatives put everyone on the same timeline and subject them to the same rules, which you say could actually slow emissions reductions in the near term. Can you explain that?
The Science Based Targets initiative in particular, but also other initiatives that provide benchmarks for companies, tend to want to limit the use of offsets, where a company finances emission reductions elsewhere and counts them toward its own targets. And there are very good reasons for that, because there’s a lot of greenwashing going on. Some offsets have very low integrity.
At the same time, if you set a universal rule that all offsets are bad and unscientific, you’re making a major mistake. Offsets are a way of generating financial flows towards those with less intrinsic capacity to reduce their emissions. So by making companies focus only on their own reductions, you basically cut off financial flows that could stimulate emission reductions elsewhere or generate carbon dioxide removals. Then you’re creating a problem for later on in the future, when we desperately need more carbon dioxide removal and haven’t built up the infrastructure or the accountability systems that would allow that.
As you know, there’s a lot of controversy about this right now. There are many scientists who disagree with you and don’t want the Science Based Targets initiative to loosen its rules for using offsets. Why is there this split in the scientific community about this?
I think the issue arises when you think that net zero by 2050 is the unquestioned target. But if you challenge yourself to say, well net zero by 2050 might be entirely unambitious for you, you have to reduce your own emissions and invest in offsets to go far beyond net zero by 2050 — then you might get a different reaction to it.
I think everybody would agree that if offsets are being used instead of efforts to reduce emissions that are under a company’s direct control, and they can be reduced, then offsets are a really bad idea. And of course, low-integrity offsets are always a bad idea. But the solution to the risk of low integrity cannot be to walk away from it entirely, because otherwise you’ve further reduced incentives to actually generate accountability mechanisms. So the challenge would be to drive emission reductions at the company level, and on top of that, create incentives to engage in offsets, to increase financial flows to carbon dioxide removal — both permanent and inherently non-permanent — because we will need it.
My understanding is that groups like SBTi and some of these other carbon market integrity initiatives agree with what you’ve just said — even if they don’t support offsetting emissions, they do support buying carbon credits to go above and beyond emissions targets. They are already advocating for that, even if they’re not necessarily creating the incentives for it.
I mean, that’s certainly a move in the right direction. But it’s creating this artificial distinction between what the science tells you, the “science based target,” and then the voluntary effort beyond that. Whereas I think it has to become an obligation. So it’s not a distinction between, here’s what the science says, and here’s where your voluntary, generous, additional contribution to global efforts might go. It is a much more integrated package of actions.
I think we’re starting to get at the third point that your commentary makes, which is about how these so-called science-based targets are inequitable. How does that work?
There’s a rich literature on differentiating targets at the country level based on responsibility for warming, or a capacity-based approach that says, if you’re rich and we have a global problem, you have to use your wealth to help solve the global problem. Most countries don’t take that approach, because the more developed you are, the more unpleasant the consequences are.
At the company level, SBTi, for example, tends to use the global or regional or sectoral average rate of reductions as the benchmark that an individual company has to follow. But not every company is average, and systems transitions follow far more complex dynamics. Some incumbents have to reduce emissions much more rapidly, or go out of business entirely, in order to create space for innovators whose emissions might rise in the near term before they go down, but who bring new technologies that allow deeper reductions in the long term. Assuming a uniform rate of reduction levels out all those differences.
It’s far more challenging to translate equity into meaningful metrics at the company level. But our core argument is, just because it’s hard, that cannot mean let’s not do it. So how can we challenge companies to disclose their thinking, their justification about what is good enough?
The Science Based Targets initiative formed because previously, companies were coming up with their own interpretations of the science, and there was no easy way to assess whether these plans were legitimate. Can you really imagine a middle ground where there is still some sort of policing mechanism to say whether a given corporate target is good enough?
That’s what we try to sketch as a vision, but it certainly won’t be easy. I also want to emphasize that we’re not trying to attack SBTi in principle. It’s done a world of good. And we certainly don’t want to throw the baby out with the bathwater and just cancel the idea. It’s more to use it as a starting point. As we say in our paper, you can almost take an SBTi target as the definition of what is not sufficient if you’re a company located in the Global North or a multinational company with high access to resources — human, technological, and financial.
It was a wild west before SBTi and we’re not saying let’s go back to the wild west. We’re saying the pendulum might have swung too far to a universal rule that applies to everybody, but therefore applies to nobody.
There’s one especially scathing line in this commentary. You write that these generic rules “result in a pseudo-club that inadequately challenges its self-selected members while setting prohibitive expectations for those with less than average capacity.” We’ve already talked about the second half of this statement, but what do you mean by pseudo-club?
You write a science based target as a badge of achievement, a badge of honor on your company profile, assuming that therefore you have done all that can be expected of you when it comes to climate change. Most of the companies that have adopted science based targets are located in the Global North, or operate on a multinational basis and have therefore quite similar capacity. If that’s what we’re achieving — and then there’s a large number of companies that can’t possibly, under their current capacity, set science-based targets because they simply don’t have the resources — then collectively, we will fail. Science cannot tell you whether you have done as much as you could be doing. If we let the simplistic rules dominate the conversation, then we’re not going to be as ambitious as we need to be.
Deep Fission says that building small reactors underground is both safer and cheaper. Others have their doubts.
In 1981, two years after the accident at Three Mile Island sent fears over the potential risks of atomic energy skyrocketing, Westinghouse looked into what it would take to build a reactor 2,100 feet underground, insulating its radioactive material in an envelope of dirt. The United States’ leading reactor developer wasn’t responsible for the plant that partially melted down in Pennsylvania, but the company was grappling with new regulations that came as a result of the incident. The concept went nowhere.
More than a decade later, the esteemed nuclear physicist Edward Teller resurfaced the idea in a 1995 paper that once again attracted little actual interest from the industry — that is, until 2006, when Lowell Wood, a physicist at the Lawrence Livermore National Laboratory, proposed building an underground reactor to Bill Gates, who considered but ultimately abandoned the design at his nuclear startup, TerraPower.
Now, at last, one company is working to make buried reactors a reality.
Deep Fission proposes digging boreholes 30 inches in diameter and about a mile deep to house each of its 15-megawatt reactors. And it’s making progress. In August, the Department of Energy selected Deep Fission as one of the 10 companies enrolled in the agency’s new reactor pilot program, meant to help next-generation startups split their first atoms by July. In September, the company announced a $30 million reverse merger deal with a blank check firm to make its stock market debut on the lesser-known exchange OTCQB. Last month, Deep Fission chose an industrial park in a rural stretch of southeastern Kansas as the site of its first power plant.
Based in Berkeley, California, the one-time hub of the West Coast’s fading anti-nuclear movement, the company says its design is meant to save money on above-ground infrastructure by letting geology do the work to add “layers of natural containment” to “enhance safety.” By eliminating much of the expensive concrete and steel dome that typically encases a reactor on the surface, the startup estimates “that our approach removes up to 80% of the construction cost, one of the biggest barriers for nuclear, and enables operation within six months of breaking ground.”
“The primary benefit of placing a reactor a mile deep is cost and speed,” Chloe Frader, Deep Fission’s vice president of strategic affairs, told me. “By using the natural pressure and containment of the Earth, we eliminate the need for the massive, above-ground structures that make traditional nuclear expensive and slow to build.”
“Nuclear power is already the safest energy source in the world. Period,” she said. “Our underground design doesn’t exist because nuclear is unsafe, it exists because we can make something that is already extremely safe even safer, simpler, and more affordable.”
But gaining government recognition, going public, and picking a location for a first power plant may prove the easy part. Convincing others in the industry that its concept is a radical plan to cut construction costs, rather than an attempt to allay the public’s often-outsize fear of a meltdown, has turned out to be difficult, to say nothing of what actually building its reactors will entail.
Despite the company’s recent progress, I struggled to find anyone who didn’t have a financial stake in Deep Fission willing to make the case for its buried reactors.
Deep Fission is “solving a problem that doesn’t actually exist,” Seth Grae, the chief executive of the nuclear fuel company Lightbridge, told me. In the nearly seven decades since fission started producing commercial electrons on the U.S. grid, no confirmed death has ever come from radiation at an American nuclear power station.
“You’re trying to solve a political problem that has literally never hurt anyone in the entire history of our country since this industry started,” he said. “You’re also making your reactors more expensive. In nuclear, as in a lot of other projects, when you build tall or dig deep or lift big and heavy, those steps make the projects much more expensive.”
Frader told me that subterranean rock structures would serve “as natural containment, which also enhances safety.” That’s true to some extent. Making use of existing formations “could simplify surface infrastructure and streamline construction,” Leslie Dewan, a nuclear engineer who previously led a next-generation small modular reactor startup, told IEEE Spectrum.
If everything pans out, that could justify Deep Fission’s estimate that its levelized cost of electricity — not the most dependable metric, but one frequently used by solar and wind advocates — would be between $50 and $70 per megawatt-hour, lower than other SMR developers’ projections. But that’s only if a lot of things go right.
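For context, levelized cost of electricity is, in its textbook form, just a ratio of discounted lifetime costs to discounted lifetime generation. This is the generic definition, not necessarily the exact methodology Deep Fission uses:

\[
\mathrm{LCOE} = \frac{\sum_{t=1}^{n} C_t / (1+r)^t}{\sum_{t=1}^{n} E_t / (1+r)^t}
\]

where \(C_t\) covers capital, fuel, and operating costs in year \(t\), \(E_t\) is the electricity generated that year, \(r\) is the discount rate, and \(n\) is the plant’s assumed lifetime. Because the result swings heavily on the assumed discount rate, lifetime, and capacity factor, optimistic inputs yield optimistic numbers, which is part of why the metric is not the most dependable.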
“A design that relies on the surrounding geology for safety and containment needs to demonstrate a deep understanding of subsurface behavior, including the stability of the rock formations, groundwater movement, heat transfer, and long-term site stability,” Dewan said. “There are also operational considerations around monitoring, access, and decommissioning. But none of these are necessarily showstoppers: They’re all areas that can be addressed through rigorous engineering and thoughtful planning.”
As anyone in the geothermal industry can tell you, digging a borehole costs a lot of money. Drilling equipment comes at a high price. Underground geology complicates drilling a straight shaft a mile down. And not every hole that’s started ends up panning out, meaning the process often has to be repeated over and over again.
For Deep Fission, drilling lots of holes is part of the process. Given the size of its reactor, to reach a gigawatt — the output of one of Westinghouse’s flagship AP1000s, the only new type of commercial reactor successfully built from scratch in the U.S. this century — Deep Fission would need to build 67 of its own microreactors. That’s a lot of digging, considering that the company’s boreholes are, on average, nearly three times wider than those drilled for natural gas or geothermal wells.
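That 67 figure is simple arithmetic using the capacities cited above, a back-of-the-envelope check rather than a number from the company’s own planning documents:

\[
\frac{1{,}000\ \text{MW}}{15\ \text{MW per borehole reactor}} \approx 66.7 \;\Longrightarrow\; 67\ \text{reactors}
\]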
The buried reactor isn’t the only thing that sets Deep Fission apart. The company has a sister firm, Deep Isolation, that proposes burying spent nuclear fuel in boreholes. In April, the two startups officially partnered in a deal that “enables Deep Fission to offer an end-to-end solution that includes both energy generation and long-term waste management.”
In theory, that combination could offer the company a greater social license among environmental skeptics who take issue with the waste generated from a nuclear plant.
In 1982, Congress passed a landmark law making the federal government responsible for the disposal of all spent fuel and high-level radioactive waste in the country. The plan centered on building a giant repository to permanently entomb the material where it could remain undisturbed for thousands of years. The law designated Yucca Mountain, a rural site in southwestern Nevada near the California border, as the exclusive location for the debut repository.
Construction took years to start. After initial work got underway during the Bush administration, Obama took office and promptly slashed all funding for the effort, which was opposed by then-Senate Majority Leader Harry Reid of Nevada; the nonpartisan Government Accountability Office clocked the move as a purely political decision. Regardless of the motivation, the cancellation threw the U.S. waste disposal strategy into limbo, because the law requires the federal government to complete Yucca Mountain before moving on to other potential storage sites. Until that law changes, the search for a permanent solution to nuclear waste remains stalled, with virtually all the spent fuel accumulated over the years kept in intermediate storage vessels on site at power plants.
Finland finished work on the world’s first such repository in 2024. Sweden and Canada are considering similar facilities. But in the U.S., the industry is moving beyond seeing its spent fuel as waste, as more companies look to start up a recycling industry akin to those in Russia, Japan, and France to reprocess old uranium into new pellets for new reactors. President Donald Trump has backed the effort. The energy still stored in nuclear waste just in this country is sufficient to power the U.S. for more than a century.
Even if Americans want an answer to the nuclear waste problem, there isn’t much evidence to suggest they want to see the material stored near their homes. New Mexico, for example, passed a law barring construction of an intermediate storage site in 2023. Texas attempted to do the same, but the Supreme Court found the state’s legislation to be in violation of federal jurisdiction over waste.
Deep Fission’s reactors would be “so far removed from the biosphere” that the company seems to think the NRC will just “hand out licenses and the public won’t worry,” said Nick Touran, a veteran engineer whose consultancy, What Is Nuclear, catalogs reactor designs and documents from the industry’s history. But “the assumption that it’ll be easy and cheap to site and license this kind of facility is going to be found to be mistaken,” he told me.
The problem with nuclear power isn’t the technology, Brett Rampal, a nuclear expert at the consultancy Veriten, told me. “Nuclear has not been suffering from a technological issue. The technology works great. People do amazing things with it, from curing cancer to all kinds of almost magical energy production,” he said. “What we need is business models and deployment models.”
Digging a 30-inch borehole a mile deep would be expensive enough, but Rampal also pointed out that lining those shafts with nuclear-grade steel and equipping them with cables would likely pencil out to a higher price than building an AP1000 — but with one one-hundredth of the power output.
Deep Fission insists that isn’t the case, and that the natural geology “removes the need for complex, costly pressure vessels and large engineered structures” on the surface.
“We still use steel and engineered components where necessary, but the total material requirements are a fraction of those used in a traditional large-scale plant,” Frader said.
Ultimately, burying reactors is about quieting concerns that should be debunked head on, Emmet Penney, a historian of the industry and a senior fellow at the Foundation for American Innovation, a right-leaning think tank that advocates building more reactors in the U.S., told me.
“Investors need to wake up and realize that nuclear is one of the safest power sources on the planet,” Penney said. “Otherwise, goofy companies will continue to snow them with slick slide decks about solving non-issues.”
On energy efficiency rules, Chinese nuclear, and Japan’s first offshore wind
Current conditions: Warm air headed northward up the East Coast is set to collide with cold air headed southward over the Great Lakes and Northeast, bringing snowfall followed by higher temperatures later in the week • A cold front is stirring up a dense fog in northwest India • Unusually frigid Arctic air in Europe is causing temperatures across northwest Africa to plunge to double-digit degrees below seasonal norms, with Algiers at just over 50 degrees Fahrenheit this week.

Oil prices largely fell throughout 2025, closing out December at their lowest level of the year. Spot market prices for Brent crude, the leading global benchmark for oil, dropped to $63 per barrel last month. The reason, according to the latest analysis of the full year by the Energy Information Administration, is oversupply in the market. China’s push to fill its storage tanks kept prices from declining further. Israel’s June 13 strikes on Iran and attacks on oil infrastructure between Russia and Ukraine briefly pushed prices up at points during the year. But the year-end average price still came in at $69 per barrel, the lowest since 2020, even when adjusted for inflation.

The price drop bodes poorly for reviving Venezuela’s oil industry in the wake of the U.S. raid on Caracas and arrest of the South American country’s President Nicolás Maduro. At such low levels, investments in new infrastructure are difficult to justify. “This is a moment where there’s oversupply,” oil analyst Rory Johnston told my colleague Matthew Zeitlin yesterday. “Prices are down. It’s not the moment that you’re like, I’m going to go on a lark and invest in Venezuela.”
The Energy Department awarded a Texas company known for recycling defunct tools from oil and gas drilling an $11.5 million grant to fund an expansion of its existing facility in a rural county between San Antonio and Dallas. The company, Amermin, said the funding will allow it to increase its output of tungsten carbide by 300%, “reducing our reliance on foreign nations like China, which produces 83%” of the world’s supply of the metal used in all kinds of defense, energy, and hardware applications. “Our country cannot afford to rely on our adversaries for the resources that power our energy industry,” Representative August Pfluger, a Texas Republican, said in a statement. “This investment strengthens our district’s role in American energy leadership while providing good paying jobs to Texas families.”
That wasn’t the agency’s only big funding announcement. The Energy Department gave out $2.7 billion in contracts for enriched uranium, with $900 million each to Maryland-based Centrus Energy, the French producer Orano, and the California-headquartered General Matter. “President Trump is catalyzing a resurgence in the nation’s nuclear energy sector to strengthen American security and prosperity,” Secretary of Energy Chris Wright said in a press release. “Today’s awards show that this Administration is committed to restoring a secure domestic nuclear fuel supply chain capable of producing the nuclear fuels needed to power the reactors of today and the advanced reactors of tomorrow.”
Low-income households in the United States pay roughly 30% more for energy per square foot than households who haven’t faced trouble paying for electricity and heat in the past, federal data shows. Part of the problem is that the national efficiency standards for one of the most affordable types of housing in the nation, manufactured homes, haven’t been updated since 1994. Congress finally passed a law in 2007 directing the Department of Energy to raise standards for insulation, and in 2022, the Biden administration proposed new rules to increase insulation and reduce air leaks. But the regulations had yet to take effect when President Donald Trump returned to office last year. Now the House of Representatives is prepared to vote on legislation to nullify the rules outright, preserving the standards set more than three decades ago. The House Committee on Rules is set to vote on advancing the bill as early as Tuesday night, with a full floor vote likely later in the week. “You’re just locking in higher bills for years to come if you give manufacturers this green light to build the homes with minimal insulation,” Mark Kresowik, senior policy director of the American Council for an Energy-Efficient Economy, told me.
The newest reactor at the Zhangzhou nuclear station in Fujian Province has officially started commercial operation as China’s buildout of new atomic power infrastructure picks up pace this year. The 1,136-megawatt Hualong One represents China’s leading indigenous reactor design. Where once Beijing preferred the top U.S. technology for large-scale reactors, the Westinghouse AP1000, the Hualong One, with its entirely domestic supply chain and a design that borrows from the American standard, has made China’s own model the new leader.
In a sign of just how many reactors China is building — at least 35 underway nationwide, as I noted in yesterday’s newsletter — the country started construction on two more the same week the latest Hualong One came online. World Nuclear News reported that first concrete has been poured for a pair of CAP1000 reactors, the official Chinese version of the Westinghouse AP1000, at two separate plants in southern China.
Back in October, when Japan elected Sanae Takaichi as its first female prime minister, I told you about how the arch-conservative leader of the Liberal Democratic Party planned to refocus the country’s energy plans on reviving the nuclear industry. But don’t count out offshore wind. Unlike Europe’s North Sea or the American East Coast, the seafloor off Japan drops away sharply, making it impossible to root giant turbines to the bottom along much of the country’s shoreline. But the Goto Floating Wind Farm — employing floating technology under consideration on the U.S. West Coast, too — announced the start of commercial operations this week, pumping nearly 17 megawatts of power onto the Japanese grid. Japanese officials last year raised the country’s goal for installed capacity of offshore wind to 10 gigawatts by 2030 and 45 gigawatts by 2040, Power magazine noted, so the industry still has a long way to go.
Beavers may be the trick to healing nature’s burn scars after a wildfire. A team of scientists at the U.S. Forest Service and Colorado State University is building fake beaver dams in scorched areas to study how the wetlands created by the dams affect the restoration of the ecosystem and water quality after a blaze. “It’s kind of a brave new world for us with this type of work,” Tim Fegel, a doctoral candidate at Colorado State who led the research, said in a press release.
Rob talks about the removal of Venezuela’s Nicolás Maduro with Commodity Context’s Rory Johnston.
Over the weekend, the U.S. military entered Venezuela and captured its president, Nicolás Maduro, and his wife. Maduro will now face drug and gun charges in New York, and some members of the Trump administration have described the operation as a law enforcement mission.
President Donald Trump has taken a different tack. He has justified the operation by asserting that America is going to “take over” Venezuela’s oil reserves, even suggesting that oil companies might foot the bill for the broader occupation and rebuilding effort. Trump officials have told oil companies that the U.S. might not help them recover lost assets unless they fund the American effort now, according to Politico.
Such a move seems openly imperialistic, ill-advised, and unethical — to say the least. But is it even possible? On this week’s episode of Shift Key, Rob talks to Rory Johnston, a Toronto-based oil markets analyst and the founder of Commodity Context. They discuss the current status of the Venezuelan oil industry, what a rebuilding effort would cost, and whether a reopened Venezuelan oil industry could change U.S. energy politics — or even, as some fear, bring about a new age of cheap fossil fuels.
Shift Key is hosted by Robinson Meyer, the founding executive editor of Heatmap, and Jesse Jenkins, a professor of energy systems engineering at Princeton University. Jesse is off this week.
Subscribe to “Shift Key” and find this episode on Apple Podcasts, Spotify, Amazon, or wherever you get your podcasts.
You can also add the show’s RSS feed to your podcast app to follow us directly.
Here is an excerpt from our conversation:
Robinson Meyer: First of all, does Venezuela have the world’s largest hydrocarbon reserves — like, proven hydrocarbon reserves? And number two, let’s say that Trump has made some backdoor deal with the existing regime, and that these existing issues are ironed out, to actually use those reserves. What kind of investment are we talking about on that end?
Rory Johnston: The mucky answer to this largest reserve question is, there’s lots of debate. I will say there’s a reasonable claim that at one point Venezuela — Venezuela has a lot of oil. Let’s just say it that way: Venezuela has a lot of oil, particularly the Orinoco Belt, which, again, similar to the oil sands we’re talking about —
Meyer: This is the Orinoco flow. We’re going to call this the Orinoco flow question.
Johnston: Yeah, exactly, that. Similar to the Canadian oil sands, we’re talking about more than a trillion barrels of oil in place, the actual resource in the ground. But then from there you get to this question of what is technically recoverable. Then from there, what is economically recoverable? The explosion in, again, both Venezuelan and Canadian reserve estimates occurred during that massive boom in oil prices in the mid-2000s. And that created the justification for booking those as reserves rather than just resources.
So I think that there is ample — in the same way, like, Russia and the United States don’t actually have super impressive-looking reserves on paper, but they do a lot with them, and I think in actuality that matters a lot more than the amount of technical reserves you have in the ground. Because as we’ve seen, Venezuela hasn’t been able to do much with those reserves.
So in terms of how to actually get that operating, this is where we get back to it: we’re talking tens, hundreds of billions of dollars, and a lot of time. And these companies are not going to do that without seeing a track record from whatever government replaces the current one. The current vice president is now the acting president and, I should also note, is both vice president and oil minister, which I think is particularly relevant here. So I think there’s lots that needs to happen. But companies are not going to trip over themselves to expose themselves to this risk. We still don’t know what the future is going to look like for Venezuela.
Mentioned:
The 4 Things Standing Between the U.S. and Venezuela’s Oil
Trump admin sends tough private message to oil companies on Venezuela
Previously on Shift Key: The Trump Policy That Would Be Really Bad for Oil Companies
This episode of Shift Key is sponsored by …
Heatmap Pro brings all of our research, reporting, and insights down to the local level. The software platform tracks all local opposition to clean energy and data centers, forecasts community sentiment, and guides data-driven engagement campaigns. Book a demo today to see the premier intelligence platform for project permitting and community engagement.
Music for Shift Key is by Adam Kromelow.