Over a dozen methane satellites are now circling the Earth — and more are on the way.

On Monday afternoon, a satellite the size of a washing machine hitched a ride on a SpaceX rocket and was launched into orbit. MethaneSAT, as the new satellite is called, is the latest to join more than a dozen other instruments currently circling the Earth monitoring emissions of the ultra-powerful greenhouse gas methane. But it won’t be the last. Over the next several months, at least two additional methane-detecting satellites from the U.S. and Japan are scheduled to join the fleet.
There’s a joke among scientists that there are so many methane-detecting satellites in space that they are reducing global warming — not just by providing essential data about emissions, but by blocking radiation from the sun.
So why do we keep launching more?
Despite the small army of probes in orbit, and an increasingly large fleet of methane-detecting planes and drones closer to the ground, our ability to identify where methane is leaking into the atmosphere is still far too limited. Like carbon dioxide, sources of methane around the world are numerous and diffuse. They can be natural, like wetlands and oceans, or man-made, like decomposing manure on farms, rotting waste in landfills, and leaks from oil and gas operations.
Scientists say MethaneSAT will help answer big, unanswered questions about methane: which sources are driving the most emissions, and, consequently, how best to tackle climate change. But even then, some say we’ll need to launch still more instruments into space to really get to the bottom of it all.
Measuring methane from space only began in 2009, with the launch of the Greenhouse Gases Observing Satellite, or GOSAT, by the Japan Aerospace Exploration Agency. Previously, most of the world’s methane detectors were on the ground in North America. GOSAT enabled scientists to develop a more geographically diverse understanding of the major sources of methane in the atmosphere.
Soon after, the Environmental Defense Fund, which led the development of MethaneSAT, began campaigning for better data on methane emissions. Through its own on-the-ground measurements, the group discovered that the Environmental Protection Agency substantially underestimated leaks from U.S. oil and gas operations. EDF took this as a call to action. Because methane has such a strong warming effect but breaks down after about a decade in the atmosphere, curbing methane emissions can slow warming in the near term.
“Some call it the low hanging fruit,” Steven Hamburg, the chief scientist at EDF leading the MethaneSAT project, said during a press conference on Friday. “I like to call it the fruit lying on the ground. We can really reduce those emissions and we can do it rapidly and see the benefits.”
But in order to do that, we need a much better picture than what GOSAT or other satellites like it can provide.
In the years since GOSAT launched, the field of methane monitoring has exploded. Today, there are two broad categories of methane instruments in space. Area flux mappers, like GOSAT, take global snapshots. They can show where methane concentrations are generally higher, and even identify exceptionally large leaks — so-called “ultra-emitters.” But the vast majority of leaks, big and small, are invisible to these instruments. Each pixel in a GOSAT image is 10 kilometers wide. Most of the time, there’s no way to zoom into the picture and see which facilities are responsible.

Point source imagers, on the other hand, take much smaller photos that have much finer resolution, with pixel sizes down to just a few meters wide. That means they provide geographically limited data — they have to be programmed to aim their lenses at very specific targets. But within each image is much more actionable data.
For example, GHGSat, a private company based in Canada, operates a constellation of 12 point-source satellites, each one about the size of a microwave oven. Oil and gas companies and government agencies pay GHGSat to help them identify facilities that are leaking. Jean-Francois Gauthier, the director of business development at GHGSat, told me that each image taken by one of their satellites is 12 kilometers wide, but the resolution for each pixel is 25 meters. A snapshot of the Permian Basin, a major oil and gas producing region in Texas, might contain hundreds of oil and gas wells, owned by a multitude of companies, but GHGSat can tell them apart and assign responsibility.
“We’ll see five, 10, 15, 20 different sites emitting at the same time and you can differentiate between them,” said Gauthier. “You can see them very distinctly on the map and be able to say, alright, that’s an unlit flare, and you can tell which company it is, too.” Similarly, GHGSat can look at a sprawling petrochemical complex and identify the exact tank or pipe that has sprung a leak.
But between these extremely wide-angle mappers and the many finely tuned instruments pointing at specific targets, there’s a gap. “It might seem like there’s a lot of instruments in space, but we don’t have the kind of coverage that we need yet, believe it or not,” Andrew Thorpe, a research technologist at NASA’s Jet Propulsion Laboratory, told me. He has been working with the nonprofit Carbon Mapper on a new constellation of point source imagers, the first of which is supposed to launch later this year.
The reason we don’t have enough coverage has to do with the size of the existing images, their resolution, and the amount of time it takes to get them. One of the challenges, Thorpe said, is that it’s very hard to get a continuous picture of any given leak. Oil and gas equipment can spring leaks at random, and those leaks can be continuous or intermittent. If you’re only getting a snapshot every few weeks, you may not be able to tell how long a leak lasted, or you might miss a short but significant plume. Meanwhile, oil and gas fields are also changing on a weekly basis, Joost de Gouw, an atmospheric chemist at the University of Colorado, Boulder, told me. New wells are being drilled in new places that those point-source imagers may not be looking at.
“There’s a lot of potential to miss emissions because we’re not looking,” he said. “If you combine that with clouds — clouds can obscure a lot of our observations — there are still going to be a lot of times when we’re not actually seeing the methane emissions.”
De Gouw hopes MethaneSAT will help resolve one of the big debates about methane leaks. Between the millions of sites that release small amounts of methane all the time, and the handful of sites that exhale massive plumes infrequently, which is worse? What fraction of the total do those bigger emitters represent?
Paul Palmer, a professor at the University of Edinburgh who studies the Earth’s atmospheric composition, is hopeful that it will help pull together a more comprehensive picture of what’s driving changes in the atmosphere. Around the turn of the century, methane levels pretty much leveled off, he said. But then, around 2007, they started to grow again, and have since accelerated. Scientists have reached different conclusions about why.
“There’s lots of controversy about what the big drivers are,” Palmer told me. Some think it’s related to oil and gas production increasing. Others — and he’s in this camp — think it’s related to warming wetlands. “Anything that helps us would be great.”
MethaneSAT sits somewhere between the global mappers and point source imagers. It will take larger images than GHGSat, each one 200 kilometers wide, which means it will be able to cover more ground in a single day. Those images will also contain finer detail about leaks than GOSAT, but they won’t necessarily be able to identify exactly which facilities the smaller leaks are coming from. Also, unlike with GHGSat, MethaneSAT’s data will be freely available to the public.
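Some rough arithmetic, using only the figures quoted in this article, shows how wide the gulf between the tiers is. (This is an illustrative sketch based on the article’s round numbers, not mission specifications.)

```python
# Rough comparison of the two satellite tiers described above.
# Figures from the article: GOSAT pixels are about 10 km wide;
# GHGSat scenes are 12 km wide with 25-meter pixels.
gosat_pixel_m = 10_000   # area flux mapper pixel width, meters
ghgsat_pixel_m = 25      # point source imager pixel width, meters
ghgsat_scene_m = 12_000  # width of one GHGSat scene, meters

# Pixels across a single GHGSat scene
pixels_across_scene = ghgsat_scene_m // ghgsat_pixel_m

# How many GHGSat pixels cover the same ground as one GOSAT pixel
pixels_per_gosat_pixel = (gosat_pixel_m // ghgsat_pixel_m) ** 2

print(f"{pixels_across_scene} pixels across one GHGSat scene")      # 480
print(f"{pixels_per_gosat_pixel:,} GHGSat pixels per GOSAT pixel")  # 160,000
```

In other words, a facility that fills hundreds of GHGSat pixels can vanish inside a single GOSAT pixel, which is why the area flux mappers can flag a region but rarely a culprit.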
EDF, which raised $88 million for the project and spent nearly a decade working on it, says that one of MethaneSAT’s main strengths will be to provide much more accurate basin-level emissions estimates. That means it will enable researchers to track the emissions of the entire Permian Basin over time, and compare it with other oil and gas fields in the U.S. and abroad. Many countries and companies are making pledges to reduce their emissions, and MethaneSAT will provide data on a relevant scale that can help track progress, Maryann Sargent, a senior project scientist at Harvard University who has been working with EDF on MethaneSAT, told me.

It could also help the Environmental Protection Agency understand whether its new methane regulations are working. It could help with the development of new standards for natural gas imported into Europe. At the very least, it will help oil and gas buyers differentiate between products associated with higher or lower methane intensities. It will also enable fossil fuel companies that measure their own methane emissions to compare their performance to regional averages.
MethaneSAT won’t be able to look at every source of methane emissions around the world. The project is limited by how much data it can send back to Earth, so it has to be strategic. Sargent said they are limiting data collection to 30 targets per day, and in the near term, those will mostly be oil and gas producing regions. They aim to map emissions from 80% of global oil and gas production in the first year. The outcome could be revolutionary.
“We can look at the entire sector with high precision and track those emissions, quantify them and track them over time. That’s a first for empirical data for any sector, for any greenhouse gas, full stop,” Hamburg told reporters on Friday.
But this still won’t be enough, said Thorpe of NASA. He wants to see the next generation of instruments start to look more closely at natural sources of emissions, like wetlands. “These types of emissions are really, really important and very poorly understood,” he said. “So I think there’s a heck of a lot of potential to work towards the sectors that have been really hard to do with current technologies.”
The surge in electricity demand from data centers is making innovation a necessity.
Electric utilities aren’t exactly known as innovators. Until recently, that caution seemed perfectly logical — arguably even preferable. If the entity responsible for keeping the lights on and critical services running decides to try out some shiny new tech that fails, heating, cooling, medical equipment, and emergency systems will all trip offline. People could die.
“It’s a very conservative culture for all the right reasons,” Pradeep Tagare, a vice president at the utility National Grid and the head of its corporate venture fund, National Grid Partners, told me. “You really can’t follow the Silicon Valley mantra of move fast, break things. You are not allowed to break things, period.”
But with artificial intelligence-driven load growth booming, customer bills climbing, and the interconnection queue stubbornly backlogged, utilities now face little choice but to do things differently. The West Coast’s Pacific Gas and Electric Company now has a dedicated grid-innovation team of about 60 people; North Carolina-based utility Duke Energy operates an emerging technologies office; and National Grid, which serves U.S. customers in the Northeast, has invested in about 50 startups to date. Some 64% of utilities have expanded their innovation budgets in the past year, according to research by NGP, while 42% reported working with startups in some capacity.
The innovators on these teams are well aware that their reputation precedes them when it comes to bringing novel tech to market — and not in a flattering way. “I think historically we’ve done a poor job partnering with too many companies and spreading ourselves thin,” Quinn Nakayama, the senior director of grid research, innovation, and development at PG&E, told me. That’s led to a pattern known as “death by pilot,” in which utilities trial many promising solutions but are too risk-averse, cost-conscious, and slow-moving to deploy them, leaving the companies with no natural customers.
It doesn’t help that regulators such as public utilities commissions understandably require new investments to meet a strict “prudency” standard, proving that they can achieve the desired result at the lowest reasonable cost consistent with good practices. Yet this can be a high bar for tech that’s yet untested at scale. And because investor-owned utilities earn a guaranteed rate of return on approved infrastructure investments, they’re incentivized to pursue capital-intensive projects over smaller efficiency improvements. Freedom from the pressure of a competitive market has also traditionally meant freedom from the pressure to innovate.
But that’s changing.
To help bridge at least some of these divides, NGP set up a business development unit specifically for startups. “Their sole job is to work with our portfolio companies, work with our business units, and make sure that these things get deployed,” Tagare told me. Over 80% of the firm’s portfolio companies, he said, now have tie-ups of some sort with National Grid — be that a pilot or a long-term deployment — while “many” have secured multi-million dollar contracts with the utility.
While Tagare said that NGP is already reaping the benefits from investments in AI to streamline internal operations and improve critical services, hardware is slower to get to market. The startups in this category run the gamut from immediately deployable technologies to those still five or more years from commercialization. LineVision, a startup operating across parts of National Grid’s service territories in upstate New York and the U.K., is a prime example of the former. Its systems monitor the capacity of transmission lines in real-time via sensors and environmental data analytics, thus allowing utilities to safely push 20% to 30% more power through the wires as conditions permit.
There’s also TS Conductor, a materials science startup that’s developed a novel conductor wire with a lightweight carbon core and aluminum coating that can double or triple a line’s capacity without building new towers and poles. It’s a few years from achieving the technical and safety validation necessary to become an approved supplier for National Grid. Then, five or more years down the line, NGP hopes to be able to deploy the startup Veir’s superconductors, which promise to boost transmission capacity five- to tenfold with materials that carry electricity with virtually zero resistance. But because this requires cooling the lines to cryogenic temperatures, along with the bulky insulation and cooling systems needed to do so, it necessitates a major infrastructure overhaul.
PG&E, for its part, is pursuing similar efficiency goals as it trials tech from startups including Heimdall Power and Smart Wires, which aim to squeeze more power out of the utility’s existing assets. But because the utility operates in California — the U.S. leader in EV adoption, with strong incentives for all types of home electrification — it’s also focused on solutions at the grid edge, where the distribution network meets customer-side assets like smart meters and EV charging infrastructure.
For example, the utility has a partnership with smart electric panel maker Span, which allows customers to adopt electric appliances such as heat pumps and EV chargers without the need for expensive electrical upgrades. Span’s device connects directly to a home’s existing electric panel, enabling PG&E to monitor and adjust electricity use in real time to prevent the panel from overloading while letting customers determine what devices to prioritize powering. Another partnership with smart infrastructure company Itron has similar aims — allowing customers to get EV fast chargers without a panel upgrade, with the company’s smart meters automatically adjusting charging speed based on panel limits and local grid conditions.
Of course, it’s natural to question how motivated investor-owned utilities really are to deploy this type of efficiency tech — after all, the likes of PG&E and National Grid make money by undertaking large infrastructure projects, not by finding clever ways of avoiding them. And while neither Nakayama nor Tagare denies what appears to be a fundamental misalignment of incentives, both argue that there’s so much infrastructure investment needed — more than they can handle — that the friction is a non-issue.
“We have capital coming out of our ears,” Nakayama told me. Given that, he said, PG&E’s job is to accelerate interconnection for all types of loads, which will bring in revenue to offset the cost of the upgrades and thus lower customer rates. Tagare agreed.
“At least for the next — pick a number, five, seven, 10 years — I don’t see any of this slowing down,” he said.
And yet despite all that capital flow, PG&E still carries billions of dollars in wildfire-related financial obligations after its faulty equipment was found liable for sparking a number of blazes in Northern California in 2017 and 2018. The resulting legal claims drove the utility into bankruptcy in 2019, before it restructured and reemerged the following year. But the threat of wildfires in its service territory still looms large, which Nakayama said limits the company’s ability to allocate funds toward the basic poles and wires upgrades that are so crucial for easing the congested interconnection queue and bringing new load online.
Nakayama wants California’s legislature and courts to revise rules that make utilities strictly liable for wildfires caused by their equipment, even when all safety and mitigation procedures were followed. “In order for me to feel comfortable moving some of my investments out of wildfire into other areas of our business in a more accelerated fashion, I have to know that if I make the prudent investments for wildfire risk mitigation, I’m not going to be held liable for everything in my system,” he told me.
And while wildfire prevention itself is an area rich with technical innovation and a central focus of the utility’s startup ecosystem, Nakayama emphasizes that PG&E has a host of additional priorities to consider. “We need [virtual power plants]. We need new technologies. We need new investments. We need new capital. We need new wildfire-related liability,” he told me.
Utilities — especially his — rarely get seen as the good guys in this story. “I know that PGE gets vilified a lot,” Nakayama acknowledged. But he and his colleagues are “almost desperate to try to figure out how to bring down rates,” he promised.
Current conditions: The Central United States is facing this year’s largest outbreak of severe weather so far, with intense thunderstorms set to hit an area stretching from Texas to the Great Lakes for the next four days • Northern India is sweltering in temperatures as high as 13 degrees Celsius above historical norms • Australia issued evacuation alerts for parts of Queensland as floodwaters inundate dozens of roads.
The price of futures contracts for crude oil fell below $85 per barrel Monday after President Donald Trump called the war against Iran “very complete, pretty much,” declaring that there was “nothing left in a military sense” in the country. “They have no navy, no communications, they’ve got no air force. Their missiles are down to a scatter. Their drones are being blown up all over the place, including their manufacturing of drones,” Trump told CBS News in a phone interview Monday. “If you look, they have nothing left.”
The dip, just a day after prices surged well past $100 per barrel, highlights what Heatmap’s Matthew Zeitlin described as the challenge of depending too much on fossil fuels for a payday. “Even $85 is substantially higher than the $57 per barrel price from the end of last year. At that point, forecasters from both the public and the private sectors were expecting oil to stick around $60 a barrel through 2026,” he wrote. “Of course, crude oil itself is not something any consumer buys — but those high prices would likely feed through to higher consumer prices throughout the U.S. economy.”

The global wind industry set a record last year, adding 169 gigawatts of turbines throughout 2025, according to the latest analysis from the consultancy BloombergNEF. The 38% surge compared to 2024 came as the momentum in the sector shifted to Asia. Chinese companies made up eight of the top 10 global wind turbine suppliers, the report found, as domestic installations in the People’s Republic reached an all-time high. India, meanwhile, edged out the U.S. and Germany as the world’s second largest market after China.

Of all global wind additions, 161 gigawatts, or 95%, were onshore turbines, mostly spurred on by the domestic boom in China. Not only did that same building blitz help Beijing-based Goldwind hold onto its top spot as the world’s leading turbine supplier, it vaulted Chinese manufacturers into the next five slots in the global ranking. “Thanks to stable long-term policy support, wind installations over the past decade have become increasingly concentrated in mainland China,” Cristian Dinca, wind associate at BloombergNEF and lead author of the report, said in a statement. “Chinese manufacturers consistently top the global rankings. They benefitted particularly in 2025, as companies and provinces rushed to commission projects ahead of power market reforms and to meet targets set out in the Five Year Plan.”
Like in solar and batteries, the domestic boom in China is starting to spill over abroad. As Matthew wrote last year, Chinese manufacturers are making a big push into the European market.
Arizona’s utility regulator has repealed rules requiring electricity providers to generate at least 15% of their energy from renewables. Citing “dramatic” changes to the renewable energy landscape, the Arizona Corporation Commission said the cost to ratepayers of the rules adopted two decades ago was no longer justifiable, Utility Dive reported Monday. Since the rules first took effect in 2006, the utilities Arizona Public Services, Tucson Electric Power, and UniSource Energy Services “have collected more than $2.3 billion” in “surcharges from all customer classes to meet these mandates,” the regulator said in a press release following the March 4 ruling. “The mandates are no longer needed and the costs are no longer justified.”
Reflect Orbital wants to launch 50,000 giant mirrors into space to bounce sunlight to the night side of the planet to power solar farms after sunset, provide lighting to rescue workers, and light city streets. Now, The New York Times reported Monday, the Hawthorne, California-based startup is asking the Federal Communications Commission for permission to send its first prototype satellite into space with a 60-foot-wide mirror. The company, which has raised more than $28 million from investors, could launch its test project as early as this summer. The public comment period on the FCC application closed yesterday. “We’re trying to build something that could replace fossil fuels and really power everything,” Ben Nowack, Reflect Orbital’s chief executive, told the newspaper.
It’s emblematic of the kind of audacious climate interventions on which investors are increasingly gambling. Last fall, Heatmap’s Robinson Meyer broke news that Stardust Solutions, a startup promising to artificially cool the planet by spraying aerosols into the atmosphere that reflect the sun’s light back into space, had raised $60 million to commercialize its technology. In December, Heatmap’s Katie Brigham had a scoop on the startup Overview Energy raising $20 million to build panels in space and beam solar power back down to Earth.
Emerald AI is a startup whose software Katie wrote last year “could save the grid” by helping data centers ramp electricity usage up and down like a smart thermostat, allowing more computing power to come online on the existing grid. InfraPartners is a company that designs, manufactures, and deploys prefabricated, modular data center parts. You don’t need to be an expert in the data center industry’s energy problems to hear the wedding bells ringing. On Tuesday, the two companies announced a deal to partner on what they’re calling “flex-ready data centers,” a version of InfraPartners’ off-the-shelf computing hardware that comes equipped with Emerald AI’s software. “Building more infrastructure the way we have historically will not be fast enough. We need to make the infrastructure we have more intelligent by leveraging AI,” Bal Aujla, InfraPartners’ director of advanced research and engineering, said in a statement. “This partnership will turn data centers from grid constraints into grid partners and unlock more usable capacity from existing infrastructure. The result will be enhanced AI deployment without compromising reliability or sustainability.” Rather than rush to invest in big new power plants, Emerald AI chief scientist Ayse Coskun said making data centers flexible means “we can prudently expand our grid.”
War in Iran may be halting shipments of oil and liquefied natural gas out of the Persian Gulf. But that isn’t stopping Chinese clean energy manufacturers from preparing to send shipments toward the war-torn region. Despite the conflict, the Jiangsu-based Shuangliang announced last week that it had delivered 80 megawatts of electrolyzers to a Chinese port for shipment to a 300-megawatt green hydrogen and ammonia plant in the special economic zone in Duqm, Oman. I know what you’re going to say: Oman’s status as the region’s Switzerland — a diplomatic powerhouse with a modern history of strategic neutrality in even the most heated geopolitical conflicts — means it isn’t a target for Iranian missiles. And there’s no guarantee the shipment will head there immediately. But it’s a sign of how determined China’s electrolyzer industry is to sell its hardware overseas amid inklings of a domestic slowdown.
Topsy-turvy oil prices aren’t great for the U.S.
Oil prices are all over the place as markets reopened this week, climbing as high as $120 a barrel before crashing to around $85 after Donald Trump told CBS News that the war with Iran “is very complete, pretty much,” and that he was “thinking about taking it over,” referring to the Strait of Hormuz, the artery through which about a third of the world’s traded oil flows.
Even $85 is substantially higher than the $57 per barrel price from the end of last year. At that point, forecasters from both the public and the private sectors were expecting oil to stick around $60 a barrel through 2026.
Of course, crude oil itself is not something any consumer buys — but those high prices would likely feed through to higher consumer prices throughout the U.S. economy. That includes the price of gasoline, which has risen by about $0.50 a gallon in the past month, according to AAA, as well as jet fuel, which will mean increased travel costs. “Book your airfares now if they haven’t moved already,” Skanda Amarnath, the executive director of the economic policy think tank Employ America, told me.
High oil prices also raise the price of goods and services not directly linked to oil prices — groceries, for instance. “The cost of food, especially at the grocery store, is a function of the cost of diesel,” which fuels the trucks that get food to shelves, Amarnath told me. Diesel prices have risen even more than gasoline in the past week, by over $0.85 a gallon.
“We’ll see how long these prices stay elevated, how they feed their way through the supply chain and the value chain. But it’s clearly the case that it is a pretty adverse situation for both businesses and consumers.”
The oil market is going through one of the largest physical shocks in its modern history. Bloomberg’s Javier Blas estimates that of the 15 million barrels per day that regularly flow through the Strait of Hormuz, only about a third is getting through to the global market, whether through the strait itself or by alternative routes, such as the pipeline from Saudi Arabia’s eastern oil fields to the Red Sea.
Global daily oil production is just above 100 million barrels per day, meaning that around 10% of the oil supply on the market is stuck behind an effective blockade.
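The back-of-the-envelope math behind that 10% figure is straightforward. (A sketch using the round numbers quoted above, not a market forecast.)

```python
# Supply math from the figures above: ~15 million barrels/day normally
# transit the Strait of Hormuz, roughly a third is still getting through,
# and global production is just over 100 million barrels/day.
hormuz_flow = 15.0      # million barrels/day through the strait, normally
share_through = 1 / 3   # estimated fraction still reaching markets
global_supply = 100.0   # approximate global production, million barrels/day

stranded = hormuz_flow * (1 - share_through)  # million barrels/day stuck
share_of_market = stranded / global_supply

print(f"Stranded supply: ~{stranded:.0f} million barrels/day")  # ~10
print(f"Share of global supply: ~{share_of_market:.0%}")        # ~10%
```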
“The world is suddenly ‘short’ a volume that, in normal times, would dwarf almost any supply/demand imbalance we debate,” Morgan Stanley oil analyst Martijn Rats wrote in a note to clients on Sunday.
The fact that the U.S. is itself a leading producer and exporter of oil will only provide so much relief. Private sector economists have estimated that every $10 increase in the price of oil reduces economic growth somewhere between 0.1 and 0.2 percentage points.
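Applied to this year’s price move, that rule of thumb implies a meaningful drag on growth. (An illustrative calculation using the article’s figures; the per-$10 sensitivity is the private-sector estimate quoted above, not a precise model.)

```python
# Growth-impact rule of thumb from the article: every $10/barrel increase
# in oil shaves an estimated 0.1 to 0.2 percentage points off growth.
price_last_year = 57  # $/barrel at the end of last year
price_now = 85        # $/barrel after Monday's dip

increase = price_now - price_last_year  # $28/barrel
drag_low = increase / 10 * 0.1          # percentage points, low estimate
drag_high = increase / 10 * 0.2         # percentage points, high estimate

print(f"Implied growth drag: {drag_low:.2f}-{drag_high:.2f} percentage points")
# Implied growth drag: 0.28-0.56 percentage points
```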
“Petroleum product prices here in the U.S. tend to reflect global market conditions, so the price at the pump for gasoline and diesel reflect what’s going on with global prices,” Ben Cahill, a senior associate at the Center for Strategic and International Studies (CSIS), told me. “What happens in the rest of the world still has a deep impact on U.S. energy prices.”
To the extent the U.S. economy benefits from its export capacity, the effects are likely localized to areas where oil production and export takes place, such as Texas and Louisiana. For the economy as a whole, higher oil prices will improve the “terms of trade,” essentially a measure of the value of imports a certain quantity of exports can “buy,” Ryan Cummings, chief of staff at Stanford Institute for Economic Policymaking, told me.
Could the U.S. oil industry ramp up production to capture those high prices and induce some relief?
Oil industry analysts, Heatmap founding executive editor Robinson Meyer, and the TV show Landman have all theorized that there is a “Goldilocks” range of oil prices that are high enough to encourage exploration and production but not so high as to take out the economy as a whole. This range starts at around $60 or $70 on the low end and tops out at around $90 or $95. Above that, the economic damage from high prices would likely outweigh any benefit to drillers from expanded production.
And that’s if production were to expand at all.
“Capital discipline” has been the watchword of the U.S. oil and gas industry in the years since the shale boom, meaning drillers are unlikely to chase price spikes by heedlessly ramping up production, CSIS’ Cahill told me. “I think they’ll be quite cautious about doing that,” he said.