Over a dozen methane satellites are now circling the Earth — and more are on the way.

On Monday afternoon, a satellite the size of a washing machine hitched a ride on a SpaceX rocket and was launched into orbit. MethaneSAT, as the new satellite is called, is the latest to join more than a dozen other instruments currently circling the Earth monitoring emissions of the ultra-powerful greenhouse gas methane. But it won’t be the last. Over the next several months, at least two additional methane-detecting satellites from the U.S. and Japan are scheduled to join the fleet.
There’s a joke among scientists that there are so many methane-detecting satellites in space that they are reducing global warming — not just by providing essential data about emissions, but by blocking radiation from the sun.
So why do we keep launching more?
Despite the small army of probes in orbit, and an increasingly large fleet of methane-detecting planes and drones closer to the ground, our ability to identify where methane is leaking into the atmosphere is still far too limited. Like carbon dioxide, sources of methane around the world are numerous and diffuse. They can be natural, like wetlands and oceans, or man-made, like decomposing manure on farms, rotting waste in landfills, and leaks from oil and gas operations.
There are big, unanswered questions about methane – about which sources are driving the most emissions and, consequently, how best to tackle climate change – that scientists say MethaneSAT will help answer. But even then, some say we’ll need to launch still more instruments into space to really get to the bottom of it all.
Measuring methane from space only began in 2009 with the launch of the Greenhouse Gases Observing Satellite, or GOSAT, by the Japan Aerospace Exploration Agency. Previously, most of the world’s methane detectors were on the ground in North America. GOSAT enabled scientists to develop a more geographically diverse understanding of the major sources of methane in the atmosphere.
Soon after, the Environmental Defense Fund, which led the development of MethaneSAT, began campaigning for better data on methane emissions. Through its own, on-the-ground measurements, the group discovered that the Environmental Protection Agency’s estimates of leaks from U.S. oil and gas operations significantly understated what was actually escaping. EDF took this as a call to action. Because methane has such a strong warming effect, but also breaks down after about a decade in the atmosphere, curbing methane emissions can slow warming in the near term.
“Some call it the low hanging fruit,” Steven Hamburg, the chief scientist at EDF leading the MethaneSAT project, said during a press conference on Friday. “I like to call it the fruit lying on the ground. We can really reduce those emissions and we can do it rapidly and see the benefits.”
But in order to do that, we need a much better picture than what GOSAT or other satellites like it can provide.
In the years since GOSAT launched, the field of methane monitoring has exploded. Today, there are two broad categories of methane instruments in space. Area flux mappers, like GOSAT, take global snapshots. They can show where methane concentrations are generally higher, and even identify exceptionally large leaks — so-called “ultra-emitters.” But the vast majority of leaks, big and small, are invisible to these instruments. Each pixel in a GOSAT image is 10 kilometers wide. Most of the time, there’s no way to zoom into the picture and see which facilities are responsible.

Point source imagers, on the other hand, take much smaller photos that have much finer resolution, with pixel sizes down to just a few meters wide. That means they provide geographically limited data — they have to be programmed to aim their lenses at very specific targets. But within each image is much more actionable data.
For example, GHGSat, a private company based in Canada, operates a constellation of 12 point-source satellites, each one about the size of a microwave oven. Oil and gas companies and government agencies pay GHGSat to help them identify facilities that are leaking. Jean-Francois Gauthier, the director of business development at GHGSat, told me that each image taken by one of their satellites is 12 kilometers wide, but the resolution for each pixel is 25 meters. A snapshot of the Permian Basin, a major oil and gas producing region in Texas, might contain hundreds of oil and gas wells, owned by a multitude of companies, but GHGSat can tell them apart and assign responsibility.
“We’ll see five, 10, 15, 20 different sites emitting at the same time and you can differentiate between them,” said Gauthier. “You can see them very distinctly on the map and be able to say, alright, that’s an unlit flare, and you can tell which company it is, too.” Similarly, GHGSat can look at a sprawling petrochemical complex and identify the exact tank or pipe that has sprung a leak.
But between this extremely wide-angle lens and the many finely tuned instruments pointing at specific targets, there’s a gap. “It might seem like there’s a lot of instruments in space, but we don’t have the kind of coverage that we need yet, believe it or not,” Andrew Thorpe, a research technologist at NASA’s Jet Propulsion Laboratory, told me. He has been working with the nonprofit Carbon Mapper on a new constellation of point source imagers, the first of which is supposed to launch later this year.
The reason we don’t have enough coverage has to do with the size of the existing images, their resolution, and the amount of time it takes to get them. One of the challenges, Thorpe said, is that it’s very hard to get a continuous picture of any given leak. Oil and gas equipment can spring leaks at random, and those leaks can be continuous or intermittent. If you’re just getting a snapshot every few weeks, you may not be able to tell how long a leak lasted, or you might miss a short but significant plume. Meanwhile, oil and gas fields are also changing on a weekly basis, Joost de Gouw, an atmospheric chemist at the University of Colorado, Boulder, told me. New wells are being drilled in new places – places those point-source imagers may not be looking at.
“There’s a lot of potential to miss emissions because we’re not looking,” he said. “If you combine that with clouds — clouds can obscure a lot of our observations — there are still going to be a lot of times when we’re not actually seeing the methane emissions.”
De Gouw hopes MethaneSAT will help resolve one of the big debates about methane leaks. Between the millions of sites that release small amounts of methane all the time, and the handful of sites that exhale massive plumes infrequently, which is worse? What fraction of the total do those bigger emitters represent?
Paul Palmer, a professor at the University of Edinburgh who studies the Earth’s atmospheric composition, is hopeful that it will help pull together a more comprehensive picture of what’s driving changes in the atmosphere. Around the turn of the century, methane levels pretty much leveled off, he said. But then, around 2007, they started to grow again, and have since accelerated. Scientists have reached different conclusions about why.
“There’s lots of controversy about what the big drivers are,” Palmer told me. Some think it’s related to oil and gas production increasing. Others — and he’s in this camp — think it’s related to warming wetlands. “Anything that helps us would be great.”
MethaneSAT sits somewhere between the global mappers and point source imagers. It will take larger images than GHGSat, each one 200 kilometers wide, which means it will be able to cover more ground in a single day. Those images will also contain finer detail about leaks than GOSAT, but they won’t necessarily be able to identify exactly which facilities the smaller leaks are coming from. Also, unlike with GHGSat, MethaneSAT’s data will be freely available to the public.
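The coverage trade-offs described above can be made concrete using the figures quoted in this piece. Here is a minimal sketch – treating each quoted image width as the side of a square footprint, which is a simplification – comparing the three instrument classes:

```python
def pixels_across(image_width_km, pixel_size_km):
    """How many pixels span one image, given both widths in kilometers."""
    return round(image_width_km / pixel_size_km)

def footprint_km2(image_width_km):
    """Area of one image, assuming a square footprint (a simplification)."""
    return image_width_km ** 2

# Figures quoted in the article:
#   GOSAT      – 10 km pixels (area flux mapper, global snapshots)
#   GHGSat     – 12 km images at 25 m (0.025 km) pixels (point source imager)
#   MethaneSAT – 200 km images (pixel size not quoted in the article)
print("GHGSat pixels across one image:", pixels_across(12.0, 0.025))
print("GHGSat footprint:", footprint_km2(12.0), "km^2")
print("MethaneSAT footprint:", footprint_km2(200.0), "km^2")
```

Under those assumptions, one MethaneSAT image covers nearly 280 times the area of a GHGSat image – which is why it can survey whole basins in a single pass while GHGSat must be aimed at specific sites.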
EDF, which raised $88 million for the project and spent nearly a decade working on it, says that one of MethaneSAT’s main strengths will be to provide much more accurate basin-level emissions estimates. That means it will enable researchers to track the emissions of the entire Permian Basin over time, and compare it with other oil and gas fields in the U.S. and abroad. Many countries and companies are making pledges to reduce their emissions, and MethaneSAT will provide data on a relevant scale that can help track progress, Maryann Sargent, a senior project scientist at Harvard University who has been working with EDF on MethaneSAT, told me.

It could also help the Environmental Protection Agency understand whether its new methane regulations are working. It could help with the development of new standards for natural gas being imported into Europe. At the very least, it will help oil and gas buyers differentiate between products associated with higher or lower methane intensities. It will also enable fossil fuel companies who measure their own methane emissions to compare their performance to regional averages.
MethaneSAT won’t be able to look at every source of methane emissions around the world. The project is limited by how much data it can send back to Earth, so it has to be strategic. Sargent said they are limiting data collection to 30 targets per day, and in the near term, those will mostly be oil and gas producing regions. They aim to map emissions from 80% of global oil and gas production in the first year. The outcome could be revolutionary.
“We can look at the entire sector with high precision and track those emissions, quantify them and track them over time. That’s a first for empirical data for any sector, for any greenhouse gas, full stop,” Hamburg told reporters on Friday.
But this still won’t be enough, said Thorpe of NASA. He wants to see the next generation of instruments start to look more closely at natural sources of emissions, like wetlands. “These types of emissions are really, really important and very poorly understood,” he said. “So I think there’s a heck of a lot of potential to work towards the sectors that have been really hard to do with current technologies.”
What happens when one of energy’s oldest bottlenecks meets its newest demand driver?
Often the biggest impediment to building renewable energy projects or data center infrastructure isn’t getting government approvals; it’s overcoming local opposition. When it comes to the transmission that connects energy to the grid, however, companies and politicians of all stripes are used to worrying most about those at the top – the politicians and regulators at every level who can’t seem to get their acts together.
What will happen when the fiery fights on each end of the wire meet the broken, unplanned spaghetti monster of grid development our country struggles with today? Nothing great.
The transmission fights of the data center boom have only just begun. Utilities will have to spend lots of money on getting energy from Point A to Point B – at least $500 billion over the next five years, to be precise. That’s according to a survey of earnings information published by think tank Power Lines on Tuesday, which found roughly half of all utility infrastructure spending will go toward the grid.
But big wires aren’t very popular. When Heatmap polled various types of energy projects last September, we found that self-identified Democrats and Republicans were mostly neutral on large-scale power lines. Independent voters, though? Transmission was their second least preferred technology, ranking below only coal power.
Making matters far more complex, grid planning is spread out across decision-makers. At the regional level, governance is split into 10 areas overseen by regional transmission organizations, known as RTOs, or independent system operators, known as ISOs. RTOs and ISOs plan transmission projects, often proposing infrastructure to keep the grid resilient and functional. These bodies are also tasked with planning the future of their own grids – or at least they are supposed to be; many observers have decried RTOs and ISOs as outmoded and slow to respond. Utilities and electricity co-ops also do this planning at various scales. And each of these bodies must navigate federal regulators and permitting processes, as well as utility commissions for each state they touch, on top of the usual raft of local authorities.
The mid-Atlantic region is overseen by PJM Interconnection, a body now under pressure from state governors in the territory to ensure the data center boom doesn’t unnecessarily drive up costs for consumers. The irony, though, is that these governors are going to be under incredible pressure to have their states act against individual transmission projects in ways that will eventually undercut affordability.
Virginia, for instance – known now as Data Center Alley – is flanked by states that are politically diverse. West Virginia is now a Republican stronghold, but was long a Democratic bastion. Maryland had a Republican governor only a few years ago. Virginia and Pennsylvania regularly change party control. These dynamics are among the many drivers behind the opposition against the Piedmont Reliability Project, which would run from a nuclear plant in Pennsylvania to northern Virginia, cutting across spans of Maryland farmland ripe for land use conflict. The timeline for this project is currently unclear due to administrative delays.
Another major fight is brewing with NextEra’s Mid-Atlantic Resiliency Link, or MARL project. Spanning four states – and therefore four utility commissions – the MARL was approved by PJM Interconnection to meet rising electricity demand across West Virginia, Virginia, Maryland and Pennsylvania. It still requires approval from each state utility commission, however. Potentially affected residents in West Virginia are hopping mad about the project, and state Democratic lawmakers are urging the utility commission to reject it.
In West Virginia, as well as Virginia and Maryland, NextEra has applied for a certificate of public convenience and necessity to build the MARL project, a permit that opponents have claimed would grant it the authority to exercise eminent domain. (NextEra has said it will do what it can to work well with landowners. The company did not respond to a request for comment.)
“The biggest problem facing transmission is that there’s so many problems facing transmission,” said Liza Reed, director of climate and energy at the Niskanen Center, a policy think tank. “You have multiple layers of approval you have to go through for a line that is going to provide broader benefits in reliability and resilience across the system.”
Hyperlocal fracases certainly do matter. Reed explained to me that “often folks who are approving the line at the state or local level are looking at the benefits they’re receiving – and that’s one of the barriers transmission can have.” That is, when one state utility commission looks at a power line project, they’re essentially forced to evaluate the costs and benefits from just a portion of it.
She pointed to the example of a Transource line proposed by PJM almost 10 years ago to send excess capacity from Pennsylvania to Maryland. It wasn’t delayed by protests over the line itself – the Pennsylvania Public Utilities Commission opposed the project because it thought the result would be net higher electricity bills for folks in the Keystone State, despite whatever benefits would come from selling the electricity to Maryland and the consumer savings for their southern neighbors. The lesson: Whoever feels they’re getting the raw end of the line will likely try to block it, and there’s little to nothing anyone else can do about it.
These hyperlocal fears about projects with broader regional benefits can be easy targets for conservation-focused environmental advocates. Not only could they take your land, the argument goes, they’re also branching out to states with dirtier forms of energy that could pollute your air.
“We do need more energy infrastructure to move renewable energy,” said Julie Bolthouse, director of land use for the Virginia conservation group Piedmont Environmental Council, after I asked her why she’s opposing lots of the transmission in Virginia. “This is pulling away from that investment. This is eating up all of our utility funding. All of our money is going to these massive transmission lines to give this incredible amount of power to data centers in Virginia when it could be used to invest in solar, to invest in transmission for renewables we can use. Instead it’s delivering gas and coal from West Virginia and the Ohio River Valley.”
Daniel Palken of Arnold Ventures, who previously worked on major pieces of transmission reform legislation in the U.S. Senate, said when asked if local opposition was a bigger problem than macro permitting issues: “I do not think local opposition is the main thing holding up transmission.”
But then he texted me to clarify. “What’s unique about transmission is that in order for local opposition to even matter, there has to be a functional planning process that gets transmission lines to the starting line. And right now, only about half the country has functional regional planning, and none of the country has functional interregional planning.”
It’s challenging to fathom a solution to such a fragmented, nauseating puzzle. One solution could be in Congress, where climate hawks and transmission reform champions want to empower the Federal Energy Regulatory Commission to have primacy over transmission line approvals, as it has over gas pipelines. This would at the very least contain any conflicts over transmission lines to one deciding body.
“It’s an old saw: Depending on the issue, I’ll tell you that I’m supportive of states’ rights,” Representative Sean Casten told me last December. “[I]t makes no sense that if you want to build a gas pipeline across multiple states in the U.S., you go to FERC and they are the sole permitting authority and they decide whether or not you get a permit. If you go to the same corridor and build an electric transmission [line] that has less to worry about because there’s no chance of leaks, you have a different permitting body every time you cross a state line.”
Another solution could come from the tech sector thinking fast on its feet. Google, for example, is investing in “advanced” transmission projects like reconductoring, which the company says will allow it to increase the capacity of existing power lines. Microsoft is also experimenting with smaller superconducting lines that it claims deliver the same amount of power as traditional wires.
But this space is evolving and in its infancy. “Getting into the business of transmission development is very complicated and takes a lot of time. That’s why we’ve seen data centers trying a lot of different tactics,” Reed said. “I think there’s a lot of interest, but turning that into specific projects and solutions is still to come. I think it’s also made harder by how highly local these decisions are.”
Plus more of the week’s biggest development fights.
1. Franklin County, Maine – The fate of the first statewide data center ban hinges on whether a governor running for a Democratic Senate nomination is willing to veto it over a single town’s project.
2. Jerome County, Idaho – The county home to the now-defunct Lava Ridge wind farm just restricted solar energy, too.
3. Shelby County, Tennessee – The NAACP has joined with environmentalists to sue one of Elon Musk’s data centers in Memphis, claiming it is illegally operating more than two dozen gas turbines.
4. Richland County, Ohio – This Ohio county is going to vote in a few weeks on a ballot initiative that would overturn its solar and wind ban. I am less optimistic about it than many other energy nerds I’ve seen chattering the past week.
5. Racine County, Wisconsin – I close this week’s Hotspots with a bonus request: Please listen to this data center noise.
A chat with Scott Blalock of Finnish energy company Wärtsilä.
This week’s conversation is with Scott Blalock of Finnish energy company Wärtsilä. I spoke with Blalock this week amid my reporting on transmission, after getting an email asking whether I understood that data centers don’t really know how much battery storage they need. Upon hearing this, I realized I didn’t even really understand how data centers – still a novel phenomenon to me – were incorporating large-scale battery storage at all. How does that work when AI power demand can be so dynamic?
Blalock helped me realize that in some ways, it’s more of the same, and in others, it’s a whole new ballgame.
The following chat was lightly edited for clarity.
So help me understand how the battery storage side of your business is changing due to the rise in data center development.
We’re really in the early stages for energy storage. The boom is really in generation – batteries aren’t generators. They store, they shift, they smooth power, but they don’t generate the power from fuel. In this boom right now, everyone is trying to find either grid connections or on-site power generation. Those are the longest lead time items – they take a while – so we’re still in the early stages of those types of projects coming back and saying, we need to start procuring batteries. We need to start looking at the controls and how everything’s going to work together. That’s still a little bit in the future.
Are you seeing people deploy batteries responsibly, in an integrated way, or is it people unsure what they need?
There’s definitely uncertainty as to what they need. The requirements are still hard to nail down. A lot of the requirements come from the load curve of the AI workloads they’re doing, and that’s still a bit of a moving target. It’s the importance of knowing the whole system and planning that out in the modeling space.
The biggest space of all this is the load profile. Without a load profile, there’s uncertainty about what you’re going to need –
When you say load profile, what do you mean?
The AI workload. The GPUs. The volatility. In a synchronized training load, all of the GPUs are generally doing the same thing at the same time. They all reach a pause state at the same time, and you’re close to full power on the data center, and then they say, okay now we go idle. It has a little bit of a wait and then starts back up again.
It’s that square wave – very sharp changes in power – that’s the new challenge of an AI data center. That’s one of the new uses of BESS being added compared to the traditional data center doing data storage, which draws less power and has a much more stable load.
The volatility is where some of the friction comes in, and that has to be handled by some technology.
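A toy sketch of that square-wave dynamic, with illustrative numbers that are not from the interview: a battery held at the average draw absorbs the swing, so the grid sees a flat load.

```python
# Illustrative only: a synchronized AI training job alternating between
# near-full power and an idle state, as Blalock describes. The battery
# discharges during training and charges during idle, so the grid-facing
# draw stays constant at the average. All numbers are made up.

full_mw, idle_mw, intervals = 1000, 400, 8

load = [full_mw if i % 2 == 0 else idle_mw for i in range(intervals)]
grid_draw = sum(load) / len(load)           # flat draw the grid sees
battery = [mw - grid_draw for mw in load]   # + = discharging, - = charging

print("facility load (MW):", load)
print("grid sees (MW):", grid_draw)
print("battery swing (MW):", battery)
```

In this sketch the grid sees a constant 700 megawatts while the battery swings 300 megawatts either way – the “shock absorber” role described later in the interview.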
So what you’re telling me is that data center developers do not know how much they need in terms of battery storage? Simply put, they don’t know how much power they need?
Traditionally, utility-scale batteries – the projects we’ve been doing – come from a PPA, an interconnect agreement. There’s something in place where they know exactly how many batteries they can install. They know how many megawatts they’re allowed to install. Then they come to us and they say, I need a 4-megawatt battery for two hours. Tell me how many batteries you’re going to give me.
In a data center, they don’t know that first number. They don’t know how many megawatts they need. So that’s the first question: well, how big of a battery do you need?
If you have a 1-gigawatt data center, that means the load change is 60% of that – 600 megawatts is the step up and down. The starting point is 600 megawatts for two hours; that’s what will cover that volatility. The duration is a part of it, too. From there you get into more detailed studies.
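Blalock’s starting-point arithmetic can be sketched in a few lines. The 60% load step and two-hour duration are his stated rules of thumb from the interview, not universal standards:

```python
def bess_starting_point(facility_mw, step_fraction=0.60, duration_h=2.0):
    """Back-of-envelope BESS sizing from the interview's rule of thumb.

    facility_mw: data center capacity in megawatts
    step_fraction: share of capacity that swings during synchronized AI
                   training (Blalock's ~60% figure)
    duration_h: hours the battery should cover (his two-hour starting point)
    Returns (power_mw, energy_mwh).
    """
    power_mw = facility_mw * step_fraction
    energy_mwh = power_mw * duration_h
    return power_mw, energy_mwh

# The 1-gigawatt example from the interview:
power, energy = bess_starting_point(1000)
print(f"{power:.0f} MW for {energy / power:.0f} hours ({energy:.0f} MWh)")
# -> 600 MW for 2 hours (1200 MWh)
```

As the interview notes, this is only a starting point; detailed studies of the actual load profile refine both numbers.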
When it comes to transmission, how much of a factor is it in how much storage a data center needs?
The first thing is whether it’s connected at all. The battery is a shock absorber for the whole system. If you are grid-connected, the BESS is still a stability asset – it’s still improving the power quality and stability at an interconnect. If you’re doing on-site generation, it becomes vital because you have only one system being controlled.
As far as when you talk about permitting and transmission, the details of that don’t really play that much into the BESS, but it’s tangentially related. The BESS is an important part of how you handle that situation. Whether you get to interconnect or not, it’s an extremely important asset in that mix.
With respect to the overall social license conversation, how does battery storage fit into the conversations around energy bills and strain on the grid?
Bias aside, I think it’s the most important piece.
If you look at the macro scale, it’s like transitioning to renewables where they’re intermittent; batteries turn intermittent generation from renewables into firm, dispatchable power. It’s still not going to be available all the time – you’re not going to turn a solar plant into a 24-hour baseload plant – but a battery allows you to shift the energy. It greatly alleviates the problem.
The other aspect is it’s a stability asset. The short version of that is you have big thermal plants – rotating metal masses that have momentum to them that stabilize everything on the grid. As you take those offline, the coal plants and the gas plants, the grid itself loses that inertia so it is more susceptible to spikes and failures because of small events. Batteries are able to synthesize that inertia.