Ice is melting — but what does that mean for climate science?
As is usually the case, one of the most basic questions in climate science has also been one of the most difficult to answer: How much energy is the Earth sending out into space? The pair of shoebox-sized satellites that comprise PREFIRE — Polar Radiant Energy in the Far-InfraRed Experiment — could very well provide the answer.
Principal investigator Tristan L’Ecuyer, a professor in the Department of Atmospheric and Oceanic Sciences at the University of Wisconsin-Madison and the director of the Cooperative Institute for Meteorological Satellite Studies, spoke with Heatmap about PREFIRE. Tentatively scheduled to launch in May, the project stands not only to make future climate models more accurate but also to help shape a new generation of atmospheric exploration.
The interview has been edited for length and clarity.
Could you tell me a little bit about your research and the work that you do?
A lot of our climate information comes from models — where I come in is trying to make sure that those predictions are rooted in actual observations of our planet. But it’s impossible to cover the whole globe with a temperature sensor or water vapor [sensor] or those sorts of things, so I’ve always focused on using satellite observations, and in particular I’ve been focusing on the exchange of energy.
Basically, what drives the climate is the incoming energy from the sun and how that’s balanced by the thermal energy that the Earth emits. One of the big influencers of that balance is clouds — they reflect the sunlight, but they also have a greenhouse effect of their own; they trap the thermal energy emitted. So I’ve spent most of my career trying to understand the effects of clouds on the climate and how that might change if the climate warms.
And what’s the goal of this particular mission?
The polar regions are among the fastest-changing regions on Earth right now — I think a lot of people are aware of that. Normally, the polar regions are very cold — they reflect a lot of sunlight just because of the ice surface. But as the ice surface melts, it exposes ocean, which is a lot darker than ice, and so [the poles] can actually absorb more of the solar radiation that’s coming in.
A lot of people say, “Well, okay, but that’s the Arctic. I don’t live there.” But the way the climate works, the system has to create an equilibrium between these really, really cold polar caps and the really, really warm tropics. It’s just like heating the end of a rod — the rod is going to transfer some of the heat from the hot end to the cold end to establish an equilibrium between them. The Earth does the same thing, but the way it does that is through our weather systems. So basically, how cold the polar region is versus the equator is what’s going to govern how severe our weather is in the mid-latitudes.
What we’re trying to do is make measurements of, basically, how that thermal energy is distributed. We just have a lack of understanding right now — or it’s more that the understanding comes from isolated, individual field projects, and what we really want to do is map out the whole Arctic and understand all of the different regions and how it’s changing.
How do you expect your findings to influence our climate models? Or how significantly do you expect them to affect the climate models?
This is quite unusual for a satellite project: we actually have climate modelers as part of our team. There are people who take, for example, the Greenland ice sheet and model things like the melting of the ice, how heat transports into the ice sheet, and how the water, once it melts, percolates through the ice and then runs off at the bottom of the glacier, or even on top of the glacier. And then I have a general climate modeling group that basically uses climate models to project future climate.
There are two ways that’s going to happen. The first is we’ve developed a tool that allows us to simulate what our satellite would see if it were flying in a climate model as opposed to around the real Earth — we can simulate exactly what the climate model is suggesting the satellite should see. And then of course, we’re making the real observations with the satellite. We can compare the two and evaluate, in today’s climate, how well is that climate model reproducing what the satellites see?
The other way is we’re going to generate models of how much heat comes off of various surfaces — ice surfaces, water surfaces, snow surfaces — and that information can be used to create a new module that goes right into the climate model and improves the way it represents the surface.
So what do these satellites look like and how do they work?
Our satellite is called a CubeSat. It’s not very big at all, maybe a foot wide, a foot-and-a-half or so long. There’s a little aperture, a little hole on the end of the satellite that lets the thermal energy from the Earth go in, and then the rest of the satellite is basically just this big box that has a radio and a transmitter. In total, I think the whole thing weighs about 15 kilograms.
Because it’s relatively small and relatively inexpensive, we’re actually able to have two of those instead of just having one, and what that lets us do is put them into different orbits. At some point those orbits will cross and see the same spot on the ground — let’s say somewhere in the center of Greenland — but up to eight or nine hours apart. If the surface melts in between, we’ll be able to understand how that melting process affected the heat that was emitted from the surface into the atmosphere.
How big of a deal do you think this is? Or how big of a deal do you think it could be?
There’s more than a couple of aspects to this. To really segue from the last question to this one: the reason [the satellites are] inexpensive is not that they’re low-quality. It’s actually that they’re very uniform in size and shape. You can mass-produce them. And it’s that fact, coupled with the fact that we can now do real science on this small platform — we’ve been able to miniaturize the technology. If we can keep demonstrating that these missions are viable and producing realistic science data, this could be the future of the field.
Coming back to the polar climate, we absolutely know that the poles are warming at a very alarming rate. We know that the ice sheets are melting. We know that this has implications for the weather in the lower latitudes where we live, and for sea level. But when you try to predict that 100 years from now, there’s quite a range of different answers, from very catastrophic to still pretty bad. Depending on which of those answers is correct, it really dictates what we need to do today. How quickly do we need to adapt to a rising sea level, or to stronger storms or more frequent storms? After this mission, we will be able to improve the climate models in such a way that we’ll have a narrower range of possibilities.
The other thing that’s exciting is also just the unknown. There are always new things that you learn by measuring something for the first time. We might learn something about the tropics, we might learn something about the upper atmosphere. There are some people in mountainous areas who are quite interested in the measurements — at the top of mountains, the climate is actually quite similar to the Arctic. So I’m also really excited about what happens when the science community in general explores that data for the first time.
We’re powering data centers every which way these days.
The energy giant ExxonMobil is planning a huge investment in natural gas-fired power plants that will power data centers directly, a.k.a. behind the meter, meaning they won’t have to connect to the electric grid. That will allow the company to avoid making the expensive transmission upgrades that tend to slow down the buildout of new electricity generation. And it’ll add carbon capture to boot.
The company said in a corporate update that it plans to build facilities that “would use natural gas to generate a significant amount of high-reliability electricity for a data center,” then use carbon capture to “remove more than 90% of the associated CO2 emissions, then transport the captured CO2 to safe, permanent storage deep underground.” Going behind the meter means that this generation “can be installed at a pace that other alternatives, including U.S. nuclear power, cannot match,” the company said.
The move represents a first for Exxon, which is famous for its far-flung operations to extract and process oil and natural gas but has not historically been in the business of supplying electricity to customers. The company is looking to generate 1.5 gigawatts of power, about 50% more than a large nuclear reactor, The New York Times reported.
Exxon’s announcement comes as the power industry has reached an inflection point thanks to new demand from data centers to power artificial intelligence, electrification of transportation and heating, and new manufacturing investment. The demand for new power is immense, yet the industry’s ability to provide it quickly is limited both by the intermittent nature of cheap renewable power like solar and storage — plus the transmission capacity it requires — and by the regulatory barriers and market uncertainty around building new natural gas and nuclear power. While technology companies are starting to invest in bringing more nuclear power onto the grid, those projects won’t begin to bear fruit until the 2030s at the earliest.
Exxon is also not the only energy giant looking at behind-the-meter gas.
“This country is blessed with an abundance of natural gas,” Chevron chief executive Mike Wirth said at a recent event hosted by the Atlantic Council. “I think what we’re likely to see is that gas turbine generation is going to be a big part of the solution set, and a lot of it may be what’s called behind the meter … to support data centers.”
At the same time, the so-called hyperscalers are still making massive investments in renewables. Google, the investment firm TPG, and the energy developer Intersect announced a $20 billion investment “to synchronize new clean power generation with data center growth in a novel way,” Google’s President and Chief Investment Officer Ruth Porat wrote in a company blog post on Tuesday.
While Google was a pioneer in developing new renewable power to offset emissions from its operations and recently formed a partnership with Microsoft and the steel company Nucor to foster energy technology that can deliver clean power 24/7, this new project will be focused on “co-locating grid-connected carbon-free energy and data center investments into closely-linked infrastructure projects.”
These projects — the data centers and the clean power generation — would be sited close to each other; however, they would not be behind the meter, a Google executive told Canary Media. Instead, Intersect will build “new clean energy assets in regions and projects of interest,” according to the blog post, with Google then acting as an offtaker for the power “as an anchor tenant in the co-located industrial park that would support data center development.” The Google data center “would come online alongside its own clean power, bringing new generation capacity to the grid to meet our load, reduce time to operation and improve grid reliability.”
“This partnership is an evolution of the way hyperscalers and power providers have previously worked together,” Sheldon Kimber, Intersect chief executive, said in a press release. “We can and are developing innovative solutions to rapidly expand clean power capacity at scale while reducing the strain on the grid.”
But ... how?
President-elect Donald Trump rocked the energy world on Tuesday when he promised “fully expedited approvals and permits, including, but in no way limited to, all Environmental approvals” for “Any person or company investing ONE BILLION DOLLARS, OR MORE, in the United States of America,” in a post on Truth Social.
“GET READY TO ROCK!!!” he added.
Trump has frequently derided regulatory barriers to development, including in his announcements of various economic and policy roles in his upcoming administration. His designee for Secretary of the Interior, Doug Burgum, for instance, will also head a National Energy Council that will “oversee the path to U.S. ENERGY DOMINANCE by cutting red tape … by focusing on INNOVATION over longstanding, but totally unnecessary, regulation.”
When Trump announced his nomination of Lee Zeldin to head the Environmental Protection Agency, he said Zeldin would “ensure fair and swift deregulatory decisions that will be enacted in a way to unleash the power of American business.”
Current interpretations of existing laws dictate that any project constituting a major federal action (e.g. one that uses public lands) must be reviewed under the National Environmental Policy Act, the country’s signature permitting law. Federal courts are often asked in litigation to sign off on whether that review process — although not the outcome — was sufficient.
Regardless of any changes Trump may make to the federal regulatory system as president, that infrastructure is already in flux. The D.C. Circuit Court of Appeals recently issued a ruling that throws into doubt decades of NEPA enforcement. Also on Tuesday, the Supreme Court heard a separate case on the limits of NEPA as it relates to a proposed rail line expansion to transport oil from Utah’s Uinta Basin to refineries on the Gulf of Mexico. Although the court is unlikely to issue a decision until next year, its current membership has shown itself plenty willing to scrap longstanding precedent in the name of cutting the regulatory state down to size.
Trump did not support his announcement with any additional materials laying out the legal authorities he plans to exercise to exempt these projects from regulation, nor with any proposed legislation, but it already attracted criticism from environmentalists, with the Sierra Club describing it as a “plan to sell out communities and environment to the highest bidder.” It’s also unclear whether Trump was referring to foreign direct investment in the United States, of which there was $177 billion in 2022, according to the Department of Commerce.
Trump’s appointed co-deregulator-in-chief, for one, approved of his message today. “This is awesome 🚀🇺🇸,” Elon Musk wrote on X in response.
Companies are racing to finish the paperwork on their Department of Energy loans.
Of the over $13 billion in loans and loan guarantees that the Energy Department’s Loan Programs Office has made under Biden, nearly a third has been doled out in the month since the presidential election. And of the $41 billion in conditional commitments — agreements to provide a loan once the borrower satisfies certain preconditions — that proportion rises to nearly half. That includes some of the largest funding announcements in the office’s history: more than $7.5 billion to StarPlus Energy for battery manufacturing, $4.9 billion to Grain Belt Express for a transmission project, and nearly $6.6 billion to the electric vehicle company Rivian to support its new manufacturing facility in Georgia.
The acceleration represents a clear push by the outgoing Biden administration to get money out the door before President-elect Donald Trump, who has threatened to hollow out much of the Department of Energy, takes office. Still, there’s a good chance these recent conditional commitments won’t become final before the new administration takes office, as that process involves checking a series of nontrivial boxes that include performing due diligence, addressing or mitigating various project risks, and negotiating financing terms. And if the deals aren’t finalized before Trump takes office, they’re at risk of being paused or cancelled altogether, something the DOE considers unwise, to put it lightly.
“It would be irresponsible for any government to turn its back on private sector partners, states, and communities that are benefiting from lower energy costs and new economic opportunities spurred by LPO’s investments,” a spokesperson wrote to me in an email.
The once nearly dormant LPO has had a renaissance under the Biden administration and the office’s current director, Jigar Shah. The Inflation Reduction Act supercharged its lending authority to $400 billion, from just $40 billion when Biden took office. Then a week after the election, the office announced that it had recalibrated its risk estimates for the loan guarantees that it makes under the Energy Infrastructure Reinvestment program, which works to modernize and repurpose existing energy infrastructure to make it cleaner and more energy efficient. As the office explained, these projects “may reflect a relatively moderate risk profile in comparison to typical projects LPO finances with higher project risk.” When there’s less risk involved, LPO doesn’t have to set aside as much money to cover a possible default, which in this case has allowed the office to more than quadruple its funding for qualifying projects.
It’s not just that LPO staffers are working fast, though that’s part of it — it’s also that loan beneficiaries have picked up their pace in responding to the LPO. As Shah emphasized today at the LPO’s second annual Demonstrate Deploy Decarbonize conference, finalizing conditional commitments largely depends on companies getting their ducks in a row as quickly as possible. “I do think that right now borrowers are sufficiently motivated to move more quickly than they have probably a year ago,” Shah said. “It’s up to the borrowers. Our process hasn’t changed. Their ability to move through it faster is in their control.”
Shah noted that though timelines may be accelerating, the office’s due diligence procedures have remained the same. Thus far, the project that has moved the fastest from a conditional commitment to a finalized loan was for a clean hydrogen and energy storage facility in Utah. That took 43 days, and there are 46 left in Biden’s presidency. Let’s see what the LPO can do.