A new Nature paper outlines the relationship between rising temperatures and the literal rotation of the Earth.

Thinking too hard about time is a little like thinking too hard about blinking; it seems natural and intuitive until suddenly you’re sweating and it makes no sense at all. At least, that’s how I felt when I came across an incredible new study published in Nature this afternoon by Duncan Agnew, a geophysicist at the Scripps Institution of Oceanography, suggesting that climate change might be affecting global timekeeping.
Our internationally agreed-upon clock, Coordinated Universal Time (UTC), reconciles two measures of time: the one you’re familiar with, defined by the Earth’s complete rotation around its axis, and the average of some 400 atomic clocks around the world. Since the 1970s, UTC has added 27 leap seconds at irregular intervals to keep atomic time in step with the Earth’s gradually slowing rotation. Then that rotation started to speed up in 2016; June 29, 2022, set a record for the planet’s shortest day, with the Earth completing a full rotation 1.59 milliseconds short of 24 hours. Timekeepers anticipated at that point that we’d need our first-ever negative leap second around 2026 to account for the acceleration.
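The arithmetic behind leap second timing is simple enough to sketch. As an illustration (my own back-of-envelope, not a figure from the paper), here is how long it would take for a full second of divergence to build up if every day ran short by that record 1.59 milliseconds:

```python
# Hypothetical back-of-envelope: how many days of a fixed daily shortfall
# would it take for Earth time (UT1) and atomic time to diverge by a full
# second? Real day lengths vary day to day, so this is illustrative only.
RECORD_SHORTFALL_MS = 1.59  # the record short day of June 29, 2022

def days_to_accumulate(seconds: float, daily_shortfall_ms: float) -> float:
    """Days needed for the daily shortfall to add up to `seconds` of drift."""
    return seconds * 1000.0 / daily_shortfall_ms

days = days_to_accumulate(1.0, RECORD_SHORTFALL_MS)
print(f"~{days:.0f} days, or about {days / 365.25:.1f} years")
```

At that (unrealistically steady) rate, a full second accumulates in well under two years, which gives a feel for why timekeepers were once eyeing 2026.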
But such a model doesn’t properly account for the transformative changes the planet is undergoing due to climate change — specifically, the billions of tons of ice melting from Greenland and Antarctica every year.
Using mathematical modeling, Agnew found that the melt-off, as measured by gravity-tracking satellites, has again decreased the Earth’s angular velocity to the extent that a negative leap second will actually be required three years later than previously estimated, in 2029.
While a second here or there might not seem like much on a cosmic scale, as Agnew explained to me, these kinds of discrepancies throw into question the entire idea of basing our time system on the physical position of the Earth. Even more mind-bogglingly, Agnew’s modeling makes the astonishing case that so long as we do, climate change will be “inextricably linked” to global timekeeping.
Confused? So was I, until Agnew talked me through his research. Our conversation has been edited and condensed for clarity.
How did you get involved in researching this? I’d never have expected there to be a relationship between climate change and timekeeping.
Pure accident. I’m a geophysicist and I have an avocational interest in timekeeping, so I know all about leap seconds and the history of atomic clocks. I thought about writing a paper figuring out statistically what the next century would bring in terms of leap seconds.
When I started working on the paper, I realized there was a signal that I needed to allow for, which was the change induced by melting ice — which has been studied, there are plenty of papers on this satellite gravity signal. But nobody has, as far as I can tell, related it to rotation. Mostly because, from a geophysical standpoint, that’s not very interesting.
Interesting. Or, well, I guess not interesting.
I mean, there is geophysical literature on this, but it’s largely, Okay, we see this signal, and gravity doesn’t mesh with what we think we know about ice melt. Does it mesh with what we think we know about sea level change? How does the geophysics all fit together? And the fact that it changes Earth’s rotation is kind of a side issue.
I did not know about this when I got started on this project; it appeared as I was working on it. I thought, “Wait, I need to allow for this.” And when I did, it produced the — I don’t want to use the words “more important” because of the climate change part, but it produced a secondary result, which was that this potential for a negative leap second became clear.
Walk me through how the ice melting at the poles changes the Earth’s rotation.
This is the part that’s easy to explain. Ice melts. A lot of water that used to be at the poles is now distributed all over the ocean. Some of it is close to the equator. The standard picture for what’s called change of angular velocity because of moment of inertia — ignore all the verbiage — but the standard picture is of an ice skater who is spinning. She has her arms over her head. When she puts her arms out, she will slow down — like the water going from the poles to the equator. And that’s it. This is the simple part of the problem.
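The skater analogy translates directly into conservation of angular momentum, L = Iω: if the moment of inertia I grows because mass moves toward the equator, the angular velocity ω must shrink to compensate. A minimal sketch, with made-up numbers rather than the Earth’s actual values:

```python
# Illustrative sketch of the skater analogy: angular momentum L = I * omega
# is conserved, so when mass moves outward (moment of inertia I grows),
# the spin rate omega must drop. Numbers here are hypothetical, not the
# Earth's real moments of inertia.

def new_angular_velocity(i_before: float, omega_before: float, i_after: float) -> float:
    """Conservation of angular momentum: I1 * w1 = I2 * w2, solved for w2."""
    return i_before * omega_before / i_after

# Even a tiny fractional increase in I slows the spin proportionally.
omega = new_angular_velocity(i_before=1.0, omega_before=1.0, i_after=1.0000001)
print(omega)  # slightly less than 1
```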
So what’s the hard part?
The hard part is explaining the part about the Earth’s core. If you have two things that are connected to each other and rotating and one of them slows down, the other one has to speed up. I have not been able to think of an ice skater-like metaphor to go with that, but the simple one is if you were to put a bowl of water on a lazy Susan and you spin the bowl, then the water will start to spin. It won’t spin initially, but then it will start.
If you started stirring the water in the other direction, that would slow the lazy Susan down. And that’s the interaction between the core and the solid part of the Earth.
And is that causing the negative leap second to move back three years?
That’s why the leap second might happen at all. On a very long timescale, what’s happening is that the tides are slowing the Earth down. The Earth being slower than an atomic clock means that you need a positive leap second every so often. That was the case in 1972, when they started using leap seconds. The assumption was that the Earth would just keep slowing down and so there would be more positive leap seconds over time.
Instead, the Earth has sped up, entirely because of the core, and that’s not something that people necessarily anticipated. When you take the effect of melting ice out, it becomes clear there’s this steady deceleration of the core; the core is rotating more and more slowly. If you extrapolate that — which is a somewhat risky thing to do, you can’t really predict what the core is going to do — then you discover that a negative leap second arrives in 2029. The ice melting is going in the other direction; if the ice melting hadn’t occurred, then the leap second would come even earlier. Is this all making sense?
I think I’m grasping it.
Just so you know, one of the two reviewers of this paper was someone in geophysics who said, “I know all this stuff. I wasn’t familiar with the rotation part. This paper has an awful lot of moving parts.”
So, it’s just a difference of a second. Why does this even matter?
We are all familiar with the problem of not being synchronized — we just went through it. If you forget that we did daylight saving time, then you’re an hour off from everybody else and it’s bewildering and a nuisance.
Same problem with leap seconds, except for us, a second is not a big deal. For a computer network, though, a second is a big deal. And why is that? Well, for example, in the United States, the rules for stock markets say that everything that is done has to be accurately timed to a 20th of a second. In Europe, it’s actually to the nearest 1,000th of a second. If we were all just farmers or something, it wouldn’t be a problem, but there’s this whole infrastructure that’s invisible to us that tells our phones what time it is, and allows GPS to work, and everything else.
The easiest thing to do would be to not have a negative leap second. Indeed, there are plans not to have leap seconds anymore because for computer networks, they’re an enormous problem. They arrive at irregular intervals; some human being has to put the information into the computer; the computer has to have a program that tells it when the leap seconds are; and most computer programs can’t tell whether it’s a plus or minus second because there’s never been a minus before. From the computer network standpoint, it would be simplest to just not do this.
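To make the software problem concrete: the timestamp types most programs use have no way even to represent an inserted leap second, let alone a skipped one. A quick illustration using Python’s standard datetime (my example, not from the interview):

```python
# The common timestamp types simply have no slot for 23:59:60, so a
# positive leap second cannot even be represented.
from datetime import datetime

try:
    datetime(2016, 12, 31, 23, 59, 60)  # the inserted leap second
except ValueError as e:
    print("positive leap second unrepresentable:", e)

# A negative leap second would be the reverse problem: 23:59:58 would be
# followed directly by 00:00:00, a jump no deployed system has ever had
# to handle.
```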
So, you ask, why are we doing this? In 1972, when leap seconds were instituted, there were two communities that cared about precise time. One was the people who cared about the frequency of your radio station and other kinds of telecommunications. They wanted to use atomic clocks with frequencies that didn’t change, but that didn’t mesh with what the Earth was doing.
Who cares about time telling you how the Earth is rotating? Well, the answer then was that there were people who used the stars for celestial navigation. Back then, celestial navigation was used not just for ships, but for airplanes — if you flew across the ocean, there was a guy in the cockpit, an actual navigator, who would use a periscope to look at the stars and locate the plane, if only as sort of a backup system. That community is now gone. Almost nobody uses celestial navigation as a primary, or even a secondary, way of finding out where they are anymore because of GPS.
My own personal view — and I can warn you, there’s a huge amount of dispute about this — is that we’d be fine if we just stopped having leap seconds at all.
Is there a … governing body of time? That forces us to do leap seconds?
There’s a giant tangle of international organizations that deal with this, but the rules were set by the people in charge of keeping radio stations aligned, because radio broadcasts were how time signals were distributed back in 1972. The body that actually makes the call is the International Earth Rotation and Reference Systems Service, which uses astronomy to monitor what the Earth is doing. They can predict a little bit in advance where things are going to be, and if within six months things are going to be more than half a second out, they will announce a leap second.
Back to climate change: It seems pretty amazing that something like melting ice can throw things off so much.
All the stuff about negative seconds is important, but it’s only important because of this infrastructure, because we have all these rules. Strip all of that away and the most important result becomes the fact that climate change has caused an amount of ice melt that is enough to change the rotation rate of the entire Earth in a way that’s visible.
How do you talk to people about the gigatons of ice that Greenland loses every year? Do you talk about “water that could cover the entire United States to the depth of X” to get it into people’s minds? Yes, these are small changes in the rotation rate, but just the fact that we can say, “Look, this is slowing down the entire Earth” seems like another way of saying that climate change is unprecedented and important.
How do we proceed, then, if climate change is messing with our system?
There was a lot of resistance to even introducing atomic time. Time was thought of as being about Earth’s rotation and the astronomers didn’t want to give it up. In fact, in the 19th century, observatories would make money by selling time signals to the rest of the community. Then, in the 1950s, the physicists showed up, ran atomic clocks, never looked at the stars, and said, “We can do time better.” The physicists were right. But it took the astronomical community a while to come around to accepting that was how time was going to be defined.
If we get rid of leap seconds then we’d really have cut the connection between the way in which human beings have always thought of time as being, say, from noon to noon, or from sunrise to sunset, and we’d be replacing it with some bunch of guys in a laboratory somewhere running a machine. For some people, it’s very troubling to think of severing the keeping of time from the Earth’s rotation.
You lose a bit of the romance, I think. But clearly, tying our way of describing the linear passage of sequential events to the Earth’s rotation is going to be messy.
Exactly right. There’s a quote from, of all people, St. Augustine, saying, “I know what time is, but if somebody asked me, I can’t tell them.”
After a disappointing referendum in Maine, campaigners in New York are taking their arguments straight to lawmakers.
As electricity affordability has become the issue on every politician’s lips, a coalition of New York state lawmakers and Hudson Valley organizations has proposed a solution: Buy the utility and operate it publicly.
Assemblymember Sarahana Shrestha, whose district covers the mid-Hudson Valley, introduced a bill early last year to buy out the Hudson Valley’s investor-owned utility, Central Hudson Gas and Electric, and run it as a state entity. That bill hung around for a while before Shrestha reintroduced it to committee in January. It now has more than a dozen co-sponsors, a sign that the idea is gaining traction in Albany.
With politicians across the country in a frenzy to quell voters’ growing anxieties over their power bills, public power advocates are seizing the moment to make a renewed case that investor-owned utilities are to blame for rising prices. A victory for public power in the Hudson Valley would be the movement’s biggest win in decades — and could serve as a blueprint for other locales.
Shrestha’s proposal, while ambitious, draws on a long history of public power campaigns in the United States, stretching from the late 1800s to the New Deal 1930s to the present. Most recently, a 2023 referendum in Maine would have seen the state take over its two largest utilities; organizers argued the move would improve service and lower rates. But as Emily Pontecorvo covered for Heatmap, Maine voters rejected the referendum by a nearly 40-point margin. Public power advocates chalked up the loss to Maine’s investor-owned utilities outspending the proposition’s supporters by more than 30 to 1.
The current Hudson Valley campaign has a lot in common with Maine’s. In both, utilities rolled out faulty billing systems that overcharged customers, fueling resentment. Both targeted utilities owned by foreign corporations (Central Hudson is owned by Fortis, a Canadian company; Central Maine Power is owned by a subsidiary of Iberdrola, a Spanish company, while Versant, another utility in the state, is a subsidiary of Enmax, a Canadian corporation). And both took place amid rate hikes.
Shrestha has spent the past year working her district, holding town halls to sell the bill to her constituents. At each one she presents the same spiel: “I gave people a little brief story of each of the different notable fights, from Long Island Power Authority to Massena to Maine to Rochester,” she told me, “because I also want people to understand that our fight is not happening in isolation.”
Public power advocates in the Hudson Valley are certainly applying lessons from the Maine defeat to their own campaign. For one, the venue is paramount. This time, public power campaigners are gearing up for a fight in the statehouse rather than the ballot box.
Unlike a ballot proposition, state legislation typically doesn’t attract millions of dollars in television and radio advertising from deep-pocketed utilities. Sandeep Vaheesan, a legal scholar and public power expert, told me that passing a law may be a more feasible route to victory for public power.
“Legislative fights are more winnable because referenda end up being messaging wars,” Vaheesan told Heatmap. “And more often than not, the side that has money can win that war.”
The message itself is also key. One lesson Maine organizers walked away with is that affordability is a winning strategy — an insight that has only gotten more robust over the past several months.
The Climate & Community Institute, a progressive climate think tank, released a report in November reflecting on the Maine referendum that put numbers to the campaigners’ intuition. “While climate change was an issue for many in our polling,” the report states, “it often took a backseat to problems Mainers continue to experience, like rising costs and power shutoff risks.” The group also pointed me to a survey it did in the fall of 2023 — years before data centers and energy demand became top-tier political issues — in which 69% of voters said they were worried about climate change, but 85% said they were worried about energy costs.
So how could public power lower costs for ratepayers?
“If you take shareholders out of the picture — if you replace private debt with cheaper public debt — you can lower rates pretty quickly and bring energy bills down,” Vaheesan argued.
The proposed Hudson Valley Power Authority wouldn’t have a profit motive; its return on equity, currently 9.5% for Central Hudson, would be reduced to zero. As a public entity, HVPA could also access capital at much lower interest rates than a private company and would be exempt from state and federal taxes.
Investor-owned utilities also inflate customers’ bills with unnecessary capital spending, Shrestha told me.
“The only way they can drive up their profits is by expanding their capital infrastructure, which is a very rare and unique characteristic of this industry,” she said, noting that a company like Walmart can’t make a profit by overspending. “So we’re stuck with a grid that is unnecessarily bloated and cumbersome and not at all efficient.”
A feasibility report commissioned by HVPA supporters and released in December estimates that ratepayers would see their bills go down by 2% in the first year after the public takeover, with bills 14% lower by 2055. A competing report, issued by opponents of the legislation, claimed the delivery portion of charges could increase by 36% under HVPA due to the cost of buying out Central Hudson, though advocates criticized the report for failing to publish any data.
Hudson Valley public power supporters can take another lesson from Maine to counter a combative utility. The two Maine utilities estimated the cost for the state to acquire them would be billions of dollars more than what public power advocates estimated — though in a televised debate, an anti-referendum representative refused to defend the stated numbers until the moderator instructed her to do so.
Lucy Hochschartner, the deputy campaign manager for Pine Tree Power (Maine’s proposed state-run utility), said she often assuaged voters’ concerns over the acquisition price by comparing it to buying a house.
“Right now we pay a really high rent to [Central Maine Power],” Hochschartner told us. “We pay them more than a billion dollars in revenue a year through our electric grid. And instead we could have moved to a low-cost mortgage.”
With a public acquisition, the cost of buying the electrical and gas systems would be funded through revenue bonds, paid off through customers’ bills over time. However, a spokesperson for Central Hudson, Joe Jenkins, said the company would launch a legal battle rather than agree to sell its assets to New York State.
“Fortis has made no indication that the company is for sale,” Jenkins told me. “So to take over a company by means of eminent domain, I believe that our parent company would want to see this through a court.”
While a legal battle could be costly, public power advocates say the cost of inaction is also high. Winston Yau, an energy and industrial policy manager at the Climate & Community Institute, told me that publicly run utilities are better equipped to lead the transition to carbon-free power and adapt to a warming and more turbulent climate.
“Climate disasters and extreme weather events and heat waves are a major and increasing cause of rising utility bills,” Yau said. “In the coming decades, a significant amount of new investment will be needed.”
It’s an idea with bipartisan appeal, but AOC’s former policy adviser argues that the scale of the data center problem is too big for that.
Last night, between the trumpeting of fossil fuels and the lengthy honors awarded to both veterans and hockey players, President Trump devoted a portion of his State of the Union address to announcing a “ratepayer protection pledge,” under which big tech companies pay for their own power plants for data centers — a show of how central energy prices are becoming to today’s affordability debate.
Electricity in the United States is rapidly becoming expensive and unreliable. Vast swaths of the country are at elevated risk of outages. January’s winter storms wiped out power for millions of Americans from Louisiana to Brooklyn. In 2025, utilities requested a record $31 billion in rate increases from captive customers. Gas and electricity prices are two of the biggest drivers of inflation.
The main driver of these new stressors on the grid: the expected $6.7 trillion to be deployed in data centers by 2030.
Policymakers at all levels of government are coalescing on a strategy for dealing with rising data center demand that mirrors Trump’s ratepayer protection pledge: “bring your own generation,” or BYOG. Bipartisan bills introduced in Washington by Senator Chris Van Hollen, by Senators Josh Hawley and Richard Blumenthal, and by Representatives Rob Menendez and Greg Casar, among others, would require hyperscalers like Meta, OpenAI, and Microsoft to pay for their own power plants and grid upgrades in order to plug in. Michigan, Oregon, Florida, Washington, Georgia, Illinois, and Delaware are all at various stages of enacting BYOG legislation for data centers.
BYOG would create something like a regulatory sandbox for data centers, insulating utilities and ratepayers from the risks of data center demand. But while efforts at consumer protection are important, these policies do not grapple with the scale of data center deployment.
A sandbox won’t withstand a tidal wave. Over the next five years, the equivalent of 17 to 32 New York Cities’ worth of electricity demand is expected to be added to the grid, more than half of which will come from data centers. This incredibly wide estimate means that generators risk overbuilding.
Amidst all this uncertainty, BYOG does not address who pays for new capacity in the event the AI bubble bursts and energy infrastructure is left stranded. Neither does BYOG address the drastically mismatched lifetimes of the chips powering AI (one to three years) and power plants (25 to 30 years). The Federal Energy Regulatory Commission expects 22 New York Cities’ worth of generation to be added to the grid by 2028. Who pays for all of this generation in a decade if even 5% of projected data center demand disappears?
AI is a promising technology, but that does not prevent it from being overvalued. Policymakers must consider the risks when data centers eventually disconnect from the grid, not just when they interconnect. This means ensuring that ratepayers and taxpayers are not left footing the bill for stranded energy infrastructure if data centers disconnect prematurely.
Rather than cordoning off data centers from the rest of the electricity market, policymakers should take a stronger hand in planning these deployments for social and economic benefit. Colocating data centers with energy-intensive industries and requiring long-term commitments from hyperscalers are more efficient solutions that would also make new data centers more politically palatable.
Public sentiment has turned overwhelmingly against data center development. These vast facilities create relatively few jobs beyond their construction, but colocated with the manufacture of energy-intensive products like aluminum, steel, or fertilizer, suddenly they’re supporting employment. Colocation will also help diversify economic growth. Data center investment was responsible for a whopping 92% of GDP growth in the first half of 2025, creating a potentially dangerous dependency on continued expansion.
There are also simple legal guardrails that can provide a first line of defense against stranded costs. One is requiring long-term power purchase agreements between hyperscalers and generators. A bipartisan group of 13 governors and the Trump administration recently urged the country’s largest grid operator, PJM Interconnection, to require 15-year generation contracts for hyperscalers. Notably, Van Hollen’s bill would only require states to “consider” the extension of “minimum utility contract lengths,” while the Hawley/Blumenthal and Menendez/Casar bills make no mention of contract length or stranded costs.
Hyperscalers can also curtail usage during peak demand, a policy that has seen bipartisan support in Texas. A now-famous study from Duke University last year found that if data centers were to curtail 1% of their usage during peak hours, they could avoid installing 126 gigawatts of new generation — that’s 21 New York Cities’ worth. Lawmakers have since taken to the idea. Several states are considering mandating so-called “demand response” programs, and Representatives Alexandria Ocasio-Cortez and Kathy Castor inserted a federal study on demand response into the appropriations bill Trump signed in January.
Regardless of how it’s done, ratepayers should not pay full freight for the tidal wave of infrastructure coming online, and most utility balance sheets should not be exposed to that risk. BYOG’s flaws have more to do with what it leaves out — namely that the planning of significant parts of our economy and electric system is left to tech companies, and little thought is given to the long-term ramifications of overbuilding. Rather than deal reactively with the nasty politics of a bailout, policymakers should make muscular interventions now to reduce risks for ratepayers and taxpayers.
Energy markets are not free markets. For the past century they have been heavily regulated at the state, regional, and federal level. Any discomfort with planning (or “statutory tools”) must be set aside if policymakers are going to efficiently manage the growth of data centers.
On Cybertruck deaths, Texas wind waste, and American aluminum
Current conditions: Yet more snow is dusting New York City, with at least an inch already fallen, though that’s set to turn into rain later in the morning • Authorities in Saudi Arabia issued a red alert over a major sandstorm blasting broad swaths of the desert nation • Heavy snow blanketed Romania, halting transportation and taking down power lines.

In his State of the Union address Tuesday night, President Donald Trump unveiled what he called the new “ratepayer protection pledge.” Under the effort, the White House will tell “major tech companies that they have the obligation to provide for their own power needs.” By mandating the bring-your-own-generation approach, the Trump administration is endorsing a push that’s been ongoing for months. The North American Electric Reliability Corporation, the U.S. grid watchdog, called for data centers to build their own generators. An industry-backed proposal in the nation’s largest power grid would do something similar. “This is a unique strategy,” Trump said. “We have an old grid that could never handle the [amount] of electricity that’s needed.” With tech companies constructing new power plants, Trump said, towns should welcome data center projects that could end up lowering electricity rates by inviting more power onto the local grid.
The political blowback to data centers is gaining strength. It is, as my colleague Jael Holzman wrote recently, “swallowing American politics.” On the right, Senator Josh Hawley, the populist Republican from Missouri, introduced legislation this month to restrict data center construction. On the left, Senator Bernie Sanders, the democratic socialist from Vermont, reiterated his proposal this week to halt all data center projects. In the center, Pennsylvania Governor Josh Shapiro, a Democrat with unusually strong support among his state’s GOP voters, recently outlined plans for a more “selective” approach to data centers, as I reported in this newsletter.
Trump isn’t the only Republican pushing back against the data center blowback. On Tuesday, Mississippi Governor Tate Reeves delivered an impassioned defense of his state’s data center buildout. “I understand individuals who would rather not have any industrial project in their backyard. We all choose where to live, whether it’s urban, suburban, agrarian, or industrial. I do not understand the impulse to prevent our country from advancing technologically — except as civilizational suicide,” Reeves wrote in a post on X. “I don’t want to go gently. I love this country, and want her to rise. That’s why Mississippi has become the home of the world’s most impressive supercomputers. We are committed to America and American power. We know that being the hub of the world’s most awesome technology will inevitably bring prosperity and authority to our state. There is nobody better than Mississippians to wield it.”
Replying to Sanders’ proposal, Reeves said he’s “tempted to sit back and let other states fritter away the generational chance to build. To laugh at their short-sightedness. But the best path for all of us would be to see America dominate.”
The subcompact Ford Pinto gained infamy in the 1970s for its tendency to explode when the gas tank ruptured in a crash. The Ford Motor Company sold just under 3.2 million Pintos. By the official death toll, 27 people died as a result of fires from the vehicles exploding. Tesla has sold more than 34,000 Cybertrucks; already, five people have died in Cybertruck fires.
That, according to a calculation by the automotive blog Fuel Arc, means the Tesla Cybertruck has 14.52 deaths per 100,000 units, compared to the Ford Pinto’s 0.85 deaths. “The Cybertruck is far more dangerous (by volume) than the historic poster child for corporate greed and grossly antagonistic design,” Fuel Arc’s Kay Leadfoot wrote. “I look forward to the Cybertruck being governmentally crash-tested by the NHTSA, which it has not been thus far. Until then, I can’t recommend sitting in one.” That is, however, based on the lower death toll figure for the Pinto. Back in 1977, Mother Jones published a blockbuster cover story under the headline “Pinto Madness” claiming that the number of deaths could be as high as 900.
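Fuel Arc’s comparison is straightforward deaths-per-unit arithmetic. Using the round figures quoted above (the exact unit counts behind Fuel Arc’s 0.85 and 14.52 aren’t given, so the results here differ slightly):

```python
# Rough reproduction of the deaths-per-100,000-units comparison, using the
# round sales and death figures quoted in the piece. Fuel Arc's exact
# inputs aren't published, so these outputs differ slightly from theirs.
def deaths_per_100k(deaths: int, units_sold: int) -> float:
    return deaths / units_sold * 100_000

pinto = deaths_per_100k(27, 3_200_000)   # official toll, ~3.2M Pintos sold
cybertruck = deaths_per_100k(5, 34_000)  # "more than 34,000" Cybertrucks
print(f"Pinto: {pinto:.2f}, Cybertruck: {cybertruck:.2f}")
```

With these round numbers the gap is roughly seventeenfold, in line with the figures Fuel Arc reports.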
Texas accused the recycling company Global Fiberglass Solutions of illegally dumping thousands of wind turbine blades near the central town of Sweetgrass. The company allegedly hired several subcontractors to break down, transport and recycle the blades, but failed to properly dispose of the waste and instead created what Windpower Monthly called a “stockpile” of more than 3,000 blades across two sites in the town. Attorney General Ken Paxton, a Republican candidate for U.S. Senate, seized on a Trumpian critique of the energy source, saying the dumps damage “beautiful Texas land and threaten surrounding communities.”
Off the Atlantic Coast, meanwhile, Orsted is at a transitional moment for two of its offshore wind projects. The Danish developer just brought the vessel Wind Scylla to port after completing the installation of turbines at its Revolution Wind project in New England. The boat is headed to New York next to start installing the first wind turbine at Sunrise Wind, according to OffshoreWIND.biz.
Last month, I told you that Century Aluminum inked a deal with Emirates Global Aluminum to build the first smelter in the U.S. in half a century in Oklahoma. On Tuesday, the U.S. Aluminum Company, a local firm in the state, joined the project, signing an agreement to “explore the development of an aluminum fabrication plant near the new smelter.” If completed, the project — already dubbed Oklahoma Primary Aluminum — would roughly double U.S. primary production of the metal.
The Biden administration had placed what Heatmap’s Matthew Zeitlin called “a big bet on aluminum” back in 2024. By spring of last year, our colleague Katie Brigham was chronicling the confusion over how Trump’s tariffs on aluminum would work. With the recent Supreme Court ruling upending Trump’s trade policies, that one may remain a headscratcher for a little while longer.
Another day, another landmark energy investment from Google. This time, the tech giant has made a deal with the long-duration energy storage startup Form Energy to deploy what Katie wrote “would be the largest battery in the world by energy capacity: an iron-air system capable of delivering 300 megawatts of power at once while storing 30 gigawatt-hours of energy, enabling continuous discharge for 100 hours straight.” The project will power a data center in Minnesota. “For all of 2025, I believe the installed capacity [added to the grid] in the entire U.S. was 57 gigawatt-hours. And in one project, we’re going to install 30 gigawatt-hours,” Form CEO Mateo Jaramillo told Katie. “What it highlights is, once you get to the 100-hour duration, you can really stop thinking about energy to some extent.”