
Twenty-five years ago, computers were on the verge of destroying America’s energy system.
Or, at least, that’s what lots of smart people seemed to think.
In a 1999 Forbes article, a pair of conservative lawyers, Peter Huber and Mark Mills, warned that personal computers and the internet were about to overwhelm the fragile U.S. grid.
Information technology already devoured 8% to 13% of total U.S. power demand, Huber and Mills claimed, and that share would only rise over time. “It’s now reasonable to project,” they wrote, “that half of the electric grid will be powering the digital-Internet economy within the next decade.” (Emphasis mine.)
Over the next 18 months, investment banks including JP Morgan and Credit Suisse repeated the Forbes estimate of internet-driven power demand, advising their customers to pile into utilities and other electricity-adjacent stocks. Although it was unrelated, California’s simultaneous blackout crisis deepened the sense of panic. For a moment, experts were convinced: Data centers and computers would drain the country’s energy resources.
They could not have been more wrong. In fact, Huber and Mills had drastically mismeasured the amount of electricity used by PCs and the internet. Computing ate up perhaps 3% of total U.S. electricity in 1999, not the roughly 10% they had claimed. And instead of staring down a period of explosive growth, the U.S. electric grid was in reality facing a long stagnation. Over the next two decades, America’s electricity demand did not grow rapidly — or even, really, at all. Instead, it flatlined for the first time since World War II. The 2000s and 2010s were the first decades without “load growth,” the utility industry’s jargon for rising power demand, since perhaps the discovery of electricity itself.
Now that lull is ending — and a new wave of tech-driven concerns has overtaken the electricity industry. According to its supporters and critics alike, generative artificial intelligence like ChatGPT is about to devour huge amounts of electricity, enough to threaten the grid itself. “We still don’t appreciate the energy needs of this technology,” Sam Altman, the CEO of OpenAI, has said, arguing that the world needs a clean energy breakthrough to meet AI’s voracious energy needs. (He is investing in nuclear fusion and fission companies to meet this demand.) The Washington Post captured the zeitgeist with a recent story: America, it said, “is running out of power.”
But … is it actually? There is no question that America’s electricity demand is rising once again and that load growth, long in abeyance, has finally returned to the grid: The boom in new factories and the ongoing adoption of electric vehicles will see to that. And you shouldn’t bet against the continued growth of data centers, which have increased in size and number since the 1990s. But there is surprisingly little evidence that AI, specifically, is driving surging electricity demand. And there are big risks — for utility customers and for the planet — in treating AI-driven electricity demand as an emergency.
There is, to be clear, no shortage of predictions that AI will cause electricity demand to rise. According to a recent Reuters report, nine of the country’s 10 largest utilities are now citing the “surge” in power demand from data centers when arguing to regulators that they should be allowed to build more power plants. Morgan Stanley projects that power use from data centers “is expected to triple globally this year,” according to the same report. The International Energy Agency more modestly — but still shockingly — suggests that electricity use from data centers, AI, and cryptocurrency could double by 2026.
These concerns have also come from environmentalists. A recent report from the Climate Action Against Disinformation Commission, a left-wing alliance of groups including Friends of the Earth and Greenpeace, warned that AI will require “massive amounts of energy and water” and called for aggressive regulation.
That report focused on the risks of an AI-addled social media public sphere, which progressives fear will be filled with climate-change-denying propaganda by AI-powered bots. But in an interview, Michael Khoo, an author of the report and a researcher at Friends of the Earth, told me that studying AI made him much more frightened about its energy use.
AI is such a power suck that it “is causing America to run out of energy,” Khoo said. “I think that’s going to be much more disruptive than the disinformation conversation in the mid-term.” He sketched a scenario where Altman and Mark Zuckerberg can outbid ordinary households for electrons as AI proliferates across the economy. “I can see people going without power,” he said, “and there being massive social unrest.”
These predictions aren’t happening in a vacuum. At the same time that investment bankers and environmentalists have fretted over a potential electricity shortage, utilities across the South have proposed a de facto solution: a massive buildout of new natural-gas power plants.
Citing the return of load growth, these companies are trying to go around normal regulatory channels and build a slew of new natural-gas-burning power plants. Across at least six states, utilities have already won — or are trying to win — permission from local governments to fast-track more than 10,000 megawatts of new gas-fired capacity so that they can meet the surge in demand.
These requests have popped up across the region, pushed by vertically integrated monopoly power companies. Georgia Power won a tentative agreement to build 1,400 new megawatts of gas capacity, Canary reported. In the Carolinas, Duke Energy has asked to build 9,000 megawatts of new gas capacity, triple what it previously requested. The Tennessee Valley Authority has plans to add 6,600 megawatts of new capacity to its grid.
This buildout is big enough to endanger the country’s climate targets. Although these utilities are also building new renewable and battery farms, and shutting down coal plants, the planned surge in carbon emissions from natural gas plants would erase the reductions from those changes, according to a Southern Environmental Law Center analysis. Duke Energy has already said that it will not meet its 2030 climate goal in order to conduct the gas expansion.
In the popular press, AI’s voracious energy demand is sometimes said to be a major driver of this planned gas boom. But evidence for that proposition is slim, and the utilities have said only that data center expansion is one of several reasons for the boom. The Southeast’s population is growing, and the region is experiencing a manufacturing renaissance, due in part to the new car, battery, and solar panel factories subsidized by Biden’s climate law. Utilities in the South also face a particular challenge coping with the coldest winter mornings because so many homes and offices use inefficient and power-hungry space heaters.
Indeed, it’s hard to talk about the drivers of load growth with any specificity — and it’s hard to know whether load growth will actually happen in all corners of the South.
Utilities compete against each other to secure big-name customers — much like local governments compete with sweetheart tax deals — so when a utility asks regulators to build more capacity, it doesn’t reveal where potential power demand is coming from. (In other words, it doesn’t reveal who it believes will eventually buy that power.) A company might float plans to build the same data center or factory in multiple states to shop around for the best rates, which means the same underlying gigawatts of demand may be appearing in several different utilities’ resource plans at the same time. The upshot: Utilities are unlikely to actually see all of the demand they’re now projecting.
Even if we did know exactly how many gigawatts of new demand each utility would see, it’s almost impossible to say how much of it is coming from AI. Utilities don’t say how much of their future projected power demand will come from planned factories versus data centers. Nor do they say what each data center does and whether it trains AI (or mines Bitcoin, which remains a far bigger energy suck).
The risk of focusing on AI, specifically, as a driver of load growth is that because it’s a hot new technology — one with national security implications, no less — it can rhetorically justify expensive emergency action that is actually not necessary at all. Utilities may very well need to build more power capacity in the years to come. But does that need constitute an emergency? Does it justify seeking special permission from their statehouses or regulators to build more gas, instead of going through the regular planning process? Is it worth accelerating approvals for new gas plants? Probably not. The real danger, in other words, is not that we’ll run out of power. It’s that we’ll build too much of the wrong kind.
At the same time, we might have been led astray by overly dire predictions of AI’s energy use. Jonathan Koomey, a researcher who studies how the internet and data centers use energy (and the namesake of Koomey’s Law), told me that many estimates of Nvidia’s most important AI chips assume that their energy use equals their advertised “rated” power. In reality, Nvidia chips probably use about half that amount, he said, because, as a safety margin, chipmakers engineer their chips to withstand more electricity than they typically draw.
And this is just the current generation of chips: Nvidia’s next generation of AI-training chips, called “Blackwell,” uses 25 times less energy to do the same amount of computation as the previous generation.
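To make the stakes of that assumption concrete, here is a back-of-the-envelope sketch. The 700-watt rating, the one-million-chip fleet, and the always-on duty cycle are all illustrative assumptions chosen for round numbers; only the halving follows Koomey’s rough rule of thumb above.

```python
# Back-of-the-envelope sketch of why the "rated power" assumption matters.
# Every number here is an illustrative assumption, not a measurement.

RATED_WATTS = 700.0        # advertised "rated" power per chip (illustrative)
REAL_WORLD_FACTOR = 0.5    # Koomey's rough estimate: about half of rated power
CHIPS = 1_000_000          # hypothetical fleet size
HOURS_PER_YEAR = 8_760     # assumes the chips run around the clock

def annual_twh(watts_per_chip: float) -> float:
    """Annual electricity use for the whole fleet, in terawatt-hours."""
    return watts_per_chip * CHIPS * HOURS_PER_YEAR / 1e12

naive = annual_twh(RATED_WATTS)
adjusted = annual_twh(RATED_WATTS * REAL_WORLD_FACTOR)
print(f"Assuming rated power:   {naive:.1f} TWh per year")
print(f"Assuming ~50% of rated: {adjusted:.1f} TWh per year")
```

Under these invented assumptions the fleet-level estimate simply halves, from about 6 terawatt-hours a year to about 3, which is why the choice between rated and real-world power swings headline projections so dramatically.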
Koomey helped defuse the last panic over energy use by showing that the estimates Huber and Mills relied on were wildly incorrect. Estimates now suggest that the internet used less than 1% of total U.S. electricity by the late 1990s, not the 13% they had claimed. Those percentages stayed roughly the same through 2008, he later found, even as data centers grew and computers proliferated across the economy. That’s the same year, remember, that Huber and Mills predicted the internet would consume half of America’s electricity.
These bad predictions were extremely convenient. Mills was a scientific advisor to the Greening Earth Society, a fossil-fuel-industry-funded group that alleged carbon dioxide pollution would actually improve the global environment. He aimed to show that climate and environmental policy would conflict with the continued growth of the internet.
“Many electricity policy proposals are on a collision course with demand forces,” Mills said in a Greening Earth press release at the time. “While many environmentalists want to substantially reduce coal use in making electricity, there is no chance of meeting future economically-driven and Internet-accelerated electric demand without retaining and expanding the coal component.” Hence the headline of the Forbes piece: “The PCs are coming — Dig more coal.”
What makes today’s AI-induced fear frenzy different from 1999 is that the alarmed projections are not just coming from businesses and banks like Morgan Stanley, but also from environmentalists like Friends of the Earth. Yet neither their estimates of near-term, AI-driven power shortages nor the Morgan Stanley analysis suggesting that U.S. data-center power use could triple within a year makes sense given what we know about data centers, Koomey said. It is not logistically possible to triple data centers’ electricity use in one year. “There just aren’t enough people to build data centers, and it takes longer than a year to build a new data center anyway,” he said. “There aren’t enough generators, there aren’t enough transformers — the backlog for some equipment is 24 months. It’s a supply chain constraint.”
Look around and you might notice that we have many more servers and computers today than we did in 1999 — not to mention smartphones and tablets, which didn’t even exist then — and yet computing doesn’t devour half of America’s electricity. It doesn’t even get close. Today, computers account for 1% to 4% of total U.S. power demand, depending on which estimate you trust — about the same share they held in the late 1990s and mid-2000s.
It may well be that AI devours more energy in years to come, but utilities probably do not need to deal with it by building more gas. They could install more batteries, build new power lines, or even pay some customers to reduce their electricity usage during certain peak events, such as cold winter storms.
There are some places where AI-driven energy demand could be a problem — Koomey cited Ireland and Loudoun County, Virginia, as two epicenters. But even there, building more natural gas is not the sole way to cope with load growth.
“The problem with this debate is everybody is kind of right,” Daniel Tait, who researches Southern utilities for the Energy and Policy Institute, a consumer watchdog, told me. “Yes, AI will increase load a little bit, but probably not as much as you think. Yes, load is growing, but maybe not as much as you say. Yes, we do need to build stuff, but maybe not the stuff that you want.”
There are real risks if AI’s energy demands get overstated and utilities go on a gas-driven bender. The first is for the planet: Utilities might overbuild gas plants now, run them even though they’re uneconomical, and blow through their climate goals.
“Utilities — especially the vertically integrated monopolies in the South — have every incentive to overstate load growth, and they have a pattern of having done that consistently,” Gudrun Thompson, a senior attorney at the Southern Environmental Law Center, told me. In 2017, the Rocky Mountain Institute, an energy think tank, found that utilities systematically overestimated their peak demand when compiling forecasts. This makes sense: Utilities would rather build too much capacity than wind up with too little, especially when they can pass along the associated costs to ratepayers.
But the second risk is that utilities could burn through the public’s willingness to pay for grid upgrades. Over the next few years, utilities should make dozens of updates to their systems. They have to build new renewables, new batteries, and new clean 24/7 power, such as nuclear or geothermal. They will have to link their grids to their neighbors’ by building new transmission lines. All of that will be expensive, and it could require the kind of investment that raises electricity rates. But the public and politicians can accept only so many rate hikes before they rebel, and there’s a risk that utilities spend through that fuzzy budget on unnecessary and wasteful projects now, not on the projects that they’ll need in the future.
There is no question that AI will use more electricity in the years to come. But so will EVs, new factories, and other sources of demand. America is on track to use more electricity. If that becomes a crisis, it will be one of our own making.
Microreactor maker Antares Nuclear just struck a deal with BWXT Technologies to produce TRISO.
Long before the infamous trio of accidents at Three Mile Island, Chernobyl, and Fukushima, nuclear scientists started working on a new type of fuel that would make a meltdown nearly impossible. The result was “tri-structural isotropic” fuel, better known as TRISO.
The fuel encased enriched uranium kernels in three layers of ceramic coating designed to contain the super-hot, highly radioactive byproducts that form during the atom-splitting process. In theory, these poppyseed-sized pellets could have negated the need for the giant concrete containment vessels that cordon off reactors from the outside world. But TRISO was expensive to produce, and by the 1960s, cheaper low-enriched uranium fuel had proved reliable enough to become the industry standard around the globe.
TRISO had another upside, however. The cladding protected the nuclear material from reaching temperatures high enough to risk a meltdown, which meant reactors using the fuel could safely operate at hotter temperatures. When the United States opened its first commercial high-temperature gas-cooled reactor in 1979, barely three months after Three Mile Island, the Fort St. Vrain Generating Station in Colorado ran on TRISO. It was a short-lived experiment. After a decade, the high cost of the fuel and the technical challenges of operating the lone commercial atomic station in the U.S. that didn’t use water as a coolant forced Fort St. Vrain to close. TRISO joined the long list of nuclear technologies that worked in practice but didn’t pencil out.
Now it’s poised for a comeback. X-energy, the nuclear startup backed by Amazon that plans to cool its 80-megawatt microreactors with helium, is building out a production line to produce its own TRISO fuel in hopes of generating both electricity for data centers and heat as hot as 1,400 degrees Fahrenheit for Dow Chemical’s petrochemical facilities. Kairos Power, the Google-backed rival with the country’s only deal to sell power from a fourth-generation nuclear technology — reactors designed to use coolants other than water — to a utility, is procuring TRISO for its molten fluoride salt-cooled microreactors, which are expected to generate 75 megawatts of electricity and reach temperatures above 1,200 degrees Fahrenheit.
Then there’s Antares Nuclear. The California-based startup is designing 1-megawatt reactors cooled through sodium pipes that conduct heat away from the atom-splitting core. On Thursday, the company is set to announce a deal with the U.S. government-backed nuclear fuel enricher BWXT Technologies to establish a new production line for TRISO to fuel Antares reactors, Heatmap has learned exclusively.
Unlike X-energy or Kairos, Antares isn’t looking to sell electricity to utilities and server farms. Instead, the customers the company has in mind are the types for whom the price of fuel is secondary to how well it functions under extraordinary conditions.
“We’re putting nuclear power in space,” Jordan Bramble, Antares’ chief executive, told me from his office outside Los Angeles.
Just last month, NASA and the Department of Energy announced plans to develop a nuclear power plant on the moon by the end of the decade. The U.S. military, meanwhile, is seeking microreactors that can free remote bases and outposts from the tricky, expensive task of maintaining fossil fuel supply chains. Antares wants to compete for contracts with both agencies.
“It’s a market where cost matters, but cost is not the north star,” Bramble said.
Unlike utilities, he said, “you’re not thinking of cost solely in terms of fuel cycle, but you’re thinking of cost holistically at the system level.” In other words, TRISO may never come as cheap as traditional fuel, but something that operates safely and reliably in extreme conditions ends up paying for itself over time with spacecraft and missile-defense systems that work as planned and don’t require replacement.
That’s a familiar market for BWXT. The company — spun out in 2015 from Babcock & Wilcox, the reactor developer that built more than half a dozen nuclear plants for the U.S. during the 20th century — already enriches the bulk of the fuel for the U.S. military’s fleet of nuclear submarines, granting BWXT the industry’s highest-possible security clearance to work on federal contracts.
But BWXT, already the country’s leading producer of TRISO, sees an even wider market for the fuel.
“The value is that it allows you to operate at really high temperatures where you get high efficiencies,” Joseph Miller, BWXT’s president of government operations, told me. “We already have a lot of customer intrigue from the mining industry. I can see the same thing for synthetic fuels and desalination.”
BWXT isn’t alone in producing TRISO. Last month, the startup Standard Nuclear raised $140 million in a Series A round to build out its supply chain for producing TRISO. X-energy is establishing its own production line through a subsidiary called TRISO-X. And that’s just in the U.S. Russia’s state-owned nuclear company, Rosatom, is ramping up production of TRISO. China, which operates the world’s only commercial high-temperature gas-cooled reactor at the moment, also generates its own TRISO fuel.
Beijing’s plans for a second reactor based on that fourth-generation design could indicate a problem for the U.S. market: TRISO may work better in larger reactors, and America is only going for micro-scale units.
The world-leading high-temperature gas reactor China debuted in December 2023 maxes out at 210 megawatts of electricity. But the second high-temperature gas reactor under development is more than three times as powerful, with a capacity of 660 megawatts. At that size, the ultra-high temperatures a gas reactor can reach mean it takes longer for the coolant — such as the helium used at Fort St. Vrain — to remove heat. As a result, “you need this robust fuel form that releases very little radioactivity during normal operation and in accident conditions,” Koroush Shirvan, a researcher who studies advanced nuclear technologies at the Massachusetts Institute of Technology, told me.
But microreactors cool down faster because there’s less fuel undergoing fission in the core. “Once you get below a certain power level,” Shirvan said, “why would you have [TRISO]?”
Given the military and space applications Antares is targeting, however, where the added safety and functionality of TRISO merit the higher cost of using it, the company has a better use case than some of its rivals, Shirvan added.
David Petti, a former federal researcher who is one of the leading U.S. experts on TRISO, told me that when the government was testing TRISO for demonstration reactors, the price was at least double that of traditional reactor fuel. “That’s probably the best you could do,” he said in reference to the cost differential.
There are different forms of uranium that can go inside TRISO pellets, some of which could prove more efficient. The Chinese, for example, use uranium dioxide, essentially just an encased version of traditional reactor fuel. The U.S., by contrast, uses uranium oxycarbide, which allows for increased temperatures and higher burnups of the enriched fuel. Another option, which Bramble said he envisions Antares using in the future, would be uranium nitride, which has a greater density of fuel and could therefore last longer in smaller reactors used in space.
“But it’s not as tested in a TRISO system,” Petti said, noting that the federal research program that bolstered the TRISO efforts going on now started in 2002. “Until I see a good test that it’s good, the time and effort it takes to qualify is complicated.”
Since the uranium in TRISO is typically enriched to higher levels than standard fuel, BWXT’s facilities are subject to stricter safety rules, which adds “significant overhead,” Petti said.
“When you make a lot of fuel per year in your fuel factory, you can spread that cost and you can get a number that may be economic,” he said. “When you have small microreactors, you’re not producing an awful lot. You have to take that cost and charge it to the customer.”
BWXT is bullish on the potential for its customer base to grow significantly in the coming years. The company is negotiating a deal with the government of Wyoming to open a new factory there entirely dedicated to TRISO production. While he wouldn’t give specifics just yet, Miller told me BWXT is developing new technologies that can make TRISO production cheaper. He compared the cost curve to that of microchips, an industry in which he previously worked.
“Semiconductors were super expensive to manufacture. They were almost cost prohibitive,” Miller said. “But the cost curve starts to drop rapidly when you fully understand the manufacturing process and you know how to integrate the understanding into operational improvements.”
He leaned back in his chair on our Zoom call, and cracked a smile. “Frankly,” he said, “I feel more confident every day that we’re going to get a really, really cost driven formula on how to manufacture TRISO.”
The startup — founded by the former head of Tesla Energy — is trying to solve a fundamental coordination problem on the grid.
The concept of virtual power plants has been kicking around for decades. Coordinating a network of distributed energy resources — think solar panels, batteries, and smart appliances — to operate like a single power plant upends our notion of what grid-scale electricity generation can look like, not to mention the role individual consumers can play. But the idea only began taking slow, stuttering steps from theory to practice once homeowners started pairing rooftop solar with home batteries in the past decade.
Now, enthusiasm is accelerating as extreme weather, electricity load growth, and increased renewables penetration are straining the grid and interconnection queue. And the money is starting to pour in. Today, home battery manufacturer and VPP software company Lunar Energy announced $232 million in new funding — a $102 million Series D round, plus a previously unannounced $130 million Series C — to help deploy its integrated hardware and software systems across the U.S.
The company’s CEO, Kunal Girotra, founded Lunar Energy in the summer of 2020 after leaving his job as head of Tesla Energy, which makes the Tesla Powerwall battery for homeowners and the Megapack for grid-scale storage. As he put it, back then, “everybody was focused on either building the next best electric car or solving problems for the grid at a centralized level.” But he was more interested in what was happening with households as home battery costs were declining. “The vision was, how can we get every home a battery system and with smart software, optimize that for dual benefit for the consumer as well as the grid?”
VPPs work by linking together lots of small energy resources. Most commonly, this includes solar, home batteries, and appliances that can be programmed to adjust their energy usage based on grid conditions. These disparate resources work in concert, conducted by software that coordinates when they should charge, discharge, or ramp down their electricity use based on grid needs and electricity prices. So if a network of home batteries all dispatched energy to the grid at once, that would have the same effect as firing up a fossil fuel power plant — just much cleaner.
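To make that coordination concrete, here is a minimal sketch of the kind of per-battery dispatch rule such software might apply. It is an illustration of the general idea under invented numbers, not Lunar’s Gridshare platform or any utility’s actual program; the reserve sizes, price threshold, and 5-kilowatt rate are all hypothetical.

```python
# Minimal sketch of virtual power plant coordination: a simple dispatch rule
# applied to each home battery, then summed across the fleet. Illustrative only.

from dataclasses import dataclass

@dataclass
class Battery:
    charge_kwh: float          # energy currently stored
    capacity_kwh: float        # total usable capacity
    backup_reserve_kwh: float  # energy held back for home outages

def dispatch_kw(b: Battery, price_per_kwh: float, grid_stressed: bool,
                max_rate_kw: float = 5.0) -> float:
    """Positive means discharge to the grid; negative means charge from it."""
    exportable = b.charge_kwh - b.backup_reserve_kwh
    if grid_stressed and exportable > 0:
        return min(max_rate_kw, exportable)  # help shave the peak
    if price_per_kwh < 0.10 and b.charge_kwh < b.capacity_kwh:
        return -min(max_rate_kw, b.capacity_kwh - b.charge_kwh)  # soak up cheap power
    return 0.0  # otherwise sit idle

# Summed over a large fleet, the individual decisions look like one power plant.
fleet = [Battery(charge_kwh=10.0, capacity_kwh=13.5, backup_reserve_kwh=4.0)
         for _ in range(100_000)]
total_mw = sum(dispatch_kw(b, price_per_kwh=0.45, grid_stressed=True)
               for b in fleet) / 1_000
print(f"Fleet output during a peak event: {total_mw:.0f} MW")
```

Under these invented numbers, 100,000 batteries discharging about 5 kilowatts apiece add up to roughly 500 megawatts, the same order of magnitude as the California test described below.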
Lunar’s artificial intelligence-enabled home energy system analyzes customers’ energy use patterns alongside grid and weather conditions. That allows Lunar’s battery to automatically charge and discharge at the most cost-effective times while retaining an adequate supply of backup power. The batteries, which started shipping in California last year, also come integrated with the company’s Gridshare software. Used by energy companies and utilities, Gridshare already manages all of Sunrun’s VPPs, including nearly 130,000 home batteries — most from non-Lunar manufacturers — that can dispatch energy when the grid needs it most.
This accords with Lunar’s broader philosophy, Girotra explained — that its batteries should be interoperable with all grid software, and its Gridshare platform interoperable with all batteries, whether they’re made by Lunar or not. “That’s another differentiator from Tesla or Enphase, who are creating these walled gardens,” he told me. “We believe an Android-like software strategy is necessary for the grid to really prosper.” That should make it easier for utilities to support VPPs in an environment where there are more and more differentiated home batteries and software systems out there.
And yet the real-world impact of VPPs remains limited today. That’s partially due to the main problem Lunar is trying to solve — the technical complexity of coordinating thousands of household-level systems. But there are also regulatory barriers and entrenched utility business models to contend with, since the grid simply wasn’t set up for households to be energy providers as well as consumers.
Girotra is well-versed in the difficulties of this space. When he first started at Tesla a decade ago, he helped kick off what’s widely considered to be the country’s first VPP with Green Mountain Power in Vermont. The forward-looking utility was keen to provide customers with utility-owned Tesla Powerwalls, networking them together to lower peak system demand. But larger VPPs that utilize customer-owned assets and seek to sell energy from residential batteries into wholesale electricity markets — as Lunar wants to do — are a different beast entirely.
Girotra thinks their time has come. “This year and the next five years are going to be big for VPPs,” he told me. The tide started to turn in California last summer, he said, after a successful test of the state’s VPP capacity had over 100,000 residential batteries dispatching more than 500 megawatts of power to the grid for two hours — enough to power about half of San Francisco. This led to a significant reduction in electricity demand during the state’s evening peak, with the VPP behaving just like a traditional power plant.
Armed with this demonstration of potential and its recent influx of cash, Lunar aims to scale its battery fleet, growing from about 2,000 deployed systems today to about 10,000 by year’s end, and “at least doubling” every year after that. Ultimately, the company aims to leverage the popularity of its Gridshare platform to become a market maker, helping to shape the structure of VPP programs — as it’s already doing with the Community Choice Aggregators that it’s partnered with so far in California.
In the meantime, Girotra said Lunar is also involved in lobbying efforts to push state governments and utilities to make it easier for VPPs to participate in the market. “VPPs were always like nuclear fusion, always for the future,” he told me. But especially after last year’s demonstration, he thinks the entire grid ecosystem, from system operators to regulators, is starting to realize that the technology is here today. “This is not small potatoes anymore.”
If all the snow and ice over the past week has you fed up, you might consider moving to San Francisco, Los Angeles, Phoenix, Austin, or Atlanta. These five cities receive little to no measurable snow in a given year; subtropical Atlanta technically gets the most — maybe a couple of inches per winter, though often none. Even this weekend’s bomb cyclone, which dumped 7 inches across parts of northeastern Georgia, left the Atlanta suburbs with too little accumulation even to make a snowman.
San Francisco and the aforementioned Sun Belt cities are also the five pilot locations of the all-electric autonomous-vehicle company Waymo. That’s no coincidence. “There is no commercial [automated driving] service operating in winter conditions or freezing rain,” Steven Waslander, a University of Toronto robotics professor who leads WinTOR, a research program aimed at extending the seasonality of self-driving cars, told me. “We don’t have it completely solved.”
Snow and freezing rain, in particular, are among the most hazardous driving conditions, and 70% of the U.S. population lives in areas that experience such conditions in winter. But for the same reasons snow and ice are difficult for human drivers — reduced visibility, poor traction, and a greater need to react quickly and instinctively in anticipation of something like black ice or a fishtailing vehicle in an adjacent lane — they’re difficult for machines to manage, too.
The technology that enables self-driving cars to “see” the road and anticipate hazards ahead comes in three varieties. Tesla Autopilot uses cameras, which Tesla CEO Elon Musk has lauded for operating naturally, like a human driver’s eye — but they have the same limitations as a human eye when conditions deteriorate, too.
Lidar, used by Waymo and, soon, Rivian, deploys pulses of light that bounce off objects and return to sensors to create 3D images of the surrounding environment. Lidar struggles in snowy conditions because those pulses also bounce off airborne particles, including moisture and snowflakes. (Not to mention, lidar is up to 32 times more expensive than Tesla’s comparatively simple, inexpensive cameras.) Radar, the third option, isn’t affected by darkness, snow, fog, or rain, using long radio wavelengths that essentially bend around water droplets in the air. But it also has the worst resolution of the bunch — it’s good at detecting cars, but not smaller objects, such as blown tire debris — and typically needs to be used alongside another sensor, like lidar, as it is on Waymo cars.
Driving in the snow is still “definitely out of the domain of the current robotaxis from Waymo or Baidu, and the long-haul trucks are not testing those conditions yet at all,” Waslander said. “But our research has shown that a lot of the winter conditions are reasonably manageable.”
Indeed, Waymo is now testing its vehicles in Tokyo and London, with Denver, Colorado, set to become the first true “winter city” for the company. Waymo also has ambitions to expand into New York City, which received nearly 12 inches of snow last week during Winter Storm Fern.
But while scientists are still divided on whether climate change is increasing instances of polar vortices — which push extremely cold Arctic air down into the warmer, moister air over the U.S., resulting in heavy snowfall — we do know that as the planet warms, places that used to freeze solid all winter will go through freeze-thaw-refreeze cycles that make driving more dangerous. Freezing rain, which requires both warm and cold air to form, could also increase in frequency. Variability also means that autonomous vehicles will need to navigate these conditions even in presumed-mild climates such as Georgia.
Snow and ice throw a couple of wrenches at autonomous vehicles. Cars need to be taught how to brake or slow down on slush, soft snow, packed snow, melting snow, ice — every variation of winter road condition. Other drivers and pedestrians also behave differently in snow than in clear weather, which machine learning models must incorporate. The car itself will also behave differently, with traction changing at critical moments, such as when approaching an intersection or crosswalk.
Expanding the datasets (or “experience”) of autonomous vehicles will help solve the problem on the technological side. But reduced sensor accuracy remains a big concern — because you can only react to hazards you can identify in the first place. A crust of ice over a camera or lidar sensor can prevent the equipment from working properly, which is a scary thought when no one’s in the driver’s seat.
As Waslander alluded to, there are a few obvious coping mechanisms for robotaxi and autonomous vehicle makers: You can defrost, thaw, wipe, or apply a coating to a sensor to keep it clear. Or you can choose something altogether different.
Recently, a fourth kind of sensor has entered the market. At CES in January, the company Teradar demonstrated its Summit sensor, which operates in the terahertz band of the electromagnetic spectrum, a “Goldilocks” zone between the visible light used by cameras and the human eye, and the radio waves used by radar. “We have all the advantages of radar combined with all the advantages of lidar or camera,” Gunnar Juergens, the SVP of product at Teradar, told me. “It means we get into very high resolution, and we have a very high robustness against any weather influence.”
The company, which raised $150 million in a Series B funding round last year, says it is in talks with top U.S. and European automakers, with the goal of making it onto a 2028 model vehicle; Juergens also told me the company imagines possible applications in the defense, agriculture, and health-care spaces. Waslander hadn’t heard of Teradar before I told him about it, but called the technology a “super neat idea” that could prove to be a “really useful sensor” if it is indeed able to capture the advantages of both radar and lidar. “You could imagine replacing both with one unit,” he said.
Still, radar and lidar are well-established technologies with decades of development behind them, and “there’s a reason” automakers rely on them, Waslander told me. Using the terahertz band, “there’s got to be some trade-offs,” he speculated, such as lower measurement accuracy or higher absorption rates. In other words, while Teradar boasts the upsides of both radar and lidar, it may come with some of their downsides, too.
Another point in Teradar’s favor is that it doesn’t use a lens at all — there’s nothing to fog, freeze, or salt over. The sensor could also help satisfy a fundamental requirement of autonomy — as Juergens put it, “if you transfer responsibility from the human to a machine, it must be better than a human.” There are “very good solutions on the road,” he went on. “The question is, can they handle every weather or every use case? And the answer is no, they cannot.” Until sensors can demonstrate that they match or exceed human performance in snowy conditions — whether through a combination of lidar, cameras, and radar, or through a new technology such as Teradar’s Summit sensor — that will remain true.
If driving in winter weather can eventually be automated at scale, it could theoretically save thousands of lives. Until then, you might still consider using that empty parking lot nearby to brush up on your brake pumping.
Otherwise, there’s always Phoenix; I’ve heard it’s pleasant this time of year.