Technology

What Does OpenAI’s New Breakthrough Mean for Energy Consumption?

Why the new “reasoning” models might gobble up more electricity — at least in the short term


What happens when artificial intelligence takes some time to think?

The newest models from OpenAI, o1-mini and o1-preview, exhibit more “reasoning” than existing large language models, which spit out answers to prompts almost instantaneously.

Instead, the new models will sometimes “think” for as long as a minute or two. “Through training, they learn to refine their thinking process, try different strategies, and recognize their mistakes,” OpenAI announced in a blog post last week. The company said these models perform better than its existing ones on some tasks, especially ones related to math and science. “This is a significant advancement and represents a new level of AI capability,” the company said.

But is it also a significant advancement in energy usage?

In the short run at least, almost certainly, as spending more time “thinking” and generating more text will require more computing power. As Erik Johannes Husom, a researcher at SINTEF Digital, a Norwegian research organization, told me, “It looks like we’re going to get another acceleration of generative AI’s carbon footprint.”

Discussion of energy use and large language models has been dominated by the gargantuan requirements for “training,” essentially running a massive set of equations through a corpus of text from the internet. This requires hardware on the scale of tens of thousands of graphics processing units and an estimated 50 gigawatt-hours of electricity.
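To get a feel for that scale, here is a back-of-the-envelope sketch in Python; the GPU count, per-GPU power draw, and training duration below are illustrative assumptions, not figures reported by OpenAI.

```python
# Back-of-the-envelope training-energy estimate.
# All three inputs are assumptions chosen for illustration.
GPU_COUNT = 25_000        # assumed size of the training cluster
POWER_PER_GPU_KW = 0.7    # assumed average draw per GPU, including overhead
TRAINING_DAYS = 90        # assumed length of the training run

hours = TRAINING_DAYS * 24
energy_gwh = GPU_COUNT * POWER_PER_GPU_KW * hours / 1_000_000  # kWh to GWh

print(f"Estimated training energy: {energy_gwh:.0f} GWh")
# 25,000 GPUs x 0.7 kW x 2,160 hours comes to roughly 38 GWh,
# the same order of magnitude as the 50 gigawatt-hour estimate above.
```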

Training GPT-4 cost “more than” $100 million, OpenAI chief executive Sam Altman has said; the next generation of models will likely cost around $1 billion, according to Anthropic chief executive Dario Amodei, a figure that could balloon to $100 billion for the generations after that, according to Oracle founder Larry Ellison.

While a huge portion of these costs is hardware, the energy consumption is considerable as well. (Meta reported that when training its Llama 3 models, power would sometimes fluctuate by “tens of megawatts,” enough to power thousands of homes.) It’s no wonder that Altman has put hundreds of millions of dollars into a fusion company.

But the models are not simply trained; they’re used out in the world, generating outputs (think of what ChatGPT spits back at you). This process, known as inference, is far less demanding per use: a single query consumes roughly as much energy as streaming a few minutes of video or running a lightbulb briefly. It can also be done with different hardware, and the work is more distributed and less energy intensive.

As large language models are being developed, most computational power — and therefore most electricity — is used on training, Charlie Snell, a PhD student at University of California at Berkeley who studies artificial intelligence, told me. “For a long time training was the dominant term in computing because people weren’t using models much.” But as these models become more popular, that balance could shift.

“There will be a tipping point depending on the user load, when the total energy consumed by the inference requests is larger than the training,” said Jovan Stojkovic, a graduate student at the University of Illinois who has written about optimizing inference in large language models.
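Stojkovic’s tipping point is simple arithmetic: inference overtakes training once the number of queries served, multiplied by the energy each one consumes, exceeds the one-time training cost. A minimal sketch, with the per-query figure assumed purely for illustration:

```python
# Break-even point where cumulative inference energy passes training energy.
# The per-query figure is an assumption for illustration.
TRAINING_ENERGY_KWH = 50_000_000    # the ~50 GWh training estimate, in kWh
ENERGY_PER_QUERY_KWH = 0.003        # assumed ~3 watt-hours per query

breakeven = TRAINING_ENERGY_KWH / ENERGY_PER_QUERY_KWH
print(f"Inference overtakes training after ~{breakeven:,.0f} queries")
# About 16.7 billion queries, a threshold a service with hundreds of
# millions of users could plausibly cross.
```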

And these new reasoning models could bring that tipping point forward because of how computationally intensive they are.

“The more output a model produces, the more computations it has performed. So, long chain-of-thoughts leads to more energy consumption,” Husom of SINTEF Digital told me.
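The relationship Husom describes is roughly linear: generating ten times as many tokens means roughly ten times the computation, and therefore roughly ten times the energy. A toy comparison, with the energy-per-token figure assumed for illustration:

```python
# Inference energy scales roughly linearly with tokens generated.
# The energy-per-token figure is an assumption for illustration.
ENERGY_PER_TOKEN_WH = 0.01      # assumed watt-hours per generated token

direct_tokens = 300             # a typical short chat reply
reasoning_tokens = 10_000       # the same reply preceded by a long
                                # hidden chain of thought

print(f"direct answer:    {direct_tokens * ENERGY_PER_TOKEN_WH:.1f} Wh")
print(f"reasoning answer: {reasoning_tokens * ENERGY_PER_TOKEN_WH:.1f} Wh")
# The "reasoning" reply costs about 33x the energy here, purely because
# it generates about 33x as many tokens.
```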

OpenAI staffers have been downright enthusiastic about the possibilities of giving models more time to think, seeing it as another breakthrough in artificial intelligence that could lead to subsequent breakthroughs on a range of scientific and mathematical problems. “o1 thinks for seconds, but we aim for future versions to think for hours, days, even weeks. Inference costs will be higher, but what cost would you pay for a new cancer drug? For breakthrough batteries? For a proof of the Riemann Hypothesis? AI can be more than chatbots,” OpenAI researcher Noam Brown tweeted.

But those “hours, days, even weeks” will mean more computation, and “there is no doubt that the increased performance requires a lot of computation,” Husom said, along with more carbon emissions.

But Snell told me that might not be the end of the story. It’s possible that over the long term, the overall computing demands for constructing and operating large language models will remain fixed or possibly even decline.

While “the default is that as capabilities increase, demand will increase and there will be more inference,” Snell told me, “maybe we can squeeze reasoning capability into a small model ... Maybe we spend more on inference but it’s a much smaller model.”

OpenAI hints at this possibility, describing o1-mini as “a smaller model optimized for STEM reasoning,” in contrast to other, larger models that “are pre-trained on vast datasets” and “have broad world knowledge,” which can make them “expensive and slow for real-world applications.” OpenAI is suggesting that a model can know less but think more and deliver results comparable to or better than larger models’, which might mean more efficient and less energy-hungry large language models.
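Snell’s scenario can be made concrete with a common rule of thumb: a transformer performs roughly two floating-point operations per parameter for each token it generates. Under that approximation, and with the model sizes and token counts below assumed purely for illustration, a small model can think at length and still do less total work than a large model answering tersely:

```python
# Rule of thumb: a transformer does ~2 * (parameter count) FLOPs per
# generated token. Model sizes and token counts are illustrative assumptions.
def inference_flops(params: float, tokens: int) -> float:
    """Approximate total compute to generate `tokens` tokens."""
    return 2 * params * tokens

large_terse = inference_flops(params=175e9, tokens=1_000)  # big model, short answer
small_long = inference_flops(params=8e9, tokens=20_000)    # small model, long "thinking"

print(f"large model, short answer:  {large_terse:.2e} FLOPs")  # 3.50e+14
print(f"small model, long thinking: {small_long:.2e} FLOPs")   # 3.20e+14
# The small model "thinks" 20x longer yet does slightly less total compute.
```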

In short, thinking might use less brain power than remembering, even if you think for a very long time.
