Why the new “reasoning” models might gobble up more electricity — at least in the short term

What happens when artificial intelligence takes some time to think?
OpenAI’s newest models, o1-mini and o1-preview, exhibit more “reasoning” than existing large language models and their associated interfaces, which spit out answers to prompts almost instantaneously.
Instead, the new model will sometimes “think” for as long as a minute or two. “Through training, they learn to refine their thinking process, try different strategies, and recognize their mistakes,” OpenAI announced in a blog post last week. The company said these models perform better than their existing ones on some tasks, especially related to math and science. “This is a significant advancement and represents a new level of AI capability,” the company said.
But is it also a significant advancement in energy usage?
In the short run at least, almost certainly, as spending more time “thinking” and generating more text will require more computing power. As Erik Johannes Husom, a researcher at SINTEF Digital, a Norwegian research organization, told me, “It looks like we’re going to get another acceleration of generative AI’s carbon footprint.”
Discussion of energy use and large language models has been dominated by the gargantuan requirements for “training,” essentially running a massive set of equations through a corpus of text from the internet. This requires hardware on the scale of tens of thousands of graphical processing units and an estimated 50 gigawatt-hours of electricity to run.
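For a rough sense of scale (my back-of-envelope math, not a figure from the article), 50 gigawatt-hours works out to roughly a year of electricity for several thousand typical U.S. homes, assuming average household use of about 10,500 kilowatt-hours per year:

```python
# Back-of-envelope context for the ~50 GWh training estimate cited above.
# The household figure is an approximate U.S. average (~10,500 kWh/year),
# not a number reported in this article -- treat the result as an order-of-magnitude estimate.
training_energy_kwh = 50_000_000        # 50 gigawatt-hours expressed in kilowatt-hours
avg_household_kwh_per_year = 10_500     # approximate annual U.S. household electricity use

household_years = training_energy_kwh / avg_household_kwh_per_year
print(f"~{household_years:,.0f} household-years of electricity")  # roughly 4,800
```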
Training GPT-4 cost “more than” $100 million, OpenAI chief executive Sam Altman has said; the next generation of models will likely cost around $1 billion, according to Anthropic chief executive Dario Amodei, a figure that might balloon to $100 billion for the generation after that, according to Oracle founder Larry Ellison.
While a huge portion of these costs is hardware, the energy consumption is considerable as well. (Meta reported that when it was training its Llama 3 models, power would sometimes fluctuate by “tens of megawatts,” enough to power thousands of homes.) It’s no wonder that Altman has put hundreds of millions of dollars into a fusion company.
But the models are not simply trained; they’re used out in the world, generating outputs (think of what ChatGPT spits back at you), a process known as inference. The energy consumed by a single query tends to be comparable to that of other everyday activities, like streaming Netflix or running a lightbulb. Inference can be done on different hardware, and the process is more distributed and less energy intensive than training.
As large language models are being developed, most computational power — and therefore most electricity — is used on training, Charlie Snell, a PhD student at the University of California, Berkeley, who studies artificial intelligence, told me. “For a long time training was the dominant term in computing because people weren’t using models much.” But as these models become more popular, that balance could shift.
“There will be a tipping point depending on the user load, when the total energy consumed by the inference requests is larger than the training,” said Jovan Stojkovic, a graduate student at the University of Illinois who has written about optimizing inference in large language models.
And these new reasoning models could bring that tipping point forward because of how computationally intensive they are.
“The more output a model produces, the more computations it has performed. So, long chain-of-thoughts leads to more energy consumption,” Husom of SINTEF Digital told me.
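To make the tipping-point idea concrete, here is a minimal, illustrative sketch. The per-token energy value is a hypothetical assumption chosen only for readability, not a measurement from OpenAI, SINTEF, or anyone quoted here; the point is simply that inference energy grows with both query volume and output length, so longer chains of thought pull the crossover point closer.

```python
# Illustrative sketch of the training-vs.-inference "tipping point."
# The per-token energy value below is a hypothetical assumption, not a reported measurement.

TRAINING_ENERGY_KWH = 50_000_000   # the ~50 GWh training estimate cited earlier
ENERGY_PER_TOKEN_KWH = 1e-6        # assumed energy per generated token (hypothetical)

def queries_until_tipping_point(avg_output_tokens: int) -> int:
    """Queries at which cumulative inference energy exceeds training energy,
    assuming per-query energy scales linearly with the tokens generated."""
    energy_per_query_kwh = avg_output_tokens * ENERGY_PER_TOKEN_KWH
    return int(TRAINING_ENERGY_KWH / energy_per_query_kwh)

# A short answer vs. a long "chain of thought":
print(queries_until_tipping_point(500))      # ~100 billion queries before the tipping point
print(queries_until_tipping_point(10_000))   # ~5 billion queries -- the crossover arrives much sooner
```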
OpenAI staffers have been downright enthusiastic about the possibilities of giving models more time to think, seeing it as another breakthrough in artificial intelligence that could lead to subsequent breakthroughs on a range of scientific and mathematical problems. “o1 thinks for seconds, but we aim for future versions to think for hours, days, even weeks. Inference costs will be higher, but what cost would you pay for a new cancer drug? For breakthrough batteries? For a proof of the Riemann Hypothesis? AI can be more than chatbots,” OpenAI researcher Noam Brown tweeted.
But those “hours, days, even weeks” will mean more computation, and with it more carbon emissions. “There is no doubt that the increased performance requires a lot of computation,” Husom said.
But Snell told me that might not be the end of the story. It’s possible that over the long term, the overall computing demands for constructing and operating large language models will remain fixed or possibly even decline.
While “the default is that as capabilities increase, demand will increase and there will be more inference,” Snell told me, “maybe we can squeeze reasoning capability into a small model ... Maybe we spend more on inference but it’s a much smaller model.”
OpenAI hints at this possibility, describing its o1-mini as “a smaller model optimized for STEM reasoning,” in contrast to other, larger models that “are pre-trained on vast datasets” and “have broad world knowledge,” which can make them “expensive and slow for real-world applications.” OpenAI is suggesting that a model can know less but think more and deliver results comparable to or better than those of larger models — which might mean more efficient and less energy-hungry large language models.
In short, thinking might use less brain power than remembering, even if you think for a very long time.
Utilities are bending over backward to convince even their own investors that ratepayers won’t be on the hook for the cost of AI.
Utilities want you to know how little data centers will cost anyone.
With electricity prices rising faster than inflation and public backlash against data centers brewing, developers and the utilities that serve them are trying to convince the public that increasing numbers of gargantuan new projects won’t lead to higher bills. Case in point is the latest project from OpenAI’s Stargate, a $7-plus-billion, more-than-1-gigawatt data center due to be built outside Detroit.
The project was announced Thursday by Michigan Governor Gretchen Whitmer, who focused heavily on its projected economic benefits while attempting to head off criticism that it would lead to higher costs. In the first sentence of her press release, she said that the project will “create more than 2,500 union construction jobs, more than 450 jobs on site and 1,500 more across the county.” Also, it “will be one of the most advanced AI infrastructure facilities in the U.S., especially when it comes to its efficient use of land, water, and power.” Oh, and it “will not require any additional power generation to operate.”
The utility set to power the project, DTE Energy, released its quarterly earnings Thursday, as well, which described a 1.4-gigawatt project it had already executed. In a presentation for analysts and investors, DTE said that the new data center would pay for “required storage through a 15-year energy storage contract,” and that it would “support affordability for existing customers as excess capacity is sold.”
On a call with analysts, DTE Energy chief executive Joi Harris further asserted that the project has “meaningful affordability benefits to our existing customers.” As the data center ramps up, she explained, it can use existing excess capacity on the grid. By the time it reaches full strength, it will enjoy the benefits of “nearly $2 billion of incremental energy storage investments and additional tolling agreements to support this data center load.”
Who will pay for energy storage and tolling agreements? A DTE spokesperson, Jill Wilmot, clarified in an email that “DTE will meet the 1.4 gigawatts of demand from the data center with existing capacity,” and that “new energy storage will be built — and paid for by the customer” — that is, Stargate — “to help augment times of peak demand, ensuring continued reliability for all customers.”
Data centers help spread out the fixed costs of the grid more widely, Wilmot went on. “Data center development in DTE’s electric service territory will not increase customer rates,” she said, adding that “DTE is ensuring the data center will absorb all new costs required to serve them — in this case, battery storage. Our customers will not pay.”
That said, Wilmot did not answer a question about whether there would be any network or transmission upgrades necessary. She told me that she expected DTE would make a filing for the project with Michigan regulators later Friday.
Consumer advocates were skeptical of the utility’s claims. “When you are talking about new demand as massive as what would be created by this data center, we can’t afford to just take DTE at its word that other customers won’t be affected,” Amy Bandyk, the executive director of the Citizens Utility Board of Michigan, told me in an email. She called for Michigan regulators “to require DTE and the data center customer to agree on a tariff specific to that customer that includes robust protections against cost-shifting and provisions that any incremental costs will be solely covered by this new customer.”
More utilities and data center developers are trying to explicitly head off claims that data centers are driving up electricity rates. In another recent data center announcement for a multi-billion-dollar project in West Memphis, Arkansas, Google and the Arkansas Economic Development Commission said that “Google will be covering the full energy costs for the West Memphis facility and will be ramping up new solar energy and battery storage resources for the facility.”
Drew Marsh, the chief executive of Entergy, the utility serving the project, confirmed on an earnings call earlier this week that Google “will protect energy affordability for existing customers by covering the full cost of powering the data center in West Memphis.” He also said that in Mississippi, where Amazon has announced a $16 billion project, “customer rates would be 16% lower than they otherwise would have been due to these large customers.”
So why are utilities — which, after all, get paid by ratepayers for the investments they make in their systems — telling their investors about all the money they’re not charging ratepayers?
In short, utilities and developers know they’re on political thin ice, and they don’t want to kill the golden goose of data center development by stoking a populist backlash to rising electricity prices that could result in either government-mandated slashing of their investment plans, caps on the rates they can charge, or both.
“Looking ahead, we anticipate the central issue will be how utilities protect residential customers from costs associated with large-load customers, or else face potential consequences from regulators,” Mizuho analyst Anthony Crowdell said in a note to clients earlier this week. “Data centers, and their associated load, have the potential” to “cause political push-back.”
This is already happening across the country. The frontrunner in the New Jersey gubernatorial race, Democrat Mikie Sherrill, for example, has promised to freeze electricity rates, which have seen a sharp runup in recent years. Indiana Governor Mike Braun, a Republican, said in a recent statement that “we can’t take it anymore,” in reference to rate hikes. Indiana has also seen a number of proposed data centers rejected, as I covered earlier this year.
This means that utilities will have to think carefully about how and to whom they allocate costs arising from data center development and operation.
“Allocation of cost will be pivotal as the current ‘pocketbook’ issues driving a lot of the U.S. political debate could create some challenging regulatory outcomes should data centers put pressure on customer bills,” Crowdell wrote.
But what’s said in an announcement to the media or to investors may not always reflect the reality of utility cost allocation, Harvard Law School professor Ari Peskoe told me.
“Don’t trust a utility press release or comment from a CEO of a monopoly that says, ‘Hey, these rates are good for you,’” he told me.
Peskoe told me to pay close attention to the regulatory fillings utilities make for their data center projects, not just what they tell the press or investors. “Are the utilities themselves actually making these claims as strongly as their CEOs are making them in investor calls? And then once we do have a regulatory process about it, are they being transparent in that regulatory process? Are they hiding a lot of details behind the confidentiality claims so that only the participants in that proceeding actually get to see the details?”
Peskoe also pointed to other costs that might be incurred in the course of data center development and get socialized across the rate base without being directly tied to any one project, like the transmission and network upgrades that have contributed to large price increases in the PJM Interconnection territory.
“What you’re looking for is a firm contract that ensures the data center is going to be paying for every penny that the utility is incurring to provide service, so that it’s paying for all the new infrastructure that’s serving it,” Peskoe said. Without that, all you have is a press release.
The state formerly led by Interior Secretary Doug Burgum does not have a history of rejecting wind farms – which makes some recent difficulties especially noteworthy.
A wind farm in North Dakota – the former home of Interior Secretary Doug Burgum – is becoming a bellwether for the future of the sector in one of the most popular states for wind development.
At issue is Allete’s Longspur project, which would see 45 turbines span hundreds of acres in Morton County, west of Bismarck, the rural state’s capital.
Sited amid two already operating wind farms, the project will feed power not only to North Dakotans but also to Minnesotans, who, in the view of Allete, lack the kind of open plains perfect for wind farms found in the Dakotas. Allete subsidiary Minnesota Power announced Longspur in August and is aiming to build and operate it by 2027, in time to qualify for clean electricity tax benefits under a hastened phase-out of the Inflation Reduction Act.
On paper, this sounds achievable. North Dakota is one of the nation’s largest producers of wind-generated power and, not uncoincidentally, boasts some of the cheapest electricity in the country at a time when energy prices have become a potent political issue. Wind project rejections have happened, but they’ve been rare.
Yet last week, zoning officials in Morton County bucked the state’s wind-friendly reputation and voted to reject Longspur after more than an hour of testimony from rural residents who said they’d had enough wind development – and that officials should finish the job Donald Trump and Doug Burgum started.
Across the board, the people who spoke were neighbors of existing wind projects – and, if it were built, would be neighbors of Longspur. It wasn’t that they didn’t want any wind turbines – or “windmills,” as they called them, echoing Trump’s nomenclature. But they didn’t want more of them. After hearing from the residents, zoning commission chair Jesse Kist came out against the project and suggested the county may have had enough wind development for now.
“I look at the area on this map and it is plumb full of wind turbines, at this point,” Kist said, referencing a map showing where the project would be situated. “And we have a room full of people and we heard only from landowners, homeowners in opposition. Nobody in favor.”
This was a first for the county, zoning staff said, as public comment periods weren’t previously even considered necessary for a wind project. Opposition had never shown up like this before. This wasn’t lost on Andy Zachmeier, a county commissioner who also sits on the zoning panel and confessed during the hearing that the county was approaching the point of overcrowding. “Sooner or later, when is too many enough?” he asked.
Zachmeier was ultimately one of the two officials on the commission to vote against rejecting Longspur. He told me he was looking to Burgum for a signal.
“The Green New Deal – I don’t have to like it but it’s there,” he said. “Governor Burgum is now our interior secretary. There’s been no press conferences by him telling the president to change the Green New Deal.” Zachmeier said it was not the county’s place to stop the project, but rather that it was up to the state government, a body Burgum once led. “That’s probably going to have to be a legislative question. There’s been nothing brought forward where the county can say, We’ve been inundated and we’ve had enough,” he told me.
The county commission oversees the zoning body, and on Wednesday, Zachmeier and his colleagues voted to reject the zoning board’s denial of Longspur, requesting that zoning officials reconsider whether the denial was a good idea, or even legally possible. Unlike at the hearing last week, landowners whose property includes the wind project area called for it to proceed, pointing to the monetary benefits its construction would provide them.
“We appreciate the strong support demonstrated by landowners at the recent Commission meeting,” Allete’s corporate communications director Amy Rutledge told me in an email. “This region of North Dakota combines exceptional wind resources, reliable electric transmission infrastructure, and a strong tradition of coexisting seamlessly with farming and ranching activities.”
I personally doubt that will be the end of Longspur’s problems before the zoning board, and I suspect this county will eventually restrict or even ban future wind projects. Morton County’s profile for renewables development is difficult, to say the least; Heatmap Pro’s modeling gives the county an opposition risk score of 92 because it’s a relatively affluent agricultural community with a proclivity for cultural conservatism – precisely the kind of bent that can be easily swayed by rhetoric from Trump and his appointees.
Morton County also has a history of targeting advanced tech-focused industrial development. Not only have county officials instituted a moratorium on direct air capture facilities, they’ve also banned future data center and cryptocurrency mining projects.
Neighboring counties have also restricted some forms of wind energy infrastructure. McLean County to the north, for example, has instituted a mandatory wind turbine setback from the Missouri River, and Stark County to the west has a 2,000-foot property setback from homes and public buildings.
In other words, as goes Burgum, so may go North Dakota? I suppose we’ll find out.
And more of the week’s top news about renewable energy conflicts.
1. Staten Island, New York – New York’s largest battery project, Swiftsure, is dead after fervent opposition from locals in what would’ve been its host community, Staten Island.
2. Barren County, Kentucky – Do you remember Wood Duck, the solar farm being fought by the National Park Service? Geenex, the solar developer, claims the Park Service has actually given it the all-clear.
3. Near Moss Landing, California – Two different communities near the now-infamous Moss Landing battery site are pressing for more restrictions on storage projects.
4. Navajo County, Arizona – If good news is what you’re seeking, this Arizona county just approved a large solar project, indicating this state still has sunny prospects for utility-scale development depending on where you go.
5. Gillespie County, Texas – Meanwhile out in Texas, this county is getting aggressive in its attempts to kill a battery storage project.
6. Clinton County, Iowa – This county just extended its moratorium on wind development until at least the end of the year as it drafts a restrictive ordinance.