New rules governing how companies report their scope 2 emissions have pitted tech giant against tech giant and scholars against each other.

All summer, as the repeal of wind and solar tax credits and the surging power demands of data centers captured the spotlight, a more obscure but equally significant clean energy fight was unfolding in the background. Sustainability executives, academics, and carbon accounting experts have been sparring for months over how businesses should measure their electricity emissions.
The outcome could be just as consequential for shaping renewable energy markets and cleaning up the power grid as the aforementioned subsidies — perhaps even more so because those subsidies are going away. It will influence where and how — and potentially even whether — companies continue to voluntarily invest in clean energy. It has pitted tech heavyweights like Google and Microsoft against peers Meta and Amazon, all of which are racing each other to power their artificial intelligence operations without abandoning their sustainability commitments. And it could affect the pace of emissions reductions for decades to come.
In essence, the fight is over how to appraise the climate benefits of companies’ clean power purchases. The arena is the Greenhouse Gas Protocol, a nonprofit that creates voluntary emissions reporting standards. Companies use these standards to calculate emissions from their direct operations, from the electricity and gas that powers and heats their buildings, and from their supply chains. If you’ve ever seen a brand claim it “runs on 100% renewable energy,” that statement is likely backed by a Greenhouse Gas Protocol-sanctioned methodology.
For years, however, critics have poked holes in the group’s accounting rules and assumptions, charging it with enabling greenwashing. In response, the organization has decided to overhaul its standards, including for how companies should measure their electricity footprint, known as “scope 2” emissions.
The Greenhouse Gas Protocol first convened a technical working group to revise its Scope 2 Standard last September. By late June, the group had finalized a draft proposal with more rigorous criteria for clean energy claims, despite intense pushback on the underlying direction from companies and clean energy groups.
A flurry of op-eds, essays, and LinkedIn posts accused the working group of being on the “wrong track” and called the proposal a “disaster” with “unintended consequences.” The Clean Energy Buyers Association, a trade group, penned a letter saying the proposal was “inefficient and infeasible for most buyers and may curtail ambitious global climate action.” Similarly, the American Council on Renewable Energy warned that the plan “could unintentionally chill investment and growth in the clean energy sector.”
Next, the draft will face a 60-day public consultation period that begins in early October. “There’ll be pushback from every direction,” Matthew Brander, a professor of carbon accounting at the University of Edinburgh and a member of the Scope 2 Working Group, told me. Ultimately, it will be up to the working group, the Protocol’s Independent Standards Board, and its Steering Committee to decide whether the proposal will be adopted or significantly revised.
The challenge of creating a defensible standard begins with the fundamental physics of electricity. On the power grid, electrons from coal- and natural gas-fired power plants intermingle with those from wind and solar farms. There’s no way for companies hooking up to the grid to choose which electrons get delivered to their doors or opt out of certain resources. So if they want to reduce their carbon footprints, they can either decrease their energy consumption — by making their operations more efficient, say, or installing on-site solar panels — or they can turn to financial instruments such as renewable energy certificates, or RECs.
In general, a REC certifies that one megawatt-hour of clean power was generated, at some point, somewhere. The current Scope 2 Standard treats all RECs as interchangeable, but in reality, some RECs are far more effective than others at reducing emissions. The question now is how to improve the standard to account for these differences.
“There is no absolute truth,” Wilson Ricks, an engineering postdoctoral researcher at Princeton University and working group member, told me back in June. “I mean, there are more or less absolute truths about things like how much emissions are going into the atmosphere. But the system for how companies report a certain number, and what they’re able to claim about that number, is ultimately up to us.”
The current standard, finalized in 2015, instructs companies to report two numbers for their scope 2 emissions, based on two different methodologies. The formula for the first is straightforward: multiply the amount of electricity your facilities consume in a given year by the average emissions produced by the local power grids where you operate. This “location-based” number is a decent approximation of the carbon emitted as a result of the company’s actual energy use.
If the company buys RECs or similar market-based instruments, it can also calculate its “market-based” emissions. Under the 2015 standard, if a company consumed 100 megawatt-hours in a year and bought 100 megawatt-hours’ worth of certificates from a solar farm, it could report that its scope 2 emissions, under the market-based method, were zero. This is what enables companies to claim they “run on 100% renewable energy.”
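To make the arithmetic concrete, here is a minimal sketch of the two methods in Python. Every number is invented for illustration; the 0.4-ton grid emission factor and the variable names are assumptions, not figures from the standard.

```python
# Toy comparison of the two scope 2 methods under the 2015 standard.
# All figures below are hypothetical.

consumption_mwh = 100            # electricity the company consumed this year
grid_factor_tco2_per_mwh = 0.4   # assumed average emissions rate of the local grid
recs_purchased_mwh = 100         # certificates bought from a solar farm

# Location-based: consumption times the local grid's average emissions rate.
location_based_tco2 = consumption_mwh * grid_factor_tco2_per_mwh

# Market-based (2015 rules): each REC cancels out one megawatt-hour of
# consumption, regardless of where or when the clean power was generated.
unmatched_mwh = max(consumption_mwh - recs_purchased_mwh, 0)
market_based_tco2 = unmatched_mwh * grid_factor_tco2_per_mwh

print(location_based_tco2)  # 40.0 tCO2
print(market_based_tco2)    # 0.0 tCO2 -- the basis of a "100% renewable" claim
```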
RECs are fundamentally different from carbon offsets, in that they do not certify that any specific amount of emissions has been prevented. They can cut carbon indirectly by creating an additional revenue stream for renewable energy projects. But when a company buys RECs from a solar project in California, where the grid is saturated with solar, it will do less to reduce emissions than if it bought RECs from a solar project in Wyoming, where the grid is still largely powered by coal, or from a battery storage project in California, which can produce clean power at night.
There are other ways RECs can vary — for instance, companies can buy them directly from power producers by means of a long-term contract, or as one-off purchases on the spot market. Spot market REC purchases are generally less effective at displacing fossil fuels because they’re more likely to come from pre-existing wind and solar farms — sometimes ones that have been operating for years and would continue to do so with or without REC sales. Long-term contracts, by contrast, can help get new clean energy projects built, because the guaranteed revenue helps developers secure financing. (There are exceptions to these rules, but these are broadly the dynamics.)
All this is to say that the current standard allows for two companies that consumed the same amount of power and bought the same number of RECs to report that they have “zero emissions,” even if one helped reduce emissions by a lot and the other did little to nothing. Almost everyone agrees the situation can be improved. The question is how.
The proposal set for public comment next month introduces more granularity to the rules around RECs. Instead of tallying up annual aggregate energy use, companies would have to tally it up by hour and location. For purchased RECs to lower a company’s scope 2 footprint, they would have to be generated within the same grid region as the company’s operations and matched to a distinct hour of consumption. (This “hourly matching” approach may sound familiar to anyone who followed the fight over the green hydrogen tax credit rules.)
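As a rough illustration of what hourly, same-region matching would mean in practice, here is a simplified sketch. The hourly records, region labels, and the simple matching rule are assumptions made for illustration, not the draft standard’s actual procedure.

```python
# Simplified sketch of hourly, same-region REC matching (illustrative only).
# Each record is (hour, grid region, megawatt-hours).

consumption = [
    (0, "PJM", 10), (1, "PJM", 10), (2, "PJM", 10),
]
recs = [
    (0, "PJM", 10),    # same hour, same region: qualifies
    (1, "ERCOT", 10),  # wrong region: does not qualify
    (3, "PJM", 10),    # wrong hour: does not qualify
]

# Pool the certificates by (hour, region).
rec_pool = {}
for hour, region, mwh in recs:
    rec_pool[(hour, region)] = rec_pool.get((hour, region), 0) + mwh

# Match each hour of consumption against certificates from the same hour and region.
matched, total = 0.0, 0.0
for hour, region, mwh in consumption:
    total += mwh
    available = rec_pool.get((hour, region), 0)
    used = min(mwh, available)
    matched += used
    rec_pool[(hour, region)] = available - used

print(f"Hourly/locationally matched: {matched / total:.0%}")
# 33% here, versus 100% under annual matching, where all 30 MWh of RECs would count.
```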
Proponents see this as a way to make companies’ claims more credible — businesses would no longer be able to say they were using solar power at night, or wind power generated in Texas to supply a factory in Maine. While companies would still not be literally consuming the power from the RECs they buy, it would at least be theoretically possible that they could be. “It’s really, in my view, taking how we do electricity accounting back to some fundamentals of how the power system itself works,” Killian Daly, executive director of the nonprofit EnergyTag, which advocates for hourly matching, told me.
The granularity camp also argues that these rules create better incentives. Today, companies mostly buy solar RECs because they’re cheap and abundant. But solar alone can’t get us to zero emissions electricity, Ricks told me. Hourly matching will force companies to consider signing contracts with energy storage and geothermal projects, for example, or reducing their energy use during times when there’s less clean energy available. “It incentivizes the actions and investments in the technologies and business practices that will be needed to actually finish the job of decarbonizing grids,” he said.
While the standard is technically voluntary, companies that object to the revision will likely be stuck with it, as governments in California and Europe have started to integrate the Greenhouse Gas Protocol’s methodologies into their mandatory corporate disclosure rules.
The proposal’s critics, however, contend that time and location matching will be so costly and difficult to implement that it may lead companies to simply stop buying clean energy. One analysis by the electricity data science nonprofit WattTime found that the draft revision could increase emissions compared to the status quo if it causes a decline in corporate clean power procurement. “We’re looking at a potentially really catastrophic failure of the renewable energy market,” Gavin McCormick, the co-founder and executive director of WattTime, told me.
Another concern is that companies with operations in multiple regions could shift from signing long-term contracts for RECs, often called power purchase agreements, to relying on the spot market. These contracts must be large to be beneficial for developers, because negotiating multiple offtake agreements for a single renewable energy project increases costs and risk. Such deals may still make sense for big energy users like data centers, but a company like Starbucks, with cafes spread across the country and around the world, would have to source smaller volumes of RECs in many more places to cover everywhere it operates.
The granularity fans assert that their proposal will not be as challenging or expensive as critics claim — and regardless, they argue, real decarbonization is difficult. It should be hard for companies to make bold claims like saying they are 100% clean, Daly told me. “We need to get to a place where companies can be celebrated for being like, I’m not 100% matched, but I will be in five years,” he said.
The proposal does include carve-outs allowing smaller companies to continue to use annual matching and for legacy clean energy contracts, even if they don’t meet hourly or location requirements. But critics like McCormick argue that the whole point of revising the standard is to help catalyze greater emission reductions. Less participation in the market would hurt that goal — but more than that, critics say, the proposed rules aren’t designed to measure a purchase’s actual emissions impact, let alone maximize real-world emission reductions. You could still have one company that spends the time and money to invest in scarce resources at odd hours and achieves 60% clean power, while another achieves the same proportion by continuing to buy abundant solar RECs. Both would still get to claim the same sustainability laurels.
The biggest corporate defender of time and location matching is Google. On the other side are tech giants Meta and Amazon, among others, arguing for an approach more explicitly focused on emissions. They want the Greenhouse Gas Protocol to endorse a different accounting scheme that measures the fossil fuel emissions displaced by a given clean energy purchase and allows companies to subtract that amount from their total scope 2 footprint — much more akin to the way carbon offsets work.
If done right, this method would recognize the difference between a solar REC in California and one in Wyoming. It would give companies more flexibility, potentially deploying capital to less developed parts of the world that need help to decarbonize. It could also, eventually, encourage investment in less mature and therefore more expensive resources, like energy storage and geothermal — although perhaps not until there are solar panels on every corner of the globe.
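A minimal sketch of what that displacement-style accounting might look like, assuming made-up marginal emission factors (a real analysis would rest on detailed, location- and time-specific grid data):

```python
# Illustrative avoided-emissions ("consequential") accounting: the credit is
# the fossil generation a purchase displaces. The marginal emission factors
# below are invented for illustration, not real grid data.

rec_mwh = 100  # clean megawatt-hours purchased

marginal_tco2_per_mwh = {
    "Solar in a solar-saturated grid (e.g. midday California)": 0.05,
    "Solar in a coal-heavy grid (e.g. Wyoming)": 0.85,
    "Battery storage discharging at night in California": 0.45,
}

for purchase, factor in marginal_tco2_per_mwh.items():
    avoided = rec_mwh * factor
    print(f"{purchase}: ~{avoided:.0f} tCO2 avoided")
```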
This idea, too, is risky. Calculating the real-world emissions impact of a REC, which the scope 2 working group calls “consequential accounting,” is an exercise in counterfactuals. It requires making assumptions about what the world would have looked like if the REC hadn’t been purchased, in both the near term and the long term. Would the clean energy have been generated anyway?
McCormick, who is a proponent of this emissions-focused approach, argues that it’s possible to measure the counterfactual in the electricity market with greater certainty than with something like forestry carbon offsets. With electricity, he told me, “there's five-minute-level data for almost every power plant in the world, as opposed to forests. If you're lucky, you measure some forests, once a year. It's like a factor of 10,000 times more data, so all the models are more accurate.”
Some granularity proponents, including Ricks, agree that consequential accounting is valuable and could have a place in corporate reporting, but worry that it’s ripe for abuse. “At the end of the day, you can't ever verify whether the system you're using to assign a given company a given number is right, because you can't observe that counterfactual world,” he said. “We need to be very cautious about how it’s designed, and also how companies actually report what they’re doing and what level of confidence is communicated.”
Both proposals are flawed, and both have the potential to allow at least some companies to claim progress on paper while having little real-world impact. In some ways, the disagreement is more philosophical than scientific. What should this standard be trying to achieve? Should it be steering corporate dollars into clean energy, accuracy of claims be damned? Or should it be protecting companies from accusations of greenwashing? Which do we care about more, faster emissions reductions or strategic decarbonization?
“They’re actually not opposing views,” McCormick told me. “There’s these people making this point and there’s these people making this point. They’re running into each other, but they’re actually not saying opposite things.”
To Michael Gillenwater, executive director of the Greenhouse Gas Management Institute, a carbon accounting research and training nonprofit, people are attempting to hide policy questions within the logic and principles of accounting. “We’re asking the emissions inventories to do too much — to do more than they can — and therefore we end up with a mess,” he told me. Corporate disclosures serve many different purposes — helping investors assess risk, informing a company’s internal target setting and performance tracking, creating transparency for consumers. “A corporate inventory might be one little piece of that puzzle,” he said.
Gillenwater is among those who think the working group’s time- and location-matching proposal would stifle corporate investment in clean energy when the goal should be to foster it. But his preferred solution is to forget trying to come up with a single metric and instead encourage companies to make multiple disclosures. Companies could publish their location-based greenhouse gas inventory and then use market-based accounting to make a separate “mitigation intervention statement.” To sum it up, Gillenwater said, “keep the emissions inventory clean.”
The risk there is that the public — or indeed anyone not deeply versed in these nuances — will not understand the difference. That’s why Brander, the Edinburgh professor, argues that regardless of how it all shakes out, the Greenhouse Gas Protocol itself needs to provide more explicit guidance on what these numbers mean and how companies are allowed to talk about them.
“At the moment, the current proposals don’t include any text on how to interpret the numbers,” he said. “It’s almost incredible, really, for an accounting standard to say, here’s a number, but we’re not going to tell you how to interpret it. It’s really problematic.”
All this pushback may prompt changes. After the upcoming comment period closes in late November or early December, the working group could decide to revise the proposal and send it out for public consultation again. The entire revision process isn’t expected to be completed until the end of 2027 at the earliest.
With wind and solar tax credits scheduled to sunset around then, voluntary action by companies will take on even greater importance in shaping the clean energy transition. While in theory, the Greenhouse Gas Protocol solely develops accounting rules and does not force companies to take any particular action, it’s undeniable that its decisions will set the stage for the next chapter of decarbonization. That chapter could either be about solving for round-the-clock clean power, or just trying to keep corporate clean energy investment flowing and growing, hopefully with higher integrity.
How America’s one-time leader in designing small modular nuclear reactors missed out on $800 million.
When Congress earmarked $800 million in the 2021 bipartisan infrastructure law to finance the deployment of the United States’ first small modular reactors, there was one obvious recipient lawmakers and industry alike had in mind: NuScale Power.
The Oregon-based company had honed its reactor to meet the 21st century nuclear industry’s needs. The design, completed in the years after the Fukushima disaster in Japan, rendered a similar meltdown virtually impossible. The output, equal to 50 megawatts of electricity, meant that developers would need to install the reactors in packs, which would hasten the rate of learning and bring down costs in much the same way assembly line repetition made solar, wind, and batteries cheap. In mid-2022, the Nuclear Regulatory Commission certified NuScale’s design, making the company’s reactor the first — and so far only — SMR to win federal approval. Seeing NuScale as its champion, the Department of Energy plowed at least $583 million into what was supposed to be the company’s first deployment. To slap an exclamation point on its preeminence, NuScale picked the ticker “SMR” when it went public on the New York Stock Exchange that year.
That September, I toured the shuttered Oyster Creek nuclear plant in New Jersey, where a very different kind of nuclear company, decommissioning specialist Holtec International, was considering building the first of its own as-yet-unapproved SMRs as part of an effort to get into the energy generation game. Holtec’s trajectory to becoming an active nuclear plant operator seemed all but certain, but a former employee cast serious doubt on whether it would end up producing its own reactors. “NuScale is at the front of the line right now,” the former Holtec employee told me at the time. “It’s more realistic to bet your horses on that.”
But forerunners are not always frontrunners. When the Energy Department finally awarded that $800 million earlier this month to two different reactor companies, neither one was NuScale.
Splitting the funding between two projects, the agency gave $400 million to build GE Vernova Hitachi Nuclear Energy’s 300-megawatt BWRX-300 reactor at the Tennessee Valley Authority’s Clinch River site, just south of Oak Ridge. The other $400 million went to Holtec to fund the expansion of the Palisades nuclear plant in western Michigan using the company’s own 300-megawatt SMR-300 reactor — the same one I saw it prepping for in New Jersey.
“I call it the eff NuScale award,” one industry source, who previously worked at NuScale and requested anonymity to speak frankly about the company, told me, using slightly more colorful language.
NuScale declined my request for an interview.
Spun out of research at Oregon State University and the Idaho National Laboratory in 2007, NuScale appeared at the peak of the last attempt at a nuclear renaissance, when the Bush administration planned to build dozens of new reactors to meet the country’s needs for clean electricity. That just two large reactors conceived at that time were ever completed — the pair of gigawatt-sized Westinghouse AP1000s at Southern Company’s Alvin W. Vogtle Electric Generating Plant, finished over the past two years — seemed to justify NuScale’s smaller approach.
Since America’s first commercial nuclear plant, Pennsylvania’s Shippingport, came online in December 1957, reactors have been bespoke megaprojects, each designed for particular needs and geological conditions. Atomic energy projects regularly went over budget. In the 1960s and 1970s, when the majority of the nation’s 94 operating reactors were built, that didn’t matter. Utilities were vertically integrated monopolies that controlled the power plants, the distribution lines, and sales to ratepayers. Cost overruns on power stations were offset by profits in other divisions. As appliances such as dishwashers, washing machines, and air conditioners relieved the tedium of managing American households, electricity sales climbed and made billion-dollar nuclear projects manageable.
In the 1990s, however, the Clinton-era drive to end big government brought the market’s efficient logic to the electric grid, which was supposed to bring down rates by making power plants compete against each other. The practical effect was to render a years-long endeavor with steep upfront costs, such as building a nuclear plant, virtually impossible to justify in markets where gas plants, solar farms, and wind turbines could come online faster and cheaper. That those energy sources wouldn’t last as long or provide as much electricity as nuclear reactors did not enter into the calculus.
SMRs were supposed to solve that dilemma. The most common metaphor came from aerospace: Traditional nuclear plants were built to local specs, like airports, whereas SMRs would be built like airplanes rolling off the factory floor. A utility looking to generate a gigawatt of electricity could build one AP1000, or it could buy 20 of NuScale’s 50-megawatt units. Vogtle Unit 4, which came online last year, ended up costing 30% less than Vogtle Unit 3, the debut AP1000 that started up in 2023, since it could rely on the previous unit’s design and supply chain. If NuScale’s reactors followed the same trajectory, the cost savings by the time the 20th reactor came online would be stupendous.
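As a back-of-the-envelope illustration of that logic, here is a simple learning-curve (Wright’s law) calculation. The 15% learning rate is an assumed, illustrative figure, not NuScale’s projection; real cost trajectories depend on much more than repetition.

```python
# Wright's law: each doubling of cumulative units built cuts unit cost by a
# fixed "learning rate." The rate used here is an assumption for illustration.
import math

first_unit_cost = 1.0   # normalized cost of the first reactor
learning_rate = 0.15    # assumed 15% cost decline per doubling of units built

exponent = math.log2(1 - learning_rate)
cost_20th = first_unit_cost * 20 ** exponent

print(f"20th unit costs roughly {cost_20th:.0%} of the first")  # ~50% under these assumptions
```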
But what works on paper doesn’t always pan out in concrete. In November 2023, a few months after Vogtle Unit 3 entered service, NuScale’s first project — a half-dozen reactors near the Idaho National Laboratory, meant to sell electricity to a network of municipal power companies in Utah — collapsed as inflation ballooned costs.
The company seemingly hasn’t been able to catch a break since then. Last year, the U.S. Export-Import Bank approved a loan to fund construction of a NuScale project in Romania; in August, the company announced that a final investment decision on the plant near Bucharest could be delayed until 2027. Over the summer, a project developer in Idaho floated the idea of building NuScale reactors at the site of a giant wind farm the Trump administration canceled. But NuScale denied any involvement in an email to me at the time, and nothing has yet come of it.
The company has lately shown some green shoots, however. The NRC approved an upgrade to NuScale’s design in July, raising the output to 77 megawatts to make the reactor roughly 50% more powerful. In September, NuScale’s exclusive development partner, Entra1, inked a deal with the TVA to build up to six of its reactors at one of the federal utility’s sites in southeastern Tennessee.
“It’s too early to discount NuScale,” Chris Gadomski, the lead nuclear analyst at the consultancy BloombergNEF, told me.
But the TVA project was also too early-stage for the Energy Department to make a bet, experts told me.
“This isn’t necessarily the government picking winners here as much as the market is supporting projects at these two sites, at least pending government approval,” Adam Stein, the director of nuclear energy innovation at the think tank Breakthrough Institute, said. “The government is supporting projects the market has already considered.”
By contrast, GE-Hitachi’s Clinch River project has been in the works for nearly four years. The BWRX-300 has other advantages. GE-Hitachi — a joint venture between the American energy-equipment giant GE Vernova and the Japanese industrial behemoth Hitachi — has decades of experience in the nuclear space. Indeed, a third of the reactors in the U.S. fleet are boiling water reactors, the design GE pioneered in the mid-20th century and updated as an SMR with the BWRX-300. Making the technology more appealing is the fact that Ontario Power Generation is building the first BWRX-300, meaning the state-owned utility in Canada’s most populous province can work out the kinks, allowing the TVA’s project to piggyback on the lessons learned.
While Holtec may be a newcomer to nuclear generation, the company has manufactured specialized containers to store spent reactor fuel for more than three decades, giving it deep experience with nuclear projects. Holtec is also close to bringing the single reactor at the Palisades plant back online, which would mark the first time a shuttered nuclear plant has returned to regular operation in the U.S. Like NuScale’s, Holtec’s SMR is based on the pressurized water reactor design that makes up nearly 70% of the U.S. fleet.
The point is, both companies have existing nuclear businesses that lay the groundwork for becoming SMR vendors. “GE is a nuclear fuel and services business and Holtec is a nuclear waste services and decommissioning business. That’s what they live on,” the former NuScale employee told me. “NuScale lives on the thoughts, prayers, and good graces of investors.”
Shares of NuScale today trade at roughly double the price of its initial public offering, which is at least in part a reflection of the feverish stock surges for SMR companies over the past year. The artificial intelligence boom has spurred intense excitement on Wall Street for nuclear power, but many of the established companies in the industry are not publicly traded — Westinghouse, GE-Hitachi, and Holtec are all privately held. That could be an advantage. Last month, the share prices of most major SMR companies plunged in what the journalist Robert Bryce said was a sign that the “hype over SMRs is colliding with the realities of the marketplace.” NuScale saw the steepest drop.
But Brett Rampal, a nuclear analyst at the consultancy Veriten, said NuScale’s “current focus around its relationship with Entra1” could make the company more nimble than its rivals because it can “pursue potential projects absent a direct utility customer, like GE, or owning the asset themselves, like Holtec.”
One factor the market apparently isn’t considering yet: whether the type of SMR that NuScale, GE-Hitachi, and Holtec are designing actually pencils out.
The Energy Department’s funding was designed for third-generation SMRs, meaning shrunk-down, less powerful versions of light water reactors, an umbrella category that includes both boiling and pressurized water reactors. The option to go smaller existed in the heyday of nuclear construction in the 1970s, but developers at that time found that larger reactors delivered economies of scale that made more financial sense. Neither Russia, the world’s top nuclear exporter and the only country to deploy an SMR so far, nor China, the nation building the most new atomic power plants by far, including an SMR, has filled its order books with smaller reactors. Instead, the leading Chinese design is actually a bigger, more powerful version of the AP1000.
Calculations from the Massachusetts Institute of Technology estimate that the first BWRX-300 will cost significantly more than another AP1000, given that the GE-Hitachi model has yet to be built and the Westinghouse reactor has an established design and supply chain. That reality has propelled growing interest in building large-scale reactors again in the U.S. In October, the Department of Commerce brokered a landmark deal to spend $80 billion on 10 new AP1000s. This week, Westinghouse’s majority owner Brookfield inked a deal to complete construction on the aborted VC Summer AP1000 project in South Carolina.
At the same time, the Energy Department has kicked off a pilot program designed to hasten deployment of fourth-generation reactors, the type of technology that uses coolants other than water. Bill Gates’ molten salt-cooled reactor company, TerraPower, just cleared its final safety hurdle at the NRC for its so-called Natrium reactor, setting the stage to potentially build the nation’s first commercial fourth-generation nuclear plant in Wyoming.
“From a marketing point of view, everyone has consistently said that light water reactor SMRs will be the fastest to market,” Stein said. But the way things are going, both NuScale and its peers could get lapped yet again.
Citrine Informatics has been applying machine learning to materials discovery for years. Now more advanced models are giving the tech a big boost.
When ChatGPT launched three years ago, it became abundantly clear that the power of generative artificial intelligence had the capacity to extend far beyond clever chatbots. Companies raised huge amounts of funding based on the idea that this new, more powerful AI could solve fundamental problems in science and medicine — design new proteins, discover breakthrough drugs, or invent new battery chemistries.
Citrine Informatics, however, has largely kept its head down. The startup was founded long before the AI boom, back in 2013, with the intention of using plain old machine learning to speed up the development of more advanced, sustainable materials. These days Citrine is doing the same thing, but with neural networks and transformers, the architecture that undergirds the generative AI revolution.
“The technology transition we’re going through right now is pretty massive,” Greg Mulholland, Citrine’s founder and CEO, told me. “But the core underlying goal of the company is still the same: help scientists identify the experiments that will get them to their material outcome as fast as possible.”
Rather than developing its own novel materials, Citrine operates on a software-as-a-service model, selling its platform to companies including Rolls-Royce, EMD Electronics, and chemicals giant LyondellBasell. While a SaaS product may be less glamorous than independently discovering a breakthrough compound that enables something like a room-temperature superconductor or an ultra-high-density battery, Citrine’s approach has already surfaced commercially relevant materials across a variety of sectors, while the boldest promises of generative AI for science remain distant dreams.
“You can think of it as science versus engineering,” Mulholland told me. “A lot of science is being done. Citrine is definitely the best in kind of taking it to the engineering level and coming to a product outcome rather than a scientific discovery.” Citrine has helped to develop everything from bio-based lotion ingredients to replace petrochemical-derived ones, to plastic-free detergents, to more sustainable fire-resistant home insulation, to PFAS-free food packaging, to UV-resistant paints.
On Wednesday, the company unveiled two new platform capabilities that it says will take its approach to the next level. The first is essentially an advanced LLM-powered filing system that organizes and structures unwieldy materials and chemicals datasets from across a company. The second is an AI framework informed by an extensive repository of chemistry, physics, and materials knowledge. It can ingest a company’s existing data, and, even if the overall volume is small, use it to create a list of hundreds of potential new materials optimized for factors such as sustainability, durability, weight, manufacturability, or whatever other outcomes the company is targeting.
The platform is neither purely generative nor purely predictive. Instead, Mulholland explained, companies can choose to use Citrine’s tools “in a more generative mode” if they want to explore broadly and open up the field of possible materials discoveries, or in a more “optimized” mode that stays narrowly focused on the parameters they set. “What we find is you need a healthy blend of the two,” he told me.
The novel compounds the model spits out still need to be synthesized and tested by humans. “What I tell people is, any plane made of materials designed exclusively by Citrine and never tested is not a plane I’m getting on,” Mulholland told me. The goal isn’t to achieve perfection right out of the lab, but rather to optimize the experiments companies end up having to do. “We still need to prove materials in the real world, because the real world will complicate it.”
Indeed it will. For one thing, while AI is capable of churning out millions of hypothetical materials — as a tool developed by Google DeepMind did in 2023 — materials scientists have since shown that many are just variants of known compounds, while others are unstable, unable to be synthesized, or otherwise irrelevant under real world conditions.
Such failures likely stem, in part, from another common limitation of AI models trained solely on publicly available materials and chemicals data: Academic research tends to report only successful outcomes, omitting data on what didn’t work and which compounds weren’t viable. That can lead models to be overly optimistic about the magnitude and potential of possible materials solutions and generate unrealistic “discoveries” that may have already been tested and rejected.
Because Citrine’s platform is deployed within customer organizations, it can largely sidestep this problem by tuning its model on niche, proprietary datasets. These datasets are small when compared with the vast public repositories used to train Citrine’s base model, but the granular information they contain about prior experiments — both successes and failures — has proven critical to bringing new discoveries to market.
While the holy grail for materials science may be a model trained on all the world’s relevant data — public and private, positive and negative — at this point that’s just a fantasy, one of Citrine’s investors, Mark Cupta of Prelude Ventures, told me over email. “It’s hard to get buy-in from the entire material development world to make an open-source model that pulls in data from across the field.”
Citrine’s last raise, which Prelude co-led, came at the very beginning of 2023, as the AI wave was still gathering momentum. But Mulholland said there’s no rush to raise additional capital — in fact, he expects Citrine to turn a profit in the next year or so.
That milestone would strongly validate the company’s strategy, which banks on steady revenue from its subscription-based model to compensate for the fact that it doesn’t own the intellectual property for the materials it helps develop. While Mulholland told me that many players in this space are trying to “invent new materials and patent them and try to sell them like drugs,” Citrine is able to “invent things much more quickly, in a more realistic way than the pie in the sky, hoping for a Nobel Prize [approach].”
Citrine is also careful to ensure that its model accounts for real-world constraints such as regulations and production bottlenecks. Say a materials company is creating an aluminum alloy for an automaker, Mulholland explained — it might be critical to stay within certain elemental bounds. If the company were to add in novel elements, the automaker would likely want to put its new compound through a rigorous testing process, which would be annoying if it’s looking to get to market as quickly as possible. Better, perhaps, to tinker around the edges of what’s well understood.
In fact, Mulholland told me it’s often these marginal improvements that initially bring customers into the fold, convincing them that this whole AI-for-materials thing is more than just hype. “The first project is almost always like, make the adhesive a little bit stickier — because that’s a good way to prove to these skeptical scientists that AI is real and here to stay,” he said. “And then they use that as justification to invest further and further back in their product development pipeline, such that their whole product portfolio can be optimized by AI.”
Overall, the company says that its new framework can speed up materials development by 80%. So while Mulholland and Citrine overall may not be going for the Nobel in Chemistry, don’t doubt for a second that they’re trying to lead a fundamental shift in the way consumer products are designed.
“I’m as bullish as I can possibly be on AI in science,” Mulholland told me. “It is the most exciting time to be a scientist since Newton. But I think that the gap between scientific discovery and realized business is much larger than a lot of AI folks think.”
Plus more insights from Heatmap’s latest event in Washington, D.C.
At Heatmap’s event, “Supercharging the Grid,” two members of the House of Representatives — a California Democrat and a Colorado Republican — talked about their shared political fight to loosen implementation of the National Environmental Policy Act to accelerate energy deployment.
Representatives Gabe Evans and Scott Peters spoke with Heatmap’s Robinson Meyer at the Washington, D.C., gathering about how permitting reform is faring in Congress.
“The game in the 1970s was to stop things, but if you’re a climate activist now, the game is to build things,” said Peters, who worked as an environmental lawyer for many years. “My proposal is, get out of the way of everything and we win. Renewables win. And NEPA is a big delay.”
NEPA requires that the federal government review the environmental implications of its actions before finalizing them, permitting decisions included. The 55-year-old environmental law has already undergone several rounds of reform, including efforts under both Presidents Biden and Trump to remove redundancies and reduce the size and scope of environmental analyses conducted under the law. But bottlenecks remain — completing the highest level of review under the law still takes four-and-a-half years, on average. Just before Thanksgiving, the House Committee on Natural Resources advanced the SPEED Act, which aims to ease that congestion by creating shortcuts for environmental reviews, limiting judicial review of the final assessments, and preventing current and future presidents from arbitrarily rescinding permits, subject to certain exceptions.
Evans framed the problem in terms of keeping up with countries like China on building energy infrastructure. “I’ve seen how other parts of the world produce energy, produce other things,” said Evans. “We build things cleaner and more responsibly here than really anywhere else on the planet.”
Both representatives agreed that the SPEED Act on its own wouldn’t solve all the United States’ energy issues. Peters hinted at other permitting legislation in the works.
“We want to take that SPEED Act on the NEPA reform and marry it with specific energy reforms, including transmission,” said Peters.
Next, Neil Chatterjee, a former Commissioner of the Federal Energy Regulatory Commission, explained to Rob another regulatory change that could affect the pace of energy infrastructure buildout: a directive from the Department of Energy to FERC to come up with better ways of connecting large new sources of electricity demand — i.e. data centers — to the grid.
“This issue is all about data centers and AI, but it goes beyond data centers and AI,” said Chatterjee. “It deals with all of the pressures that we are seeing in terms of demand from the grid from cloud computing and quantum computing, streaming services, crypto and Bitcoin mining, reshoring of manufacturing, vehicle electrification, building electrification, semiconductor manufacturing.”
Chatterjee argued that navigating load growth to support AI data centers should be a bipartisan issue. He expressed hope that AI could help bridge the partisan divide.
“We have become mired in this politics of, if you’re for fossil fuels, you are of the political right. If you’re for clean energy and climate solutions, you’re the political left,” he said. “I think AI is going to be the thing that busts us out of it.”
Updating and upgrading the grid to accommodate data centers has grown more urgent in the face of drastically rising electricity demand projections.
Marsden Hanna, Google’s head of energy and climate policy, told Heatmap’s Jillian Goodman that the company is eyeing transmission technology to connect its own data centers to the grid faster.
“We looked at advanced transmission technologies, high-performance conductors,” said Hanna. “We see that really as just an incredibly rapid, no-brainer opportunity.”
Advanced transmission technologies, otherwise known as ATTs, could help expand the existing grid’s capacity, freeing up space for some of the load growth that economy-wide electrification and data centers would require. Building new transmission lines, however, requires permits — the central issue that panelists kept returning to throughout the event.
Devin Hartman, director of energy and environmental policy at the R Street Institute, told Jillian that investors are nervous that already-approved permits could be revoked — something the solar industry has struggled with under the Trump administration.
“Half the battle now is not just getting the permits on time and getting projects to break ground,” said Hartman. “It’s also permitting permanence.”
This event was made possible by the American Council on Renewable Energy’s Macro Grid Initiative.