According to IPCC author Andy Reisinger, “net zero by 2050” misses some key points.

Tackling climate change is a complex puzzle. Hitting internationally agreed-upon targets to limit warming requires the world to reduce multiple types of greenhouse gases from a multiplicity of sources, on diverse timelines, and across varying levels of responsibility and control by individual, corporate, and state actors. It’s no surprise the catchphrase “net zero by 2050” has taken off.
Various initiatives have sprung up to distill this complexity for businesses and governments who want to do (or say they are doing) what the “science says” is necessary. The nonprofit Science Based Targets initiative, for example, develops standard roadmaps for companies to follow to act “in line with climate science.” The group also vets corporate plans and deems them either “science based” or not. Though entirely voluntary, SBTi’s approval has become a nearly mandatory mark of credibility. The group has validated the plans of more than 5,500 companies with more than $46 trillion in market capitalization — nearly half of the global economy.
But in a commentary published in the journal Nature last week, a group of Intergovernmental Panel on Climate Change experts argue that SBTi and other supposedly “science based” target-setting efforts misconstrue the science and are laden with value judgments. By striving to create straightforward, universal rules, they flatten more nuanced considerations of which emissions must be reduced, by whom and by when.
“We are arguing that those companies and countries that are best resourced, have the highest capacity to act, and have the highest responsibility for historical emissions, probably need to go a lot further than the global average,” Andy Reisinger, the lead author of the piece, told me.
In response to the paper, SBTi told me it “welcomes debate,” and that “robust debate is essential to accelerate corporate ambition and climate action.” The group is currently in the process of reviewing its Net-Zero Standard and remains “committed to refining our approaches to ensure they are effective in helping corporates to drive the urgent emissions reductions needed to combat the climate crisis.”
The commentary comes as SBTi’s reputation is already on shaky ground. In April, its board appeared to go rogue and said that the group would loosen its standards for the use of carbon offsets. The announcement was met first with surprise and later with fierce protest from the nonprofit’s staff and technical council, who had not been consulted. Environmental groups accused SBTi of taking the “science” out of its targets. The board later walked back its statement, saying that no change had been made to the rules, yet.
But interestingly enough, the new Nature commentary argues that SBTi’s board was actually on the right track. I spoke to Reisinger about this, and some of the other ways he thinks science based targets “miss the mark.”
Reisinger, who’s from New Zealand, was the vice-chair of the United Nations Intergovernmental Panel on Climate Change’s mega-report on climate mitigation from 2022. I caught him just as he had arrived in Sofia, Bulgaria, for a plenary that will determine the timeline for the next big batch of UN science reports. Our conversation has been edited for length and clarity.
Was there something in particular that inspired you to write this? Or were you just noticing the same issues over and over again?
There were probably several things. One is a confusion that’s quite prevalent between net zero CO2 emissions and net zero greenhouse gas emissions. The IPCC makes clear that to limit warming at any level, you need to reach net zero CO2 emissions, because it’s a long lived greenhouse gas and the warming effect accumulates in the atmosphere over time. You need deep reductions of shorter lived greenhouse gases like methane, but they don’t necessarily have to reach zero. And yet, a lot of people claim that the IPCC tells us that we have to reach net zero greenhouse gas emissions by 2050, which is simply not the case.
Of course, you can claim that there’s nothing wrong, surely, with going to net zero greenhouse gas emissions because that’s more ambitious. But there’s two problems with that. One is, if you want to use science, you have to get the science correct. You can’t just make it up and still claim to be science-based. Secondly, it creates a very uneven playing field between those who mainly have CO2 emissions and those who have non-CO2 emissions as a significant part of their emissions portfolio — which often are much harder to reduce.
Can you give an example of what you mean by that?
You can rapidly decarbonize and actually approach close to zero emissions in your energy generation, if that’s your dominant source of emissions. There are viable solutions to generate energy with very low or no emissions — renewables, predominantly. Nuclear in some circumstances.
But to give you another example, in Australia, the Meat and Livestock Association, they set a net zero target, but they subsequently realized it’s much harder to achieve it because methane emissions from livestock are very, very difficult to reduce entirely. Of course you can say, we’ll no longer produce beef. But if you’re the Cattle Association, you’re not going to rapidly morph into producing a different type of meat product. And so in that case, achieving net zero is much more challenging. Of course, you can’t lean back and say, Oh, it’s too difficult for us, therefore we shouldn’t try.
I want to walk through the three main points to your argument for why science-based targets “miss the mark.” I think we’ve just covered the first. The second is that these initiatives put everyone on the same timeline and subject them to the same rules, which you say could actually slow emissions reductions in the near term. Can you explain that?
The Science Based Targets initiative in particular, but also other initiatives that provide benchmarks for companies, tend to want to limit the use of offsets, where a company finances emission reductions elsewhere and claims them to achieve their own targets. And there’s very good reasons for that, because there’s a lot of greenwashing going on. Some offsets have very low integrity.
At the same time, if you set a universal rule that all offsets are bad and unscientific, you’re making a major mistake. Offsets are a way of generating financial flows towards those with less intrinsic capacity to reduce their emissions. So by making companies focus only on their own reductions, you basically cut off financial flows that could stimulate emission reductions elsewhere or generate carbon dioxide removals. Then you’re creating a problem for later on in the future, when we desperately need more carbon dioxide removal and haven’t built up the infrastructure or the accountability systems that would allow that.
As you know, there’s a lot of controversy about this right now. There are many scientists who disagree with you and don’t want the Science Based Targets initiative to loosen its rules for using offsets. Why is there this split in the scientific community about this?
I think the issue arises when you think that net zero by 2050 is the unquestioned target. But if you challenge yourself to say, well net zero by 2050 might be entirely unambitious for you, you have to reduce your own emissions and invest in offsets to go far beyond net zero by 2050 — then you might get a different reaction to it.
I think everybody would agree that if offsets are being used instead of efforts to reduce emissions that are under a company’s direct control, and they can be reduced, then offsets are a really bad idea. And of course, low integrity offsets are always a bad idea. But the solution to the risk of low integrity cannot be to walk away from it entirely, because otherwise you’ve further reduced incentives to actually generate accountability mechanisms. So the challenge would be to drive emission reductions at the company level, and on top of that, create incentives to engage in offsets, to increase financial flows to carbon dioxide removal — both permanent and inherently non permanent — because we will need it.
My understanding is that groups like SBTi and some of these other carbon market integrity initiatives agree with what you’ve just said — even if they don’t support offsetting emissions, they do support buying carbon credits to go above and beyond emissions targets. They are already advocating for that, even if they’re not necessarily creating the incentives for it.
I mean, that’s certainly a move in the right direction. But it’s creating this artificial distinction between what the science tells you, the “science based target,” and then the voluntary effort beyond that. Whereas I think it has to become an obligation. So it’s not a distinction between, here’s what the science says, and here’s where your voluntary, generous, additional contribution to global efforts might go. It is a much more integrated package of actions.
I think we’re starting to get at the third point that your commentary makes, which is about how these so-called science-based targets are inequitable. How does that work?
There’s a rich literature on differentiating targets at the country level based on responsibility for warming, or a capacity-based approach that says, if you’re rich and we have a global problem, you have to use your wealth to help solve the global problem. Most countries don’t because the more developed you are, the more unpleasant the consequences are.
At the company level, SBTi, for example, tends to use the global or regional or sectoral average rate of reductions as the benchmark that an individual company has to follow. But not every company is average, and systems transitions follow far more complex dynamics. Some incumbents have to reduce emissions much more rapidly, or they go out of business in order to create space for innovators to come in, whose emissions might rise in the near term before they go down, but with new technologies that allow deeper reductions in the long term. Assuming a uniform rate of reduction levels out all those differences.
It’s far more challenging to translate equity into meaningful metrics at the company level. But our core argument is, just because it’s hard, that cannot mean let’s not do it. So how can we challenge companies to disclose their thinking, their justification about what is good enough?
The Science Based Targets initiative formed because previously, companies were coming up with their own interpretations of the science, and there was no easy way to assess whether these plans were legitimate. Can you really imagine a middle ground where there is still some sort of policing mechanism to say whether a given corporate target is good enough?
That’s what we try to sketch as a vision, but it certainly won’t be easy. I also want to emphasize that we’re not trying to attack SBTi in principle. It’s done a world of good. And we certainly don’t want to throw the baby out with the bathwater to just cancel the idea. It’s more to use it as a starting point. As we say in our paper, you can almost take an SBTi target as the definition of what is not sufficient if you’re a company located in the Global North or a multinational company with high access to resources — human, technology and financial.
It was a wild west before SBTi and we’re not saying let’s go back to the wild west. We’re saying the pendulum might have swung too far to a universal rule that applies to everybody, but therefore applies to nobody.
There’s one especially scathing line in this commentary. You write that these generic rules “result in a pseudo-club that inadequately challenges its self-selected members while setting prohibitive expectations for those with less than average capacity.” We’ve already talked about the second half of this statement, but what do you mean by pseudo-club?
You write a science based target as a badge of achievement, a badge of honor on your company profile, assuming that therefore you have done all that can be expected of you when it comes to climate change. Most of the companies that have adopted science based targets are located in the Global North, or operate on a multinational basis and have therefore quite similar capacity. If that’s what we’re achieving — and then there’s a large number of companies that can’t possibly, under their current capacity, set science-based targets because they simply don’t have the resources — then collectively, we will fail. Science cannot tell you whether you have done as much as you could be doing. If we let the simplistic rules dominate the conversation, then we’re not going to be as ambitious as we need to be.
How will America’s largest grid deal with the influx of electricity demand? It has until the end of the year to figure things out.
As America’s largest electricity market was deliberating over how to reform the interconnection of data centers, its independent market monitor threw a regulatory grenade into the mix. Just before the Thanksgiving holiday, the monitor filed a complaint with federal regulators saying that PJM Interconnection, which spans from Washington, D.C. to Ohio, should simply stop connecting new large data centers that it doesn’t have the capacity to serve reliably.
The complaint is just the latest development in a months-long debate involving the electricity market, power producers, utilities, elected officials, environmental activists, and consumer advocates over how to connect the deluge of data centers in PJM’s 13-state territory without further increasing consumer electricity prices.
The system has been pushed into crisis by skyrocketing prices in its capacity auctions, in which generators get paid to ensure they’re available when demand spikes. Those prices have been fueled by high-octane demand projections, with PJM’s summer peak forecast to jump from 154 gigawatts to 210 gigawatts within a decade. The 2034-35 forecast jumped 17% in just a year.
Over the past two capacity auctions, actual and forecast data center growth has been responsible for over $16.6 billion in new costs, according to PJM’s independent market monitor; by contrast, the previous year’s auction generated a mere $2.2 billion. This has translated directly into higher retail electricity prices, including 20% increases in some parts of PJM’s territory, like New Jersey. It has also raised concerns about the reliability of the whole system.
PJM wants to reform how data centers interconnect before the next capacity auction in June, but its members committee was unable to come to an agreement on a recommendation to PJM’s board during a November meeting. There were a dozen proposals, including one from the monitor; like all the others, it failed to garner the necessary two-thirds majority vote to be adopted formally.
So the monitor took its ideas straight to the top.
The market monitor’s complaint to the Federal Energy Regulatory Commission tracks closely with its plan at the November meeting. “PJM is currently proposing to allow the interconnection of large new data center loads that it cannot serve reliably and that will require load curtailments (black outs) of the data centers or of other customers at times. That result is not consistent with the basic responsibility of PJM to maintain a reliable grid and is therefore not just and reasonable,” the filing said. “Interconnecting large new data center loads when adequate capacity is not available is not providing reliable service.”
A PJM spokesperson told me, “We are still reviewing the complaint and will reserve comment at this time.”
But can its board still get a plan to FERC and avoid another blowout capacity auction?
“PJM is going to make a filing in December, no matter what. They have to get these rules in place to get to that next capacity auction in June,” Jon Gordon, policy director at Advanced Energy United, told me. “That’s what this has been about from the get-go. Nothing is going to stop PJM from filing something.”
The PJM spokesperson confirmed to me that “the board intends to act on large load additions to the system and is expected to provide an indication of its next steps over the next few weeks.” But especially after the membership’s failure to make a unified recommendation, what that proposal will be remains unclear. That has been a source of agita for the organization’s many stakeholders.
“The absence of an affirmative advisory recommendation from the Members Committee creates uncertainty as to what reforms PJM’s Board of Managers may submit to the Federal Energy Regulatory Commission (FERC), and when stakeholders can expect that submission,” analysts at ClearView Energy Partners wrote in a note to clients. In spite of PJM’s commitments, they warned that the process could “slip into January,” which would give FERC just enough time to process the submission before the next capacity auction.
One idea did attract a majority vote from PJM’s membership: Southern Maryland Electric Cooperative’s, which largely echoed the PJM board’s own plan with some amendments. That suggestion called for a “Price Responsive Demand” system, in which electricity customers would agree to reduce their usage when wholesale prices spike. The system would be voluntary, unlike an earlier PJM proposal, which foresaw forcing large customers to curtail their power. “The load elects to not take on a capacity obligation, therefore does not pay for capacity, and is required to reduce demand during stressed system conditions,” PJM explained in an update. The Southern Maryland plan tweaks the PRD system to adjust its pricing mechanism, but largely aligns with what PJM’s staff put forward.
“There’s almost no real difference between the PJM proposal and that Southern Maryland proposal,” Gordon told me.
That might please restive stakeholders, or at least be something PJM’s board could go forward with knowing that the balance of its voting membership agreed with something similar.
“We maintain our view that a final proposal could resemble the proposed solution package from PJM staff,” the ClearView note said. “We also think the Board could propose reforms to PJM’s PRD program. Indeed, as noted above, SMECO’s revisions to the service gained majority support.”
The PJM plan also included relatively uncontroversial reforms to load forecasting to cut down on duplicated requests and better share information, and an “expedited interconnection track” on which new, large-scale generation could be fast-tracked if it were signed off on by a state government “to expedite consideration of permitting and siting.”
Gordon said that the market monitor’s complaint could be read as the organization “desperately trying to get FERC to weigh in” on its side, even if PJM is more likely to go with something like its own staff-authored submission.
“The key aspect of the market monitor’s proposal was that PJM should not allow a data center to interconnect until there was enough generation to supply them,” Gordon explained. During the meeting preceding the vote, “PJM said they didn’t think they had the authority to deny someone interconnection.”
This dispute over whether the electricity system has an obligation to serve all customers has been the existential question making the debate about how to serve data centers extra angsty.
But PJM looks to be trying to sidestep that big question and nibble around the edges of reform.
“Everybody is really conflicted here,” Gordon told me. “They’re all about protecting consumers. They don’t want to see any more increases, obviously, and they want to keep the lights on. Of course, they also want data center developers in their states. It’s really hard to have all three.”
Atomic Canyon is set to announce the deal with the International Atomic Energy Agency.
Two years ago, Trey Lauderdale asked not what nuclear power could do for artificial intelligence, but what artificial intelligence could do for nuclear power.
The value of atomic power stations to provide the constant, zero-carbon electricity many data centers demand was well understood. What large language models could do to make building and operating reactors easier was less obvious. His startup, Atomic Canyon, made a first attempt at answering that by creating a program that could make the mountains of paper documents at the Diablo Canyon nuclear plant, California’s only remaining station, searchable. But Lauderdale was thinking bigger.
In September, Atomic Canyon inked a deal with the Idaho National Laboratory to start devising industry standards to test the capacity of AI software for nuclear projects, in much the same way each update to ChatGPT or Perplexity is benchmarked by the program’s ability to complete bar exams or medical tests. Now, the company’s effort is going global.
On Wednesday, Atomic Canyon is set to announce a partnership with the International Atomic Energy Agency to begin cataloging the United Nations nuclear watchdog’s data and laying the groundwork for global standards for how AI software can be used in the industry.
“We’re going to start building proof of concepts and models together, and we’re going to build a framework of what the opportunities and use cases are for AI,” Lauderdale, Atomic Canyon’s chief executive, told me on a call from his hotel room in Vienna, Austria, where the IAEA is headquartered.
The memorandum of understanding between the company and the UN agency is at an early stage, so it’s as yet unclear what international standards or guidelines could look like.
In the U.S., Atomic Canyon began making inroads earlier this year with a project backed by the Institute of Nuclear Power Operators, the Nuclear Energy Institute, and the Electric Power Research Institute to create a virtual assistant for nuclear workers.
Atomic Canyon isn’t the only company applying AI to nuclear power. Last month, nuclear giant Westinghouse unveiled new software it’s designing with Google to calculate ways to bring down the cost of key components in reactors by millions of dollars. The Nuclear Company, a startup developer that’s aiming to build fleets of reactors based on existing designs, announced a deal with the software behemoth Palantir to craft the software equivalent of what the companies described as an “Iron Man suit,” able to swiftly pull up regulatory and blueprint details for the engineers tasked with building new atomic power stations.
Lauderdale doesn’t see that as competition.
“All of that, I view as complementary,” he said.
“There is so much wood to chop in the nuclear power space, the amount of work from an administrative perspective regarding every inch of the nuclear supply chain, from how we design reactors to how we license reactors, how we regulate to how we do environmental reviews, how we construct them to how we maintain,” he added. “Every aspect of the nuclear power life cycle is going to be transformed. There’s no way one company alone could come in and say, we have a magical approach. We’re going to need multiple players.”
That Atomic Canyon is making inroads at the IAEA has the potential to significantly broaden the company’s reach. Unlike other energy sources, nuclear power is uniquely subject to international oversight as part of global efforts to prevent civilian atomic energy from bleeding over into weapons production.
The IAEA’s bylaws award particular agenda-setting powers to whatever country has the largest fleet of nuclear reactors. In the nearly seven decades since the agency’s founding, that nation has been the U.S. As such, the 30 other countries with nuclear power have largely aligned their regulations and approaches to the ones standardized in Washington. When the U.S. artificially capped the enrichment levels of traditional reactor fuel at 5%, for example, the rest of the world followed.
That could soon change, however, as China’s breakneck deployment of new reactors looks poised to vault the country ahead of the U.S. sometime in the next decade. It wouldn’t just be a symbolic milestone. China’s emergence as the world’s preeminent nuclear-powered nation would likely come with Beijing’s increased influence over other countries’ atomic energy programs. As it is, China is preparing to start exporting its reactors overseas.
The role electricity demand from the data centers powering the AI boom has played in spurring calls for new reactors is undeniable. But if AI turns out to have as big an impact on nuclear operations as Lauderdale predicts, an American company helping to establish the global guidelines could help cement U.S. influence over a potentially major new factor in how the industry works for years, if not decades to come.
Current conditions: The Northeastern U.S. is bracing for six inches of snow, including potential showers in New York City today • A broad swath of the Mountain West, from Montana through Colorado down to New Mexico, is expecting up to six inches of snow • After routinely breaking temperature records for the past three years, Guyana shattered its December high with thermometers crossing 92 degrees Fahrenheit.
The Department of Energy gave a combined $800 million to two projects to build what could be the United States’ first commercial small modular reactors. The first $400 million went to the federally owned Tennessee Valley Authority to finance construction of the country’s first BWRX-300. The project, which Heatmap’s Matthew Zeitlin called the TVA’s “big swing at small nuclear,” is meant to follow on the debut deployment of GE-Hitachi Nuclear Energy’s 300-megawatt SMR at the Darlington nuclear plant in Ontario. The second $400 million grant backed Holtec International’s plan to expand the Palisades nuclear plant in Michigan, which the company is currently working to restart, with its own 300-megawatt reactor. The funding came from a pot of money earmarked for third-generation reactors, the type that hews closely to the large light-water reactors that make up nearly all of the U.S. fleet of 94 commercial nuclear reactors. While their similarities with existing plants offer some benefits, the Trump administration has also heavily invested in incentives to spur construction of fourth-generation reactors that use coolants other than water. “Advanced light-water SMRs will give our nation the reliable, round-the-clock power we need to fuel the President’s manufacturing boom, support data centers and AI growth, and reinforce a stronger, more secure electric grid,” Secretary of Energy Chris Wright said in a statement. “These awards ensure we can deploy these reactors as soon as possible.”
You know who also wants to see more investment in SMRs? Arizona senator and rumored Democratic presidential hopeful Ruben Gallego, who released an energy plan Wednesday calling on the Energy Department to ease the “regulatory, scaling, and supply chain challenges” new reactors still face.
Since he first emerged on the political scene a decade ago, President Donald Trump has made the proverbial forgotten coal miner a central theme of his anti-establishment campaigns, vowing to correct for urbanite elites’ neglect by putting workers’ concerns at the forefront. Yet his administration is now considering overhauling black lung protections that miners lobbied federal agencies to enact and enforce. Secretary of Labor Lori Chavez-DeRemer will “reconsider and seek comments” on parts of the Biden-era silica rule that mining companies and trade groups are challenging in court, the agency told E&E News. It’s unclear how the Trump administration may seek to alter the regulation. But the rule, finalized last year, reduced the exposure limit for airborne silica crystals, which lodge deep inside miners’ lung tissue, to 50 micrograms from the previous 100-microgram limit. The rule also required companies to provide expanded medical tests to workers. Dozens of miners and medical advocates protested outside the agency’s headquarters in Washington in October to request that the rule, which is expected to prevent more than 1,000 deaths and 3,700 cases of black lung per year, be saved.
Rolling back some of the protections would be just the latest effort to gut Biden-era policy. On Wednesday, the White House invited automotive executives to attend what’s expected to be an announcement to shred fuel-efficiency standards for new vehicles, The New York Times reported late on Tuesday.
The average American spent a combined 11 hours without electricity last year as a result of extreme weather, worse outages than during any previous year going back a decade. That’s according to the latest analysis by the U.S. Energy Information Administration. Blackouts attributed to major events averaged nearly nine hours in 2024, compared to an average of roughly four hours per year from 2014 through 2023. Major hurricanes accounted for 80% of the hours without electricity in 2024.
The latest federal grants may be good news for third-generation SMRs, but one of the leading fourth-generation projects — the Bill Gates-backed TerraPower’s bid to build a sodium-cooled reactor at a former coal plant in Wyoming — just cleared the final safety hurdle for its construction permit. Calling the approval a “momentous occasion for TerraPower,” CEO Chris Levesque said the “favorable safety evaluation from the U.S. Nuclear Regulatory Commission reflects years of rigorous evaluation, thoughtful collaboration with the NRC, and an unwavering commitment to both safety and innovation.”
TerraPower’s project in Kemmerer, Wyoming, is meant to demonstrate the company’s reactors, which are designed to store energy for when it’s needed — making them uniquely complementary to grids with large amounts of wind and solar — and to avoid the possibility of a meltdown. Still, at a private lunch I attended in October, Gates warned that the U.S. is falling behind China on nuclear power. China is charging ahead on all energy fronts. On Tuesday, Bloomberg reported that China had started up a domestically produced gas turbine for the first time as the country seeks to compete with the U.S. on even the fossil fuels American producers dominate.
It’s been a rough year for green hydrogen projects, as the high cost of producing the zero-carbon fuel from renewable electricity and water makes finding customers difficult. Blue hydrogen, the version of the fuel made from natural gas paired with carbon capture equipment, isn’t doing much better. Last month, Exxon Mobil Corp. abandoned plans to build what would have been one of the world’s largest hydrogen production plants in Baytown, Texas. This week, BP withdrew from a blue hydrogen project in England. At issue are strict new standards in the European Union for how much carbon blue hydrogen plants would need to capture to qualify as clean.
You’re not the only one accidentally ingesting loads of microplastics. New research suggests crickets can’t tell the difference between tiny bits of plastics and natural food sources. Evidence shows that crickets can break down microplastics into smaller nanoplastics — which may be even worse in the environment since they’re more easily eaten or absorbed by other lifeforms.