Of all the imaginative ways to die in New York City — getting pushed in front of a subway car, flattened by a falling a/c unit, clocked by an exploding manhole cover, etc. — perhaps the unlikeliest is Death By Toxic Black Mold.
That hasn’t stopped me from thinking about it ... all the time. Every New Yorker seems to know someone who’s discovered the inky starbursts in their building and had months of migraines, runny noses, and sore throats snap into horrible clarity. Toxic black mold. With a name like that, how could you not be terrified?
Fungi have been a little more top-of-mind lately, though, because they’re everywhere.
I mean that beyond the literal sense that “fungi are everywhere,” which they also are: We’ve found them in Antarctica, gnawing through Shackleton and Scott’s century-old huts; at the bottom of the ocean, in multi-million-year-old mud; on antiseptically clean hospital walls; and at the site of the Chernobyl nuclear disaster. Naturally, they survive “surprisingly well” in space.
Over the past decade or so, fungi have begun to infest our stories as well. This is particularly true of horror and sci-fi, including HBO’s recent The Last of Us adaptation, which expands on the 2013 game’s fungal zombie backstory. In 2017, Star Trek: Discovery introduced the idea that the whole universe is connected by mycelia, a concept explained to viewers by the fictional astromycologist Paul Stamets — not to be confused with Eldon Stammets, the mushroom-obsessed serial killer from season one of Hannibal (2013), nor the real mycologist Paul Stamets, after whom both characters were named (Bryan Fuller, a Stamets superfan, worked on both shows). Other memorable fungal sightings in fiction include Mike Carey’s The Girl With All the Gifts (2014); multiple Jeff VanderMeers but perhaps most obviously Annihilation (2014, with a film adaptation in 2018); Silvia Moreno-Garcia’s Mexican Gothic (2020); and N. K. Jemisin’s The City We Became (2020) — though there are many more. Taking a full inventory, it can almost seem as if, over the course of about a decade, writers collectively realized fungi are the perfect monsters: efficient, unknowable, hungry.
On the one hand, of course. We’re repelled by mold and mushrooms for the same reason we’re disgusted by rats or insects: They are symbols of death, disease, and decay, a reminder that in the end, we’re nothing more than fleshy neighborhoods for “postmortem fungal communities.”
But if there is something primordial about our fungus revulsion, there is something obtuse about it, too. Our lives have been entangled with fungi’s for as long as we’ve been human. The oldest dental records ever studied, belonging to cannibalized 50,000-year-old Neanderthals, indicate ancient hominids ate “primitive penicillin,” possibly for the same medical purposes that we use the mold-derived antibiotic today. Ötzi the Iceman was wearing birch polypores on a leather thong around his neck when he died. Some (admittedly fringe) scientists even believe mushrooms were the spark that set our Homo erectus ancestors on their journey to the higher consciousness of Homo sapiens.
What, then, soured in our multi-millennia-long human-fungus relationship to make us — as mycologist David Arora puts it — the “fungophobic society” we are today? The medical community’s acceptance of germ theory, and our modern obsession with cleanliness, are components, surely.
There is another possibility, too: The closer we’ve looked at fungi, the stranger they reveal themselves to be, and the richer and more possible our wildest fictions become.
Mushrooms might seem to sprout abruptly and at random. But in truth, they’re just the visible fruiting body of a much larger subterranean organism. Great speculative fiction works much the same way: While a story can appear to have sprouted from nothing, it’s been fed, just below the surface, by a tangle of science, headlines, and current events.
In the aftermath of the Hiroshima bombing in 1945, for example, fiction warped the horrors of nuclear science for films like Godzilla (1954), Them! (1954), and Tarantula (1955). And after the moon landing in 1969, Star Wars (1977), Close Encounters of the Third Kind (1977), and Alien (1979) all wondered: Who else might be up there?
When it comes to mycology, though, science is still getting started. Fungi didn’t even become their own taxonomic kingdom until 1969; before then, scientists just thought they were really weird plants.
Westerners have long approached fungi with suspicion. “The fields were spotted with monstrous fungi of a size and colour never matched before … Death sprang also from the water-soaked earth,” Arthur Conan Doyle wrote in Sir Nigel (1905-06), using fungi as an ominous mood-setter. Edgar Allan Poe wasn’t a fan either: “Minute fungi overspread the whole exterior” of the House of Usher, he wrote in 1839, “hanging in a fine tangled web-work from the eaves.” Folk explanations posited that mushrooms shot from the ground where lightning struck, and “a vast body of Victorian fairy lore connected mushrooms and toadstools with elves, pixies, hollow hills, and the unwitting transport of subjects to fairyland,” explains Mike Jay in The Public Domain Review.
Brits were especially revolted by the “pariahs of the plant world,” to the great disappointment of R.T. Rolfe, who penned a rousing 1925 defense titled The Romance of the Fungus World. In Shakespeare’s day, it was questionable whether mushrooms were even safely edible; “a hogg wont touch um,” warned Edmund Gayton in his 1659 Art of Longevity. Americans inherited this wariness — “the general opinion [in the U.S. is] all forms of fungus growth are either poisonous or unwholesome,” observed one cookbook writer in 1899 — though many were beginning to come around by the late 19th century, taking cues from the more adventurous eaters of France. Not every culture has been quite so squeamish: mushrooms have long been cultivated in Asia; are a staple of Eastern European, African, and Slavic cuisines; and Indigenous groups throughout the Americas have likewise long enjoyed all that fungi have to offer.
The reevaluation of fungi in refined English society came about almost entirely by accident, via the fortuitous contamination of Alexander Fleming’s staphylococci cultures by the genus Penicillium in 1928. Still, it wouldn’t be until the second half of the 20th century when fungus science really started to get weird — even weirder, you might say, than fiction.
Because the fungi, it appeared, were talking to each other.
When ecologist Suzanne Simard captured the public imagination by describing in a 1997 issue of Nature how trees use webs of underground fungi to communicate with each other, networks — conceptually — were already having a moment. The internet, and the “network of cables and routers” that comprised it, had been around since the 1970s, mycologist Merlin Sheldrake explains in Entangled Life, but when the World Wide Web became available to users in 1991, network science started informing everything from epidemiology to neuroscience. Nature tapped into this buzz by coining the “Wood Wide Web” on its cover to describe Simard’s research, and in doing so, mesmerizingly blurred science-fiction, tech, and biology.
The oft-quoted theory of the Wood Wide Web suggests that fungal threads called mycelium colonize root systems of forest trees, and in doing so, facilitate the exchange of defense signals and other “wisdom” by moving nutrients between plants. “Mother” trees, for example, can supposedly nurture saplings in their communities by shipping excess carbon via fungi. Reviewer Philip Ball went as far as to marvel in Prospect, after reading an account of these and other systems in Sheldrake’s Entangled Life, that “fungi force us to reconsider what intelligence even means.” (Sheldrake’s enthusiasm for the Wood Wide Web is more restrained; he uses it disparagingly to illustrate “plant-centrism in action”).
Ball wasn’t the only one awed, though. References to the “alien language” of fungi began popping up everywhere in popular science writing, as McMaster University’s Derek Woods has observed. Paul Stamets’ Mycelium Running helped bring Simard’s research to a more general audience in 2005, while Peter Wohlleben’s The Hidden Life of Trees (2015), and Simard’s own Finding the Mother Tree (2021) followed — not to mention “dozens of imitative articles,” TED talks, documentaries, and offshoot studies. As recently as last year, The Guardian was trumpeting that “Mushrooms communicate with each other using up to 50 ‘words’.”
Some scientists have since raised doubts about the Wood Wide Web, characterizing the research as potentially “overblown” and “unproven” — but it’s a good story, isn’t it? Not to mention a rich jumping-off point for writers who were paying attention to the headlines. One can trace a line directly from Simard’s research, through Stamets’ amplification, straight to Bryan Fuller’s mycelium plane in Star Trek: Discovery.
Yet the phenomenon, as described, sounds far more Edenic than the terrifying, often sentient, man-eating, mind-controlling, city-conquering fungi that have overwhelmingly appeared in modern sci-fi and horror. Is today’s fungal antagonist just a product of those centuries of folk superstitions? Or is something else in the zeitgeist making our skin crawl?
Let’s return, for a moment, to the ways I’ve imagined dying in New York City.
Though the chances of being taken out by a subway or an unsecured a/c unit are slim, they have, tragically, actually happened. But when you start to look into Deaths by Toxic Black Mold, the picture gets a lot murkier.
Few people, verging on none, have definitively died of black mold exposure. You wouldn’t know that, though, from the headlines of the early aughts, which are peppered with celebrity lawsuits over mold, culminating in TMZ tying the mysterious 2009 and 2010 deaths of Clueless actress Brittany Murphy and her husband to mold inhalation (ultimately disproven by their autopsies).
But mold hysteria didn’t originate in Beverly Hills. It comes from Ohio. In the mid-’90s, 12 babies in Cleveland died of lung hemorrhaging, and the main suspect was an outbreak of black mold allegedly brought on by unusually heavy rains. CDC investigators found all of the afflicted infants lived in homes with bad water damage, and, in many cases, those homes also had Stachybotrys, a moisture-loving black mold. Soon, stories linking the fungus to the deaths were making national news.
Reevaluations of the outbreak later cast doubt on the correlation. In 1999, the CDC walked back its initial assessment, citing “serious shortcomings in the collection, analysis, and reporting of data.” More skepticism followed: If Stachybotrys is common wherever there is water-damaged wood, why were only babies in the Cleveland area being affected? And how do you explain that some of the babies lived in homes where no Stachybotrys was ever found?
Still, the story stuck, and the link between black mold and a whole host of health problems, including many that remain completely unproven, took root in the public consciousness. Soon, everyone was suing over black mold. “A single insurance company handled 12 cases in 1999,” mycologist Nicholas Money writes in Carpet Monsters and Killer Spores; by 2001, “the company fielded more than 10,000 claims.” The Washington Post likewise observed in 2013 that “experts say mold is not more prevalent these days; instead, we are more aware of it.”
Hypochondriacs eyeing mildew spots on their bathroom ceilings weren’t the only ones reading about deadly mold, of course. Writers were, too. And now fungi had two strikes against them: They possessed a weird alien intelligence and they were dangerous.
Then came the possibility they could control our minds.
The parasitic fungal genus Ophiocordyceps is at least 48 million years old. It has likely survived as long as it has because of its stranger-than-fiction method of propagating: Ophiocordyceps spores infect an ant and “hijack” its brain, forcing it to abandon its colony, climb a high leaf, and affix itself there with a bite. The ant then dies, still clinging to the leaf with its jaws, and the fungus sprouts out of its body, raining spores down onto other unlucky ants.
Humans turning into, or being consumed alive by, fungi had long fascinated writers (see: “The Voice in the Night” by William Hope Hodgson from 1907, or Stephen King’s 1973 “Gray Matter”). But with our increased cultural awareness of Ophiocordyceps in the 21st century, fungal mind control went from being a revolting body horror trope to a plausible sci-fi starting point. Neil Druckmann, the creative director of The Last of Us, has said he learned about the fungus from a 2008 episode of BBC’s Planet Earth, and he went on to use it as the basis for the zombies in his 2013 video game.
Though Druckmann was an early adopter of Ophiocordyceps, the fungus didn’t exactly remain obscure. “Zombie fungi are not known to use humans as hosts. At least yet,” The Columbus Dispatch wrote in 2014 (and filed, cryptically, in its “how to” section). The X-Men comics introduced “Cordyceps Jones,” a “talking parasitic fungal spore, intergalactic casino proprietor, and notorious crime boss,” as a new villain in 2021. The New York Times even saw fit to inform its readers, “After This Fungus Turns Ants Into Zombies, Their Bodies Explode.” Try scrolling past that.
Through this process of scientific discoveries, eye-catching headlines, and a little exaggeration, it took only a handful of decades for fungi to make the leap from “pariahs of the plant world” to the perfect horror villain. The climate crisis will likely be a further creative accelerant. Thanks to intensified hurricanes and flooding, mold will be an ongoing issue in homes nationwide. Plus, fungi are nothing if not survivors, and some are already pushing past the climatological boundaries — and antifungals — that used to contain them.
Even The Last of Us added an explanation in the HBO adaptation that the warming planet is what allowed Ophiocordyceps to evolve and make the leap from cooler-bodied insects to comparatively hot humans. The good news is, mycologists say this is all but impossible in real life due to the vast biological differences between humans and ants; the bad news is, a deadly fungal pandemic is absolutely possible and, shocker, experts say we’re not at all prepared for it.
At least, not institutionally. Fiction has already hashed out how Fauna vs. Funga could go in a hundred different ways. Sometimes, the fungus comes to us from outer space. Sometimes, it possesses alien sentience; other times, it just represents the indifferent efficiency of nature. Sometimes, it takes over our minds and turns us against each other. Sometimes, it brings us together to fight back.
Fiction is also beginning to wonder if those villainous fungi might just be our friends. Think of those universe-binding spores that connect us in Star Trek, or the fungal-facilitated hivemind in a popular Hugo Award-winning series, which likewise eludes a straightforward antagonist narrative. It only makes sense: If spores are intelligent colonizers, well, so are we. Maybe the next step will be to put our heads — or at least, our hyphae and neurons — together.
Because while science reveals fungi to be weirder by the day, it also further reinforces that we can’t live without them. They nourish us, heal us, relieve us, protect us, and one day, maybe, will save us.
And oh, how they entertain us.
In practice, direct lithium extraction doesn’t quite make sense yet, but 2026 could be its critical year.
Lithium isn’t like most minerals.
Unlike other battery metals such as nickel, cobalt, and manganese, which are mined from hard-rock ores using drills and explosives, the majority of the world’s lithium resources are found in underground reservoirs of extremely salty water, known as brine. And while hard-rock mining does play a major role in lithium extraction — the majority of the world’s actual production still comes from rocks — brine mining is usually significantly cheaper, and is thus highly attractive wherever it’s geographically feasible.
Reaching that brine and extracting that lithium — so integral to grid-scale energy storage and electric vehicles alike — is typically slow, inefficient, and environmentally taxing. This year, however, could represent a critical juncture for a novel process known as Direct Lithium Extraction, or DLE, which promises to be faster, cleaner, and capable of unlocking lithium across a wider range of geographies.
The traditional method of separating lithium from brine is straightforward but time-consuming. Essentially, the liquid is pumped through a series of vast, vividly colored solar evaporation ponds that gradually concentrate the mineral over the course of more than a year.
It works, but by the time the lithium is extracted, refined, and ready for market, both the demand and the price may have shifted significantly, as evidenced by the dramatic rise and collapse of lithium prices over the past five years. And while evaporation ponds are well-suited to the arid deserts of Chile and Argentina where they’re most common, the geology, brine chemistry, and climate of the U.S. regions with the best reserves are generally not amenable to this approach. Not to mention the ponds require a humongous land footprint, raising questions about land use and ecological degradation.
DLE forgoes these expansive pools, instead pulling lithium-rich brine into a processing unit, where some combination of chemicals, sorbents, or membranes isolates and extracts the lithium before the remaining brine gets injected back underground. This process can produce battery-grade lithium in a matter of hours or days, without the need to transport concentrated brine to separate processing facilities.
This tech has been studied for decades, but aside from a few Chinese producers using it in combination with evaporation ponds, it’s largely remained stuck in the research and development stage. Now, several DLE companies are looking to build their first commercial plants in 2026, aiming to prove that their methods can work at scale, no evaporation ponds needed.
“I do think this is the year where DLE starts getting more and more relevant,” Federico Gay, a principal lithium analyst at Benchmark Mineral Intelligence, told me.
Standard Lithium, in partnership with oil and gas major Equinor, aims to break ground this year on its first commercial facility in Arkansas’s lithium-rich Smackover Formation, while the startup Lilac Solutions also plans to commence construction on a commercial plant at Utah’s Great Salt Lake. Mining giant Rio Tinto is progressing with plans to build a commercial DLE facility in Argentina, which is already home to one commercial DLE plant — the first outside of China. That facility is run by the French mining company Eramet, which plans to ramp production to full capacity this year.
If “prices are positive” for lithium, Gay said, he expects that the industry will also start to see mergers and acquisitions this year among technology providers and larger corporations such as mining giants or oil and gas majors, as “some of the big players will try locking in or buying technology to potentially produce from the resources they own.” Indeed, ExxonMobil and Occidental Petroleum are already developing DLE projects, while major automakers have invested, too.
But that looming question of lithium prices — and what it means for DLE’s viability — is no small thing. When EV and battery storage demand boomed at the start of the decade, lithium prices climbed roughly 10-fold through 2022 before plunging as producers aggressively ramped output, flooding the market just as EV demand cooled. And while prices have lately started to tick upward again, there’s no telling whether the trend will continue.
“Everyone seems to have settled on a consensus view that $20,000 a tonne is where the market’s really going to be unleashed,” Joe Arencibia, president of the DLE startup Summit Nanotech, told me, referring to the lithium extraction market in all of its forms — hard rock mining, traditional brine, and DLE. “As far as we’re concerned, a market with $14,000, $15,000 a tonne is fine and dandy for us.”
Lilac Solutions, the most prominent startup in the DLE space, expects that its initial Utah project — which will produce a relatively humble 5,000 metric tons of lithium per year — will be profitable even if lithium prices hit last year’s low of $8,300 per metric ton. That’s according to the company’s CEO Raef Sully, who also told me that, because Utah’s reserves are much lower grade than South America’s, Lilac could produce lithium in Chile for a mere $3,000 to $3,500 per metric ton if it scaled production to 15,000 or 20,000 metric tons per year.
What sets Lilac apart from other DLE projects is its approach to separating lithium from brine. Most companies are pursuing adsorption-based processes, in which lithium ions bind to an aluminum-based sorbent, separating them from surrounding impurities. But stripping the lithium from the sorbent generally requires a good deal of freshwater, which is not ideal given that many lithium-rich regions are parched deserts.
Lilac’s tech relies on an ion-exchange process in which small ceramic beads selectively capture lithium ions from the brine in their crystalline structure, swapping them for hydrogen ions. “The crystal structure seems to have a really strong attraction to lithium and nothing else,” Sully told me. Acid then releases the concentrated lithium. When compared with adsorption-based tech, he explained, this method demands far fewer materials and is “much more selective for lithium ions versus other ions,” making the result purer and thus cheaper to process into a battery-grade material.
Because adsorption-based DLE is already operating commercially and ion-exchange isn’t, Lilac has much to prove with its first commercial facility, which is expected to finalize funding and begin construction by the middle of this year.
Sully estimates that Lilac will need to raise around $250 million to build its first commercial facility, which has already been delayed due to the price slump. The company’s former CEO and current CTO Dave Snydacker told me in 2023 that he expected to commence commercial operations by the end of 2024, whereas now the company plans to bring its Utah plant online at the end of 2027 or early 2028.
“Two years ago, with where the market was, nobody was going to look at that investment,” Sully explained, referring to its commercial plant. Investors, he said, were waiting to see what remained after the market bottomed out, which it now seems to have done. Lilac is still standing, and while there haven’t yet been any public announcements regarding project funding, Sully told me he’s confident that the money will come together in time to break ground in mid-2026.
It also doesn’t hurt that lithium prices have been on the rise for a few months, currently hovering around $20,000 per tonne. Gay thinks prices are likely to stabilize somewhere in this range, as stakeholders who have weathered the volatility now have a better understanding of the market.
At that price, hard-rock mining would be a feasible option, though still more expensive than traditional evaporation ponds and far above what DLE producers are forecasting. And while some mines operated at a loss or mothballed their operations during the past few years, Gay thinks that even if prices stabilize, hard-rock mines will continue to be the dominant source of lithium for the foreseeable future due to sustained global investment across Africa, Brazil, Australia, and parts of Asia. The price may be steeper, but the infrastructure is also well-established and the economics are well-understood.
“I’m optimistic and bullish about DLE, but probably it won’t have the impact that it was thought about two or three years ago,” Gay told me, as the hype has died down and prices have cooled from their record high of around $80,000 per tonne. By 2040, Benchmark forecasts that DLE will make up 15% to 20% of the lithium market, with evaporation ponds continuing to be a larger contributor for the next decade or so, primarily due to the high upfront costs of DLE projects and the time required for them to reach economies of scale.
On average, Benchmark predicts that this tech will wind up in “the high end of the second quartile” of the cost curve, making DLE projects a lower mid-cost option. “So it’s good — not great, good. But we’ll have some DLE projects in the first quartile as well, so competing with very good evaporation assets,” Gay told me.
Unsurprisingly, the technology companies themselves are more bullish on their approach. Even though Arencibia predicts that evaporation ponds will continue to be about 25% cheaper, he thinks that “the majority of future brine projects will be DLE,” and that DLE will represent 25% or more of the future lithium market.
That forecast comes in large part because Chile — the world’s largest producer of lithium from brine — has stated in its National Lithium Strategy that all new projects should have an “obligatory requirement” to use novel, less ecologically disruptive production methods. Other nations with significant but yet-to-be-exploited lithium brine resources, such as Bolivia, could follow suit.
Sully is even more optimistic, predicting that as lithium demand grows from about 1.5 million metric tons per year to around 3.5 million metric tons by 2035, the majority of that growth will come from DLE. “I honestly believe that there will be no more hard rock mines built in Australia or the U.S.,” he said, telling me that in ten years’ time, half of our lithium supply could “easily” come from DLE.
As a number of major projects break ground this year and the big players start consolidating, we’ll begin to get a sense of whose projections are most realistic. But it won’t be until some of these projects ramp up commercial production in the 2028 to 2030 timeframe that DLE’s market potential will really crystallize.
“If you’re not a very large player at the moment, I think it’s very difficult for you to proceed,” Sully told me, reflecting on how lithium’s price shocks have rocked the industry. Even with lithium prices ticking precariously upwards now, the industry is preparing for at least some level of continued volatility and uncertainty.
“Long term, who knows what [prices are] going to be,” Sully said. “I’ve given up trying to predict.”
A chat with CleanCapital founder Jon Powers.
This week’s conversation is with Jon Powers, founder of the investment firm CleanCapital. I reached out to Powers because I wanted to get a better understanding of how renewable energy investments were shifting one year into the Trump administration. What followed was a candid, detailed look inside the thinking of how the big money in cleantech actually views Trump’s war on renewable energy permitting.
The following conversation was lightly edited for clarity.
Alright, so let’s start off with a big question: How do investors in clean energy view Trump’s permitting freeze?
So, let’s take a step back. Look at the trend over the last decade. The industry’s boomed, manufacturing jobs are happening, the labor force has grown, investments are coming.
We [CleanCapital] are backed by infrastructure life insurance money. It’s money that wasn’t in this market 10 years ago. It’s there because these are long-term infrastructure assets. They see the opportunity. What are they looking for? Certainty. If somebody takes your life insurance money, and they invest it, they want to know it’s going to be there in 20 years in case they need to pay it out. These are really great assets – they’re paying for electricity, the panels hold up, etcetera.
With investors, the more you can manage that risk, the more capital there is out there and the better the cost of capital is for the project. If you’re taking high-cost private equity money to fund a project, you have to pay for the equipment and the cost of the financing. The more you can bring down the cost of financing – which has happened over the last decade – the cheaper the power can be on the back-end. You can use cheaper money to build.
Once you get that type of capital, you need certainty. That certainty had developed. The election of President Trump threw that into a little bit of disarray. We’re seeing that being implemented today, and they’re doing everything they can to throw wrenches into the growth of what we’ve been doing. Between the bill they passed affecting the tax credits and the work they’re doing on permitting to slow-roll projects, all of that uncertainty is damaging projects and, more importantly, costs everyone down the road by raising the cost of electricity, in turn making projects more expensive in the first place. It’s not a nice recipe for people buying electricity.
But in September, I went to the RE+ conference in California – I thought that was going to be a funeral march but it wasn’t. People were saying, Now we have to shift and adjust. This is a huge industry. How do we get those adjustments and move forward?
Investors looked at it the same way. Yes, how will things like permitting affect the timeline of getting to build? But the fundamentals of supply and demand haven’t changed and in fact are working more in favor of us than before, so we’re figuring out where to invest on that potential. Also, yes federal is key, but state permitting is crucial. When you’re talking about distributed generation going out of a facility next to a data center, or a Wal-Mart, or an Amazon warehouse, that demand very much still exists and projects are being built in that middle market today.
What you’re seeing is a recalibration of risk among investors to understand where we put our money today. And we’re seeing some international money pulling back, and it all comes back to that concept of certainty.
To what extent does the international money moving out of the U.S. have to do with what Trump has done to offshore wind? Is that trade policy? Help us understand why that is happening.
I think it’s not trade policy, per se. Maybe that’s happening on the technology side. But what I’m talking about is money going into infrastructure and assets – for a couple of years, we were one of the hottest places to invest.
Think about a European pension fund that is taking money from a country in Europe and wants to invest it somewhere it will get its money back. That type of capital has definitely been re-evaluating where it will put its money, and in parallel, some of the larger utility players are starting to re-evaluate or even back out of projects because they're concerned about questions around large-scale utility solar development, specifically.
Taking a step back to something else you said about federal permitting not being as crucial as state permitting–
That’s about the size of the project. Huge utility projects may still need federal approvals for transmission.
Okay. But when it comes to the trendline on community relations and social conflict, are we seeing renewable energy permitting risk increase in the U.S.? Decrease? Stay the same?
That has less to do with the administration and more to do with a well-structured fossil fuel campaign. Anti-climate, very dark money. I am not an expert on where the money comes from, but folks have tried to map that out. Now you're even seeing local communities pass things like no-energy-storage [ordinances].
What’s interesting is that in those communities, we as an industry are not really present to provide facts to counter this. That’s very frustrating for folks. We’re seeing these pass and honestly asking, Who was there?
Is the federal permitting freeze impacting investment too?
Definitely.
It’s not like you put money into a project all at once, right? It happens in chunks. Let’s say there are 10 steps to investing in a project. A little bit of money at step one, more money at step two, and it gradually grows until you build the project. The middle area – permitting, getting approval from utilities – is really critical to the investments. So you’re seeing a bit of a pause in when and how we make investments, because we sometimes don’t know if we’ll make it to, say, step six.
I actually think we’ll see the most impact from this in data center costs.
Can you explain that a bit more for me?
Look at northern Virginia for a second. There wasn’t a lot of new electricity added to that market, but all of a sudden demand for electricity went up by 20 percent. We’re seeing all these utilities put in rate hikes for consumers today because it is literally a supply-and-demand question. If you can’t build new supply, consumers are going to pay for it, and even if you could build a new natural gas plant, at minimum it would come online four to six years from now. So over the next four years, we’ll see costs go up.
We’re building projects today that we invested in two years ago. Those projects haven’t changed from what we invested into – but the policy landscape has changed dramatically since then.
If you wipe out half of what was coming in, there’s nothing backfilling that.
Plus more on the week’s biggest renewables fights.
Shelby County, Indiana – A large data center was rejected late Wednesday southeast of Indianapolis, as the takedown of a major Google campus last year continues to reverberate in the area.
Dane County, Wisconsin – Heading northwest, the QTS data center in DeForest we’ve been tracking is boiling over into a major conflict, after activists uncovered controversial emails between the village’s president and the company.
White Pine County, Nevada – The Trump administration is finally moving a little bit of renewable energy infrastructure through the permitting process. Or at least, that’s what it looks like.
Mineral County, Nevada – Meanwhile, the BLM actually did approve a solar project on federal lands while we were gone: the Libra energy facility in southwest Nevada.
Hancock County, Ohio – Ohio’s legal system appears friendly for solar development right now, as another utility-scale project’s permits were upheld by the state Supreme Court.