Twenty-five years ago, computers were on the verge of destroying America’s energy system.
Or, at least, that’s what lots of smart people seemed to think.
In a 1999 Forbes article, a pair of conservative lawyers, Peter Huber and Mark Mills, warned that personal computers and the internet were about to overwhelm the fragile U.S. grid.
Information technology already devoured 8% to 13% of total U.S. power demand, Huber and Mills claimed, and that share would only rise over time. “It’s now reasonable to project,” they wrote, “that half of the electric grid will be powering the digital-Internet economy within the next decade.” (Emphasis mine.)
Over the next 18 months, investment banks including JP Morgan and Credit Suisse repeated the Forbes estimate of internet-driven power demand, advising their customers to pile into utilities and other electricity-adjacent stocks. Although it was unrelated, California’s simultaneous blackout crisis deepened the sense of panic. For a moment, experts were convinced: Data centers and computers would drain the country’s energy resources.
They could not have been more wrong. In fact, Huber and Mills had drastically mismeasured the amount of electricity used by PCs and the internet. Computing ate up perhaps 3% of total U.S. electricity in 1999, not the roughly 10% they had claimed. And instead of staring down a period of explosive growth, the U.S. electric grid was in reality facing a long stagnation. Over the next two decades, America’s electricity demand did not grow rapidly — or even, really, at all. Instead, it flatlined for the first time since World War II. The 2000s and 2010s were the first decades without “load growth,” the utility industry’s jargon for rising power demand, since perhaps the discovery of electricity itself.
Now that lull is ending — and a new wave of tech-driven concerns has overtaken the electricity industry. According to its supporters and critics alike, generative artificial intelligence like ChatGPT is about to devour huge amounts of electricity, enough to threaten the grid itself. “We still don’t appreciate the energy needs of this technology,” Sam Altman, the CEO of OpenAI, has said, arguing that the world needs a clean energy breakthrough to meet AI’s voracious energy needs. (He is investing in nuclear fusion and fission companies to meet this demand.) The Washington Post captured the zeitgeist with a recent story: America, it said, “is running out of power.”
But … is it actually? There is no question that America’s electricity demand is rising once again and that load growth, long in abeyance, has finally returned to the grid: The boom in new factories and the ongoing adoption of electric vehicles will see to that. And you shouldn’t bet against the continued growth of data centers, which have increased in size and number since the 1990s. But there is surprisingly little evidence that AI, specifically, is driving surging electricity demand. And there are big risks — for utility customers and for the planet — in treating AI-driven electricity demand as an emergency.
There is, to be clear, no shortage of predictions that AI will cause electricity demand to rise. According to a recent Reuters report, nine of the country’s 10 largest utilities are now citing the “surge” in power demand from data centers when arguing to regulators that they should build more power plants. Morgan Stanley projects that power use from data centers “is expected to triple globally this year,” according to the same report. The International Energy Agency more modestly — but still shockingly — suggests that electricity use from data centers, AI, and cryptocurrency could double by 2026.
These concerns have also come from environmentalists. A recent report from the Climate Action Against Disinformation Commission, a left-wing alliance of groups including Friends of the Earth and Greenpeace, warned that AI will require “massive amounts of energy and water” and called for aggressive regulation.
That report focused on the risks of an AI-addled social media public sphere, which progressives fear will be filled with climate-change-denying propaganda by AI-powered bots. But in an interview, Michael Khoo, an author of the report and a researcher at Friends of the Earth, told me that studying AI made him much more frightened about its energy use.
AI is such a power suck that it “is causing America to run out of energy,” Khoo said. “I think that’s going to be much more disruptive than the disinformation conversation in the mid-term.” He sketched a scenario where Altman and Mark Zuckerberg can outbid ordinary households for electrons as AI proliferates across the economy. “I can see people going without power,” he said, “and there being massive social unrest.”
These predictions aren’t happening in a vacuum. At the same time that investment bankers and environmentalists have fretted over a potential electricity shortage, utilities across the South have proposed a de facto solution: a massive buildout of new natural-gas power plants.
Citing the return of load growth, these utilities are trying to go around normal regulatory channels. Across at least six states, they have already won — or are trying to win — permission from local governments to fast-track more than 10,000 megawatts of new gas-fired power plants to meet the projected surge in demand.
These requests have popped up across the region, pushed by vertically integrated monopoly power companies. Georgia Power won a tentative agreement to build 1,400 new megawatts of gas capacity, Canary Media reported. In the Carolinas, Duke Energy has asked to build 9,000 megawatts of new gas capacity, triple what it previously requested. The Tennessee Valley Authority has plans to add 6,600 megawatts of new capacity to its grid.
This buildout is big enough to endanger the country’s climate targets. Although these utilities are also building new renewable and battery farms, and shutting down coal plants, the surge in carbon emissions from the planned natural gas plants would erase the reductions from those changes, according to a Southern Environmental Law Center analysis. Duke Energy has already said that it will miss its 2030 climate goal in order to carry out the gas expansion.
In the popular press, AI’s voracious energy demand is sometimes said to be a major driver of this planned gas boom. But evidence for that proposition is slim, and the utilities have said only that data center expansion is one of several reasons for the boom. The Southeast’s population is growing, and the region is experiencing a manufacturing renaissance, due in part to the new car, battery, and solar panel factories subsidized by Biden’s climate law. Utilities in the South also face a particular challenge coping with the coldest winter mornings because so many homes and offices use inefficient and power-hungry space heaters.
Indeed, it’s hard to talk about the drivers of load growth with any specificity — and it’s hard to know whether load growth will actually happen in all corners of the South.
Utilities compete against each other to secure big-name customers — much like local governments compete with sweetheart tax deals — so when a utility asks regulators to build more capacity, it doesn’t reveal where potential power demand is coming from. (In other words, it doesn’t reveal who it believes will eventually buy that power.) A company might float plans to build the same data center or factory in multiple states to shop around for the best rates, which means the same underlying gigawatts of demand may appear in several different utilities’ resource plans at the same time. The upshot is that utilities are unlikely to actually see all of the demand they’re now projecting.
Even if we did know exactly how many gigawatts of new demand each utility would see, it’s almost impossible to say how much of it is coming from AI. Utilities don’t say how much of their future projected power demand will come from planned factories versus data centers. Nor do they say what each data center does and whether it trains AI (or mines Bitcoin, which remains a far bigger energy suck).
The risk of focusing on AI, specifically, as a driver of load growth is that because it’s a hot new technology — one with national security implications, no less — it can rhetorically justify expensive emergency action that is actually not necessary at all. Utilities may very well need to build more power capacity in the years to come. But does that need constitute an emergency? Does it justify seeking special permission from their statehouses or regulators to build more gas, instead of going through the regular planning process? Is it worth accelerating approvals for new gas plants? Probably not. The real danger, in other words, is not that we’ll run out of power. It’s that we’ll build too much of the wrong kind.
At the same time, we might have been led astray by overly dire predictions of AI’s energy use. Jonathan Koomey, a researcher who studies how the internet and data centers use energy (and the namesake of Koomey’s Law), told me that many estimates of the energy used by Nvidia’s most important AI chips assume they run at their advertised “rated” power. In reality, Nvidia chips probably use about half that amount, he said, because chipmakers, for safety reasons, engineer their chips to withstand more electricity than they typically draw.
And this is just the current generation of chips: Nvidia’s next generation of AI-training chips, called “Blackwell,” uses 25 times less energy to do the same amount of computation as the previous generation.
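To see why the rated-power assumption matters, here is a rough back-of-envelope sketch in Python. The wattage, fleet size, and utilization figures are illustrative assumptions, not numbers from Koomey or Nvidia; the point is simply that sizing an estimate at nameplate power roughly doubles the apparent load if chips typically draw about half of it.

```python
# Toy illustration with hypothetical numbers (not a published estimate):
# sizing a fleet at "rated" chip power versus a typical draw of about half that.

rated_watts_per_chip = 700        # assumed nameplate rating for an AI accelerator
typical_fraction_of_rated = 0.5   # Koomey's point: real draw is roughly half of rated
n_chips = 1_000_000               # hypothetical fleet size
hours_per_year = 8_760

naive_twh = rated_watts_per_chip * n_chips * hours_per_year / 1e12   # watt-hours -> TWh
likely_twh = naive_twh * typical_fraction_of_rated
print(f"Naive estimate: {naive_twh:.1f} TWh/yr; likelier: {likely_twh:.1f} TWh/yr")
```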
Koomey helped defuse the last panic over energy use by showing that the estimates Huber and Mills relied on were wildly incorrect. Estimates now suggest that the internet used less than 1% of total U.S. electricity by the late 1990s, not 13% as they claimed. Those percentages stayed roughly the same through 2008, he later found, even as data centers grew and computers proliferated across the economy. That’s the same year, remember, by which Huber and Mills had predicted the internet would consume half of America’s electricity.
These bad predictions were extremely convenient. Mills was a scientific advisor to the Greening Earth Society, a fossil-fuel-industry-funded group that alleged carbon dioxide pollution would actually improve the global environment. He aimed to show that climate and environmental policy would conflict with the continued growth of the internet.
“Many electricity policy proposals are on a collision course with demand forces,” Mills said in a Greening Earth press release at the time. “While many environmentalists want to substantially reduce coal use in making electricity, there is no chance of meeting future economically-driven and Internet-accelerated electric demand without retaining and expanding the coal component.” Hence the headline of the Forbes piece: “The PCs are coming — Dig more coal.”
What makes today’s AI-induced fear frenzy different from 1999 is that the alarming projections are not just coming from businesses and banks like Morgan Stanley, but from environmentalists like Friends of the Earth. Yet neither their estimates of near-term, AI-driven power shortages — nor the analysis from Morgan Stanley that U.S. data-center power use could triple within a year — makes sense given what we know about data centers, Koomey said. It is not logistically possible to triple data centers’ electricity use in one year. “There just aren’t enough people to build data centers, and it takes longer than a year to build a new data center anyway,” he said. “There aren’t enough generators, there aren’t enough transformers — the backlog for some equipment is 24 months. It’s a supply chain constraint.”
Look around and you might notice that we have many more servers and computers today than we did in 1999 — not to mention smartphones and tablets, which didn’t even exist then — and yet computing doesn’t devour half of American energy. It doesn’t even get close. Today, computers account for 1% to 4% of total U.S. electricity demand, depending on which estimate you trust. That’s about the same share of total U.S. electricity demand that they used in the late 1990s and mid-2000s.
It may well be that AI devours more energy in years to come, but utilities probably do not need to deal with it by building more gas. They could install more batteries, build new power lines, or even pay some customers to reduce their electricity usage during certain peak events, such as cold winter storms.
There are some places where AI-driven energy demand could be a problem — Koomey cited Ireland and Loudoun County, Virginia, as two epicenters. But even there, building more natural gas is not the sole way to cope with load growth.
“The problem with this debate is everybody is kind of right,” Daniel Tait, who researches Southern utilities for the Energy and Policy Institute, a consumer watchdog, told me. “Yes, AI will increase load a little bit, but probably not as much as you think. Yes, load is growing, but maybe not as much as you say. Yes, we do need to build stuff, but maybe not the stuff that you want.”
There are real risks if AI’s energy demands get overstated and utilities go on a gas-driven bender. The first is for the planet: Utilities might overbuild gas plants now, run them even when they’re uneconomic, and blow through their climate goals.
“Utilities — especially the vertically integrated monopolies in the South — have every incentive to overstate load growth, and they have a pattern of having done that consistently,” Gudrun Thompson, a senior attorney at the Southern Environmental Law Center, told me. In 2017, the Rocky Mountain Institute, an energy think tank, found that utilities systematically overestimated their peak demand when compiling forecasts. This makes sense: Utilities would rather build too much capacity than wind up with too little, especially when they can pass along the associated costs to ratepayers.
But the second risk is that utilities could burn through the public’s willingness to pay for grid upgrades. Over the next few years, utilities will need to make dozens of upgrades to their systems. They have to build new renewables, new batteries, and new clean 24/7 power, such as nuclear or geothermal. They will have to link their grids to their neighbors’ by building new transmission lines. All of that will be expensive, and it could require the kind of investment that raises electricity rates. But the public and politicians can accept only so many rate hikes before they rebel, and there’s a risk that utilities spend through that fuzzy budget on unnecessary and wasteful projects now, not on the projects that they’ll need in the future.
There is no question that AI will use more electricity in the years to come. But so will EVs, new factories, and other sources of demand. America is on track to use more electricity. If that becomes a crisis, it will be one of our own making.
Kettle offers parametric insurance and says that it can cover just about any home — as long as the owner can afford the premium.
Los Angeles is on fire, and it’s possible that much of the city could burn to the ground. This would be a disaster for California’s already wobbly home insurance market and the residents who rely on it. Kettle Insurance, a fintech startup focused on wildfire insurance for Californians, thinks that it can offer a better solution.
The company, founded in 2020, has thousands of customers across California, and L.A. County is its largest market. These huge fires will, in some sense, “be a good test, not just for the industry, but for the Kettle model,” Brian Espie, the company’s chief underwriting officer, told me. What it’s offering is known as “parametric” insurance and reinsurance (essentially insurance for the insurers themselves). While traditional insurance claims can take years to fully resolve — as some victims of the devastating 2018 Camp Fire know all too well — Kettle gives policyholders 60 days to submit a notice of loss, after which the company has 15 days to validate the claim and issue payment. There is no deductible.
As Espie explained, Kettle’s AI-powered risk assessment model is able to make more accurate and granular calculations, taking into account forward-looking, climate change-fueled challenges such as out-of-the-norm weather events, which couldn’t be predicted by looking at past weather patterns alone (e.g. wildfires in January, when historically L.A. is wet). Traditionally, California insurers have only been able to rely upon historical datasets to set their premiums, though that rule changed last year and never applied to parametric insurers in the first place.
“We’ve got about 70 different inputs from global satellite data and real estate ground level datasets that are combining to predict wildfire ignition and spread, and then also structural vulnerability,” Espie told me. “In total, we’re pulling from about 130 terabytes of data and then simulating millions of fires — so using technology that, frankly, wouldn’t have been possible 10 or maybe five years ago, because either the data didn’t exist, or it just wasn’t computationally possible to run a model like we are today.”
As of writing, it’s estimated that more than 2,000 structures have burned in Los Angeles. Whenever a fire encroaches on a parcel of Kettle-insured land, the owner immediately qualifies for a payout. Unlike most other parametric insurance plans, which pay a predetermined amount based on metrics such as the water level during a flood or the temperature during a heat wave regardless of damages, Kettle does require policyholders to submit damage estimates. The company told me that’s usually pretty simple: If a house burns, it’s almost certain that the losses will be equivalent to or exceed the policy limit, which can be up to $10 million. While the company can always audit a property to prevent insurance fraud, there are no claims adjusters or other third parties involved, thus expediting the process and eliminating much of the back-and-forth wrangling residents often go through with their insurance companies.
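For readers who want the mechanics spelled out, here is a minimal, hypothetical sketch in Python of the distinction drawn above: a classic parametric policy pays a fixed, predetermined amount once its trigger fires, while the variant described here pays the policyholder’s damage estimate capped at the policy limit, with no deductible. The function names and dollar figures are illustrative assumptions, not Kettle’s actual terms or code.

```python
# Hypothetical illustration (not Kettle's code or policy terms).

def classic_parametric_payout(trigger_hit: bool, fixed_payout: float) -> float:
    """Classic parametric insurance: pay a predetermined amount whenever the
    trigger condition (e.g., a flood level or temperature threshold) is met."""
    return fixed_payout if trigger_hit else 0.0

def capped_estimate_payout(fire_reached_parcel: bool,
                           damage_estimate: float,
                           policy_limit: float) -> float:
    """Variant described above: once fire encroaches on the insured parcel,
    pay the policyholder's damage estimate, capped at the policy limit,
    with no deductible subtracted."""
    if not fire_reached_parcel:
        return 0.0
    return min(damage_estimate, policy_limit)

# A total loss on a $2 million policy pays the full limit; a partial loss pays the estimate.
print(capped_estimate_payout(True, damage_estimate=3_000_000, policy_limit=2_000_000))  # 2000000.0
print(capped_estimate_payout(True, damage_estimate=250_000, policy_limit=2_000_000))    # 250000.0
```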
So how can Kettle afford to do all this while other insurers are exiting the California market altogether or pulling back in fire-prone regions? “We like to say that we can put a price on anything with our model,” Espie told me. “But I will say there are parts of the state that our model sees as burning every 10 to 15 years, and premiums may be just practically too expensive for insurance in those areas.” Kettle could also be an option for homeowners whose existing insurance comes with a very high wildfire deductible, Espie explained, as buying Kettle’s no-deductible plan in addition to their regular plan could actually save them money were a fire to occur.
But just because an area has traditionally been considered risky doesn’t mean that Kettle’s premiums will necessarily be exorbitant. The company’s CEO, Isaac Espinoza, told me that Kettle’s advanced modeling allows it to drill down on the risk to specific properties rather than just general regions. “We view ourselves as insuring the uninsurable,” Espinoza said. “Other insurers just blanket say, we don’t want to touch it. We don’t touch anything in the area. We might say, ’Hey, that’s not too bad.’”
Espie told me that the wildly destructive fires in 2017 and 2018 “gave people a wake up call that maybe some of the traditional catastrophe models out there just weren’t keeping up with science and natural hazards in the face of climate change.” He thinks these latest blazes could represent a similar turning point for the industry. “This provides an opportunity for us to prove out that models built with AI and machine learning like ours can be more predictive of wildfire risk in the changing climate, where we’re getting 100 mile per hour winds in January.”
Everyone knows the story of Mrs. O’Leary’s cow, the one that allegedly knocked over a lantern in 1871 and burned down 2,100 acres of downtown Chicago. While the wildfires raging in Los Angeles County have already far exceeded that legendary bovine’s total attributed damage — at the time of this writing, on Thursday morning, five fires have burned more than 27,000 acres — the losses were concentrated, at least initially, in the secluded neighborhoods and idyllic suburbs in the hills above the city.
On Wednesday, that started to change. Evacuation maps have since extended into the gridded streets of downtown Santa Monica and Pasadena, and a new fire has started north of Beverly Hills, moving quickly toward an internationally recognizable street: Hollywood Boulevard. The two biggest fires, Palisades and Eaton, remain 0% contained, and high winds have stymied firefighting efforts, all leading to an exceedingly grim question: Exactly how much of Los Angeles could burn? Could all of it?
“I hate to be doom and gloom, but if those winds kept up … it’s not unfathomable to think that the fires would continue to push into L.A. — into the city,” Riva Duncan, a former wildland firefighter and fire management specialist who now serves as the executive secretary of Grassroots Wildland Firefighters, an advocacy group, told me.
When a fire is burning in the chaparral of the hills, it’s one thing. But once a big fire catches in a neighborhood, it’s a different story. Houses, with their wood frames, gas lines, and cheap modern furniture, might as well be Duraflame. Embers from one burning house then leap to the next and alight in a clogged gutter or on shrubs planted too close to vinyl siding. “That’s what happened with the Great Chicago Fire. When the winds push fires like that, it’s pushing the embers from one house to the others,” Duncan said. “It’s a really horrible situation, but it’s not unfathomable to think about that [happening in L.A.] — but people need to be thinking about that, and I know the firefighters are thinking about that.”
Once flames engulf a block, it will “overpower” the capabilities of firefighters, Arnaud Trouvé, the chair of the Department of Fire Protection Engineering at the University of Maryland, told me in an email. If firefighters can’t gain a foothold, the fire will continue to spread “until a change in driving conditions” — such as the winds weakening to the point that the fire no longer ignites new fuel, or the fire exhausting its fuel entirely when it reaches something like an expansive parking lot or the ocean.
This waiting game sometimes leads to the impression that firefighters are standing around, not doing anything. But “what I know they’re doing is they’re looking ahead to places where maybe there’s a park, or some kind of green space, or a shopping center with big parking lots — they’re looking for those places where they could make a stand,” Duncan told me. If an entire city block is already on fire, “they’re not going to waste precious water there.”
Urban firefighting is a different beast than wildland firefighting, but Duncan noted that Forest Service, CAL FIRE, and L.A. County firefighters are used to complex mixed environments. “This is their backyard, and they know how to fight fire there.”
“I can guarantee you, many of them haven’t slept 48 hours,” she went on. “They’re grabbing food where they can; they’re taking 15-minute naps. They’re in this really horrible smoke — there are toxins that come off burning vehicles and burning homes, and wildland firefighters don’t wear breathing apparatus to protect the airways. I know they all have horrible headaches right now and are puking. I remember those days.”
If there’s a sliver of good news, it’s that the biggest fire, Palisades, can’t burn any further to the west, the direction the wind is blowing — there lies the ocean — meaning its spread south into Santa Monica toward Venice and Culver City or Beverly Hills is slower than it would be if the winds shifted. The westward-moving Santa Ana winds, however, could conceivably fan the Eaton fire deeper into eastern Los Angeles if conditions don’t let up soon. “In many open fires, the most important factor is the wind,” Trouvé explained, “and the fire will continue spreading until the wind speed becomes moderate-to-low.”
Though the wind died down a bit on Wednesday night, conditions are expected to deteriorate again Thursday evening, and the red flag warning won’t expire until Friday. And “there are additional winds coming next week,” Kristen Allison, a fire management specialist with the Southern California Geographic Area Coordination Center, told me Wednesday. “It’s going to be a long duration — and we’re not seeing any rain anytime soon.”
Editor’s note: Firefighting crews made “big gains” overnight against the Sunset fire, which threatened famous landmarks like the TCL Chinese Theatre and the Dolby Theatre, which will host the Academy Awards in March. Most of the mandatory evacuation notices remaining in Hollywood on Thursday morning were precautionary, the Los Angeles Times reported. Meanwhile, the Palisades and Eaton fires have burned a combined 27,834 acres, destroyed 2,000 structures, killed at least five people, and remain unchecked as the winds pick up again. This piece was last updated on January 9 at 10:30 a.m. ET.
On greenhouse gases, LA’s fires, and the growing costs of natural disasters
Current conditions: Winter storm Cora is expected to disrupt more than 5,000 U.S. flights • Britain’s grid operator is asking power plants for more electricity as temperatures plummet • Parts of Australia could reach 120 degrees Fahrenheit in the coming days because the monsoon, which usually appears sometime in December, has yet to show up.
The fire emergency in Los Angeles continues this morning, with at least five blazes raging in different parts of the nation’s second-most-populous city. The largest, known as the Palisades fire, has charred more than 17,000 acres near Malibu and is now the most destructive fire in the county’s history. The Eaton fire near Altadena and Pasadena has grown to 10,600 acres. Both are 0% contained. Another fire ignited in Hollywood but is reportedly being contained. At least five people have died, more than 2,000 structures have been destroyed or damaged, 130,000 people are under evacuation warnings, and more than 300,000 customers are without power. Wind speeds have come down from the 100 mph gusts reported yesterday, but “high winds and low relative humidity will continue critical fire weather conditions in southern California through Friday,” the National Weather Service said.
As the scale of this disaster comes into focus, the finger-pointing has begun. President-elect Donald Trump blamed California Gov. Gavin Newsom, suggesting his wildlife protections have restricted the city’s water access. Many people slammed the city’s mayor for cutting the fire budget. Some suspect power lines are the source of the blazes, implicating major utility companies. And of course, underlying it all, is human-caused climate change, which researchers warn is increasing the frequency and severity of wildfires. “The big culprit we’re suspecting is a warming climate that’s making it easier to burn fuels when conditions are just right,” said University of Colorado fire scientist Jennifer Balch.
America’s greenhouse gas emissions were down in 2024 compared to 2023, but not by much, according to the Rhodium Group’s annual report, released this morning. The preliminary estimates suggest emissions fell by just 0.2% last year. In other words, they were basically flat. That’s good news in the sense that emissions didn’t rise, even as the economy grew by an estimated 2.7%. But it’s also a little worrying given that in 2023, emissions dropped by 3.3%.
The transportation, power, and buildings sectors all saw upticks in emissions last year. But there are some bright spots in the report. Emissions fell across the industrial sector (down 1.8%) and oil and gas sector (down 3.7%). Solar and wind power generation surpassed coal for the first time, and coal production fell by 12% to its lowest level in decades, resulting in fewer industrial methane emissions. Still, “the modest 2024 decline underscores the urgency of accelerating decarbonization in all sectors,” Rhodium’s report concluded. “To meet its Paris Agreement target of a 50-52% reduction in emissions by 2030, the U.S. must sustain an ambitious 7.6% annual drop in emissions from 2025 to 2030, a level the U.S. has not seen outside of a recession in recent memory.”
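As a rough check on that 7.6% figure, here is a hypothetical back-of-envelope calculation in Python. It assumes, purely for illustration, that 2024 emissions sit roughly 20% below 2005 levels — the report’s own baseline may differ — and asks what constant annual decline would reach 50% below 2005 by 2030.

```python
# Back-of-envelope sketch with an assumed starting point (roughly 20% below 2005);
# not the Rhodium Group's own calculation.

current_vs_2005 = 0.80   # assumed: 2024 emissions as a fraction of 2005 emissions
target_vs_2005 = 0.50    # Paris target: 50% below 2005 levels by 2030
years = 6                # 2025 through 2030

# Find the constant annual decline r such that current * (1 - r) ** years == target.
annual_decline = 1 - (target_vs_2005 / current_vs_2005) ** (1 / years)
print(f"Implied annual decline: {annual_decline:.1%}")  # roughly 7-8% per year
```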
Insured losses from natural disasters topped $140 billion last year, up significantly from $106 billion in 2023, according to Munich Re, the world’s largest reinsurer. That makes 2024 the third most expensive year in terms of insured losses since 1980. Weather disasters, and especially major U.S. hurricanes, accounted for a large chunk ($47 billion) of these costs: Hurricanes Helene and Milton were the most devastating natural disasters of 2024. “Climate change is taking the gloves off,” the insurer said. “Hardly any other year has made the consequences of global warming so clear.”
A new study found that a quarter of all the world’s freshwater animals are facing a high risk of extinction due to pollution, farming, and dams. The research, published in the journal Nature, explained that freshwater sources – like rivers, lakes, marshes, and swamps – support over 10% of all known species, including fish, shrimps, and frogs. All these creatures support “essential ecosystem services,” including climate change mitigation and flood control. The report studied some 23,000 animals and found about 24% of the species were at high risk of extinction. The researchers said there “is urgency to act quickly to address threats to prevent further species declines and losses.”
A recent oil and gas lease sale in Alaska’s Arctic National Wildlife Refuge got zero bids, the Interior Department announced yesterday. This was the second sale – mandated by Congress under the 2017 Tax Act – to generate little interest. “The lack of interest from oil companies in development in the Arctic National Wildlife Refuge reflects what we and they have known all along – there are some places too special and sacred to put at risk with oil and gas drilling,” said Acting Deputy Secretary Laura Daniel-Davis. President-elect Donald Trump has promised to open more drilling in the refuge, calling it “the biggest find anywhere in the world, as big as Saudi Arabia.”
“Like it or not, addressing climate change requires the help of the wealthy – not just a small number of megadonors to environmental organizations, but the rich as a class. The more they understand that their money will not insulate them from the effects of a warming planet, the more likely they are to be allies in the climate fight, and vital ones at that.” –Paul Waldman writing for Heatmap