Imagine you're buying a new laptop. You come across a model that can do some pretty nifty stuff, but uses many times more electricity than your current laptop. (Ten times? Thirty times? No salesman can give you the exact number, because it's a company secret.)
Oh yeah, and this laptop comes with a funnel on top; every time you ask it for a joke, or a fun image you just thought up, the machine needs a water refill (and again, the maker won't say how much). What do you think, worth the upgrade? For those of us who care about an ever-warmer, ever-thirstier Earth, probably not.
Yet that laptop, or something like it, is the net result of our current AI gold rush. Wait, something like it? Yes, because vague estimates are all we have. The true cost in carbon dioxide emissions for every AI prompt — not to mention the groundwater used to cool down thousands of servers crunching those prompts — is still hidden. Researchers can paint a rough picture; Google, Microsoft, OpenAI and others could provide a more precise portrait any time they wanted.
But ever since ChatGPT launched in 2022, "there's been a general crackdown on information," says Sasha Luccioni, a 10-year veteran of AI energy usage research, a TED talk star, and currently climate lead at Hugging Face, a platform for open-source AI.
"Not a single company that offers AI tools, that I know of, provides energy usage and carbon footprint information," Luccioni says in tones of rising frustration. "We don't even know how big models like GPT are. Nothing is divulged, everything is a company secret."
In short: climate-conscious, AI-hungry companies like Google and Microsoft have become selectively transparent. They can tell you exactly how many kilograms of carbon your next plane flight will emit, but they won't offer the same for your next AI-written term paper or AI-painted Pope in a puffy jacket.
Perhaps with good reason: if we knew the environmental cost of AI products, we'd start shaming each other for our flagrant use of them.
AI makes us all dirtier
Since tech firms also still care about being seen as good environmental citizens, we do have a sense of the scale of the problem. In its 86-page 2024 sustainability report, Google revealed that its total greenhouse gas emissions shot up by 48 percent between 2019 and 2023, with the bulk of that rise coming since 2022.
Given that Google still aims to get to net zero emissions by 2030, that's not great news. Nor is Microsoft's 2024 sustainability report, which shows a 29.1 percent rise in emissions since 2020.
Both companies point the finger at third parties, specifically the ones building data centers for them. They also point out that these data centers do a lot more than just answer AI prompts, which is true and a big part of why the energy cost of AI is so nebulous.
But neither can the AI-proud companies fully deny what's driving this sudden burst of construction: data centers that are "designed and optimized to support AI workloads," in Microsoft's words.
"We have a long way to go to meet our 2030 target," the Google report admits. Given that data center energy demand is expected to grow 160 percent by 2030, that's an understatement. As a May 2024 Goldman Sachs report estimates: "the carbon dioxide emissions of data centers may more than double between 2022 and 2030."
Where should we point the finger for this rise? As Google's report puts it in this doozy of a passive-voice sentence: "Reducing emissions may be challenging due to increasing energy demands from the greater intensity of AI compute."
To be fair to the owners of power-hungry AI models, their energy usage is probably still dwarfed by other power-hog data center technologies such as cryptocurrency, streaming apps, and online games.
But don't make that comparison to Luccioni. "That always pisses me off," she says, "because AI is not a vertical. It's a horizontal — a tool that gets used across many different verticals. Google Maps uses AI, and so do all the ads we see online, and so does precision agriculture, and so do military drones. How do you calculate what part AI plays?"
Or to put it another way: Google doesn't force you to use cryptocurrency when you do a Google search. But it has put AI search results front and center — and you can't opt out. Which means that even if you think you've never used an AI tool in your life, if you've Googled recently, you're part of the problem. (For the climate-concerned, Luccioni recommends switching to a non-AI search engine like Ecosia.)
If Google, Microsoft, and the other big generative AI players were to reveal all, how bad could it be? Good question. Guesses from experts range from pretty bad to climate disaster.
The International Energy Agency estimates, conservatively, that a single ChatGPT prompt uses nearly 3 watt-hours. Compare that to 0.3 watt-hours for a single Google search (before the company integrated AI results with Gemini, that is).
Answering hundreds of millions of ChatGPT queries each day takes enough electricity to power 33,000 households in the U.S., according to University of Washington researcher Sajjad Moazeni. And that doesn't include the energy consumed in training each company's AI model in the first place, which is anyone's guess.
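For the curious, here's a minimal back-of-envelope sketch of how those figures fit together. The per-prompt number is the IEA estimate above; the daily prompt count and the household consumption figure are assumed round numbers for illustration, not disclosed data.

```python
# Back-of-envelope check of the "tens of thousands of households" scale.
# Assumed inputs (hypothetical round numbers, not official disclosures):
#   - ~3 Wh per ChatGPT prompt (the IEA-style estimate cited above)
#   - ~300 million prompts per day ("hundreds of millions")
#   - ~29 kWh per day for an average U.S. household (~10,500 kWh/year)

WH_PER_PROMPT = 3.0            # watt-hours per prompt (assumed)
PROMPTS_PER_DAY = 300e6        # prompts per day (assumed)
HOUSEHOLD_KWH_PER_DAY = 29.0   # average U.S. household (assumed)

daily_kwh = WH_PER_PROMPT * PROMPTS_PER_DAY / 1000   # convert Wh to kWh
households = daily_kwh / HOUSEHOLD_KWH_PER_DAY

print(f"Daily energy: {daily_kwh / 1e6:.2f} GWh")     # ~0.90 GWh per day
print(f"Household equivalents: {households:,.0f}")    # ~31,000 homes
```

Change the assumed prompt count and the household equivalents scale with it one-to-one, which is part of why published estimates vary so widely.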
AI is incredibly thirsty
Another way to see the scale of the problem: tell-tale spikes in water usage. When OpenAI was in the final month of training its latest model, GPT-4, at a group of Microsoft data centers in West Des Moines, Iowa, those facilities pumped in 11.5 million gallons of water, about 6 percent of the district's entire supply. West Des Moines told Microsoft not to add more data centers unless it could reduce its water usage, echoing a similar problem in Arizona and a 2021 water fight in Oregon over Google data centers.
There is good news, of course. Data center water is increasingly drawn from non-potable sources, and companies are figuring out how to use less of it in the first place. Some data centers are using special HVAC systems, which decrease water usage even as they add to the electricity bill.
But hey, how about the exponential growth in wind and solar power? Surely that can drive our AI revolution, right?
Not so fast, say researchers, who point out that it's impossible to tell whether your AI query is going to a data center in renewables-friendly Europe, coal-friendly India or oil-friendly Saudi Arabia. Even Europe isn't greening its grid fast enough to keep pace with Silicon Valley's AI obsession.
"Renewable energy is definitely growing," Sasha Luccioni says. "The problem is it's not growing fast enough to keep up with AI's growth."
Tech companies are trying to plug that gap with carbon credits, which, as a recent Bloomberg investigation points out, aren't the same as taking emissions out of the atmosphere. Microsoft and Amazon rely on credits for more than 50 percent of their so-called renewable energy, the report said.
Meta is a little better, with just 18 percent of its allegedly green energy coming from carbon credits. (Luccioni also credits Meta with being somewhat better on the AI data disclosure front, in part because the company currently has less skin in the AI game.)
Can AI help us be more green?
Even if AI-focused data centers were 100 percent powered by wind, solar, hydro and nuclear, that still means they're calling dibs on green power that belongs to all of us.
This isn't a theoretical debate; a conflict over Amazon siting new data centers next to a 2.5 GW nuclear power station in Pennsylvania, then fighting locals for its output, appears to be the first in a wave of similar legal battles now ramping up around other nuclear stations.
Are there ways in which using AI is worth such a power suck? Might AI-powered climate research actually help us model extreme weather better, maybe even help us design carbon capture solutions that could scale up fast enough to tackle global warming?
That's a real possibility, and one for a future story. But one thing's certain: few of us, from Gen Z students handing in ChatGPT-written papers to boomers posting AI cat pictures on Facebook, are using AI to save our warming planet. Perhaps we'd be better off leaving this tool to the people who are.