The Electrical Grid Is Suffering: Not Because of Electric Cars, but Because of Huge AI Demand

  • AI inference consumes 33 times more energy than machines running a traditional algorithm.

  • Electricity demand from data centers is set to increase sixfold over the next 10 years.


When users send a request to an artificial intelligence chatbot like ChatGPT, servers in a data center compute the response. This is similar to any cloud service, but generative AI models, which have been trained to predict responses, use 33 times more energy than machines running a traditional algorithm designed for a specific task.

“It’s wildly inefficient from a computational perspective,” Sasha Luccioni, a researcher at Hugging Face, told BBC News.
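
To get a sense of what that 33x factor means at scale, here is a rough back-of-the-envelope sketch in Python. Only the 33x multiplier comes from the article; the 0.3 Wh baseline per conventional request and the daily query volume are illustrative assumptions, not figures from the study.

```python
# Back-of-the-envelope: how the 33x inference factor scales with query volume.
# ASSUMPTIONS: the 0.3 Wh baseline and the daily query volume are illustrative
# only; the 33x multiplier is the figure cited in the article.

BASELINE_WH_PER_QUERY = 0.3        # assumed energy for a conventional request
GENAI_MULTIPLIER = 33              # factor cited by Luccioni and colleagues
QUERIES_PER_DAY = 100_000_000      # assumed daily query volume

genai_wh_per_query = BASELINE_WH_PER_QUERY * GENAI_MULTIPLIER

daily_baseline_mwh = BASELINE_WH_PER_QUERY * QUERIES_PER_DAY / 1e6  # Wh -> MWh
daily_genai_mwh = genai_wh_per_query * QUERIES_PER_DAY / 1e6

print(f"Traditional algorithm: {daily_baseline_mwh:,.0f} MWh/day")
print(f"Generative AI inference: {daily_genai_mwh:,.0f} MWh/day")
```

Under these assumptions, the gap works out to hundreds of megawatt-hours per day for a single service, which is the inefficiency Luccioni is pointing at.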

The Warning From AI Ethicists

A study published as a preprint by Luccioni and other authors warns of the high environmental cost of large AI models, both in the amount of energy they demand and in the tons of carbon they emit.

When companies like Meta, OpenAI, and Google train their large language models (Llama 3, GPT-4o, and Gemini 1.5 Ultra, respectively), they run hundreds of thousands of graphics cards or TPUs that consume huge amounts of electricity, especially as the models grow in size (from millions to hundreds of billions of parameters).

1,000 Terawatt-Hours in Data Centers

However, this isn't the whole story. Generative AI is becoming increasingly popular, which means that millions of people are making queries every second. This sets the wheels of AI inference turning, that is, the process by which machine learning models generate predictions or new content from each query.

The process is resource-intensive because it requires data centers worldwide to generate content from scratch. As a result, the electricity consumption of these data centers is projected to increase from 460 TWh in 2022 to over 1,000 TWh by 2026, equivalent to the electricity consumption of Japan, which has a population of 125 million.
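
As a minimal sketch of the arithmetic behind that projection, assuming a constant annual growth rate between the two figures quoted above (460 TWh in 2022, 1,000 TWh in 2026):

```python
# Implied compound annual growth rate from the projection cited above:
# 460 TWh in 2022 rising to 1,000 TWh in 2026 (a four-year span).
start_twh, end_twh, years = 460, 1_000, 2026 - 2022

cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # roughly 21% per year
```

A growth rate of roughly 21% per year, sustained over several years, is what has grid operators worried.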

Executives at National Grid, the UK's grid operator, are concerned about the sixfold increase in electricity demand from data centers that they predict will occur over the next decade due to the use of AI. This comes on top of the growing demand from the electrification of transportation and heating.

In the U.S., grid operators are also feeling the impact of data center demand: “They’re getting hit with data center demands at the exact same time as we have a renaissance taking place, thanks to government policy, in domestic manufacturing,” a consultant at Wood Mackenzie says.

Flirting with Nuclear Power

New small models like Phi-3 or Gemini Nano, which run directly on our devices, can help mitigate some of these issues. Meanwhile, hardware is becoming more efficient, which saves energy in both training and inference.

However, as hardware improves, big tech companies are also competing to train larger, more capable models. This requires more data centers to store the training data and more energy to power all the computing.

It’s a classic catch-22. While data centers often have on-site generation for self-consumption, the situation will require more drastic solutions. This is why companies like Microsoft are beginning to invest in small modular nuclear reactors, while OpenAI CEO Sam Altman is focusing on nuclear fusion.

Image | Microsoft

