AI systems already consume more power than small countries

Artificial Intelligence (AI) is quickly transforming our everyday lives, whether by changing the way we process information or by putting professional-grade tools into people's homes. However, this increased use of AI comes at a significant cost – electricity consumption. While many view AI as just another computer program, modern AI models actually perform enormous numbers of calculations across billions of parameters, making them far more power-hungry than conventional software. Now, a new report from Schneider Electric (via TechRadar) suggests that AI systems already consume 4.3 gigawatts (GW) of power worldwide, comparable to the electricity demand of a small country.

According to the study, although generative AI has been widely available for less than a year, its power consumption is rising rapidly as demand grows. With an annual growth rate of 26-36%, AI could account for 13.5 to 20 GW of global electricity demand by 2028. Data centers as a whole may consume up to 90 GW of power by 2028.
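As a sanity check, those projections are consistent with simple compound growth from today's 4.3 GW baseline. The sketch below assumes a 2023 starting point and a 2028 horizon, which the report does not state explicitly.

```python
# Rough compound-growth check of the report's projection.
# Assumptions: 4.3 GW baseline in 2023, 26-36% annual growth, horizon of 2028.
baseline_gw = 4.3
years = 2028 - 2023

for annual_growth in (0.26, 0.36):
    projected = baseline_gw * (1 + annual_growth) ** years
    print(f"{annual_growth:.0%} growth -> ~{projected:.1f} GW by 2028")

# Prints roughly 13.7 GW and 20.0 GW, matching the report's 13.5-20 GW range.
```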

Training and workloads

At present, there is roughly a 20:80 split between AI training and inference workloads. Although training a model is more power-intensive per task, inference workloads are expected to claim an even larger share of total consumption as companies settle on their AI models and put them into production (see the back-of-the-envelope breakdown below). This shift also makes future energy consumption harder to predict, since the two kinds of task have very different power profiles.
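To put that split in context, a quick calculation applies the 20:80 ratio directly to the 4.3 GW figure cited above; the resulting breakdown is an illustration under that assumption, not a number from the report.

```python
# Illustrative breakdown of today's estimated AI power draw.
# Assumption: the 20:80 training/inference split applies directly to the 4.3 GW total.
total_gw = 4.3
training_share, inference_share = 0.20, 0.80

print(f"Training:  ~{total_gw * training_share:.2f} GW")   # ~0.86 GW
print(f"Inference: ~{total_gw * inference_share:.2f} GW")  # ~3.44 GW
```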

Additionally, there is one long-standing issue with data centers: cooling. Running these AI models generates a significant amount of heat, which poses safety hazards and risks premature component failure. The cooling systems needed to manage that heat are themselves power-intensive and often rely on substantial water usage. This heavy reliance on water has drawn the attention of experts, as data centers often reroute and modify water sources to meet their cooling needs.

Catering to growing power consumption

Although the report offers no ready solution to the cooling challenges, it recommends that data centers transition from conventional 120/208V distribution to the more robust 240/415V, which can better accommodate the high power draw of AI systems.
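The reasoning behind the voltage recommendation is straightforward: for the same current through the same conductors, delivered power scales with voltage. The sketch below illustrates this with a hypothetical 30 A three-phase branch circuit; the current rating and unity power factor are assumptions chosen purely for illustration.

```python
# Power delivered by a three-phase circuit: P = sqrt(3) * V_line * I (unity power factor).
# Assumption: a hypothetical 30 A branch circuit, purely for illustration.
import math

current_a = 30
for v_line in (208, 415):
    power_kw = math.sqrt(3) * v_line * current_a / 1000
    print(f"{v_line} V line-to-line at {current_a} A -> ~{power_kw:.1f} kW")

# ~10.8 kW at 208 V versus ~21.6 kW at 415 V: roughly double the power over the
# same wiring, which is why higher-voltage distribution suits dense AI racks.
```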
