While the hype around AI centres on the promise that it will help save the planet, the reality is that AI-driven technology and the industry surrounding it are currently doing just the opposite.
That’s because the complex algorithms and machine learning models behind AI don’t run on thin air; they require enormous computational power from thousands of servers housed in data centres around the world.
These data centres, operated by the tech giants pushing AI, consume vast amounts of electricity and water to keep running.
What’s even more alarming is not simply that data centres require energy, but the sheer amount they consume.
According to researcher Jesse Dodge, ‘one query to ChatGPT uses approximately as much electricity as could light one light bulb for about 20 minutes’. Multiply that by millions of users daily, and the total energy consumption becomes staggeringly high. So this isn’t just a minor hiccup in our fight against climate change – it’s a glaring contradiction.
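To make that multiplication concrete, here is a rough back-of-the-envelope sketch. The bulb wattage and the daily query volume are illustrative assumptions of mine, not figures from the post or from Dodge’s research; the point is only to show how a tiny per-query cost scales.

```python
# Back-of-the-envelope illustration of the scale described above.
# Assumptions (illustrative, not from the post): a 60 W incandescent bulb
# and 10 million ChatGPT queries per day.

BULB_WATTS = 60                # assumed bulb power
MINUTES_PER_QUERY = 20         # "one query ~ one bulb for about 20 minutes"
QUERIES_PER_DAY = 10_000_000   # assumed daily query volume

# Energy per query in watt-hours: power (W) x time (h)
wh_per_query = BULB_WATTS * (MINUTES_PER_QUERY / 60)      # 20 Wh

# Total daily energy, converted to megawatt-hours
daily_mwh = wh_per_query * QUERIES_PER_DAY / 1_000_000    # 200 MWh per day

print(f"~{wh_per_query:.0f} Wh per query, ~{daily_mwh:.0f} MWh per day")
```

Under those assumptions the total comes to roughly 200 MWh every day for queries alone, before counting the far larger cost of training the models in the first place.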
To put things into perspective, there are over 8,000 data centres worldwide, a number that has nearly doubled since 2015. Collectively, these centres now consume as much electricity as the entire country of Italy. And it doesn’t stop there.
As AI becomes more widespread and AI tools grow more sophisticated, energy demand will only increase. In just three years, AI servers could consume as much energy as all of Sweden or the Netherlands, according to research.
https://lnkd.in/dDYFrGDc