AI Is Eating Data Center Power Demand—and It’s Only Getting Worse
A new analysis of the AI hardware being produced, and how it is being used, attempts to estimate the vast amount of electricity AI now consumes.
Looking at AI's energy use, he says, has grown more urgent over the past few years because of the widespread adoption of ChatGPT and other large language models that consume massive amounts of energy. Some attempts to quantify AI's energy consumption have started from the user side: calculating the amount of electricity that goes into a single ChatGPT query, for instance. Sasha Luccioni, an AI and energy researcher and the climate lead at the open-source machine-learning platform Hugging Face, cautioned against leaning too hard on some of the new paper's conclusions, given the number of unknowns at play.