The Environmental Toll of a Single ChatGPT Query Is Absolutely Wild
Queries to OpenAI's ChatGPT eat up a surprising amount of water and electricity to cool data centers and run calculations.
A single query consumes roughly the equivalent of a full bottle of water and enough electricity to light 14 LED bulbs for an hour, according to The Washington Post's consultation with UC Riverside researcher Shaolei Ren. That's an appreciable environmental toll on its own, but a staggering one once you multiply it across the number of users worldwide. Places like Arizona and Iowa are already feeling the tension between serving the public's needs and the insatiable appetite of AI data centers for water and power, even as those facilities bring tax revenue and jobs to these locales. But something has got to give, especially as big tech companies like Google and Microsoft, a close business partner of OpenAI, report that they're using more resources than ever despite pledges of carbon neutrality.
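To make the scaling argument concrete, here is a rough back-of-envelope sketch in Python. The per-query figures (one bottle of water, 14 LED bulbs for an hour) come from the estimate above; the bottle size, bulb wattage, and worldwide query volume are illustrative assumptions, not numbers reported by The Washington Post or Ren.

```python
# Back-of-envelope scaling of the per-query estimate cited above.
# Per-query inputs mirror the article's figures (one bottle of water,
# 14 LED bulbs for one hour); bottle volume, bulb wattage, and the
# worldwide query count are assumptions for illustration only.

BOTTLE_VOLUME_L = 0.5                                  # assumed ~500 mL bottle
LED_BULB_WATTS = 10                                    # assumed ~10 W per LED bulb
ENERGY_PER_QUERY_KWH = 14 * LED_BULB_WATTS / 1000      # 14 bulbs for 1 hour = 0.14 kWh
WATER_PER_QUERY_L = BOTTLE_VOLUME_L

QUERIES_PER_DAY = 100_000_000                          # hypothetical global query volume

water_per_day_l = WATER_PER_QUERY_L * QUERIES_PER_DAY
energy_per_day_kwh = ENERGY_PER_QUERY_KWH * QUERIES_PER_DAY

print(f"Water:  {water_per_day_l / 1e6:,.0f} million liters per day")
print(f"Energy: {energy_per_day_kwh / 1e6:,.1f} GWh per day")
```

Even under these placeholder numbers, daily consumption lands in the tens of millions of liters of water and the gigawatt-hours of electricity, which is the scale behind the strain on places like Arizona and Iowa described above.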