AI could gobble up a quarter of all electricity in the U.S. by 2030 if it doesn’t break its energy addiction, says Arm Holdings exec
‘ChatGPT requires 15 times more energy than a traditional web search,’ warns Arm’s Ami Badani at Fortune's Brainstorm AI conference in London.
Right now, generative AI has an “insatiable demand” for electricity to power the tens of thousands of compute clusters needed to run large language models like OpenAI’s GPT-4, warned Ami Badani, chief marketing officer of chip design company Arm Holdings. If generative AI is ever going to run on every mobile device, from laptops and tablets to smartphones, it will have to scale without overwhelming the electricity grid in the process.

Take OpenAI’s text-to-video model Sora, for example: it can create super realistic or stylized video clips up to 60 seconds long purely from user text prompts.