AI supercomputers could need as much power as a major city center by 2030, study says


Supercomputers used to train and run AI could require as much power as nine nuclear reactors by the end of the decade, according to new research from Epoch AI.

For context, today's largest supercomputer, the Colossus system that Elon Musk's xAI built to full scale in 214 days, is estimated to have cost $7 billion and, per the company's website, is stacked with 200,000 chips. Epoch AI attributes the projected growth to a shift in how these machines are used: once purely research tools, supercomputers are now "industrial machines delivering economic value." Earlier this month, President Donald Trump took to Truth Social to celebrate a $500 billion investment from Nvidia to build AI supercomputers in the US.
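For a rough sense of the scale involved, here is a back-of-envelope sketch. The reactor output (~1 GW each) and the all-in power per chip (~1.5 kW, covering the accelerator plus cooling and networking overhead) are illustrative assumptions, not figures from the study or the article; only the nine-reactor equivalent and the 200,000-chip count come from the reporting above.

```python
# Back-of-envelope sketch of the scale implied by "nine nuclear reactors
# of power". Reactor output and per-chip power are illustrative
# assumptions, not numbers from the study or the article.

REACTOR_OUTPUT_W = 1e9   # ~1 GW per large nuclear reactor (assumed)
NUM_REACTORS = 9         # equivalent cited by the research
CHIP_ALL_IN_W = 1_500    # hypothetical all-in watts per AI chip,
                         # including cooling/networking overhead

total_power_w = NUM_REACTORS * REACTOR_OUTPUT_W    # ~9 GW projected by 2030
chips_supported = total_power_w / CHIP_ALL_IN_W    # ~6 million chips

colossus_chips = 200_000                           # per xAI's website
colossus_power_w = colossus_chips * CHIP_ALL_IN_W  # ~0.3 GW under these assumptions

print(f"Projected 2030 demand: ~{total_power_w / 1e9:.0f} GW")
print(f"Chips that demand could power: ~{chips_supported / 1e6:.0f} million")
print(f"Colossus today (assumed): ~{colossus_power_w / 1e9:.1f} GW")
```

Under these assumed figures, the projected 2030 demand would be roughly 30 times the power footprint of the Colossus system as it stands today.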
