AI supercomputers could need as much power as a major city by 2030, study says
Supercomputers used to train and run AI could require the power equivalent of nine nuclear reactors by the end of the decade, according to new research from Epoch AI.
For context, today's largest supercomputer is the Colossus system, which Elon Musk's xAI built to full scale in 214 days. It is estimated to have cost $7 billion to build and, per the company's website, is stacked with 200,000 chips.

Epoch AI attributes this growth to a shift in how supercomputers are used: once primarily research tools, they are now "industrial machines delivering economic value."

Earlier this month, President Donald Trump took to Truth Social to celebrate a $500 billion investment from Nvidia to build AI supercomputers in the US.