Training AI models might not need enormous data centres

Eventually, models could be trained without any dedicated hardware at all

Once, the world’s richest men competed over yachts, jets and private islands. Now they compete over the size of the computing clusters used to train artificial intelligence. Just 18 months ago, OpenAI trained GPT-4, its then state-of-the-art large language model (LLM), on a network of around 25,000 then state-of-the-art graphics processing units (GPUs) made by Nvidia.

This article appeared in the Science & technology section of the print edition under the headline “I can do it with a distributed heart”
