
Latest AI training benchmarks show Nvidia has no competition. Across a suite of neural network tasks, competitors' chips didn't even come close to Nvidia's GPUs.


Most of MLPerf's tasks are by now well-established neural nets that have been in development for years, such as 3-D U-Net, a program for studying volumetric data for tasks such as solid-tumor detection, introduced by Google's DeepMind back in 2016.

In the test to fine-tune Meta's Llama 2 70B, Nvidia took just a minute and a half with a collection of 1,024 of its "H100" GPU chips, a mainstream part currently powering AI workloads across the industry. An 8-way Nvidia H100 system, aided by two of Advanced Micro Devices' EPYC processors and assembled by open-source vendor Red Hat, completed the same task in just over 31 minutes.
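
For context, the MLPerf Llama 2 70B task is a parameter-efficient (LoRA-style) fine-tune, in which small low-rank adapter matrices are trained on top of the frozen base model rather than updating all 70 billion weights. The sketch below, using Hugging Face Transformers and PEFT, is purely illustrative and is not the MLPerf reference implementation; the dataset, hyperparameters, and single-node setup are assumptions chosen only to show the shape of such a job.

```python
# Illustrative LoRA fine-tuning sketch. NOT the MLPerf reference code:
# dataset, hyperparameters, and hardware assumptions are placeholders.
import torch
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)
from peft import LoraConfig, get_peft_model

model_name = "meta-llama/Llama-2-70b-hf"  # gated model; a smaller Llama works for testing

tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token

model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.bfloat16, device_map="auto")

# Attach low-rank adapters so only a small fraction of parameters are trained.
lora = LoraConfig(r=16, lora_alpha=32, lora_dropout=0.05,
                  target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM")
model = get_peft_model(model, lora)

# Placeholder instruction-tuning data; MLPerf prescribes its own dataset.
dataset = load_dataset("tatsu-lab/alpaca", split="train[:1%]")
def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)
tokenized = dataset.map(tokenize, batched=True,
                        remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="llama2-ft",
                           per_device_train_batch_size=1,
                           gradient_accumulation_steps=8,
                           num_train_epochs=1, bf16=True, logging_steps=10),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

At MLPerf scale the same workload is sharded across hundreds or thousands of GPUs, which is what separates the 1,024-chip result measured in minutes from the 8-way system's roughly half-hour run.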
