Latest AI training benchmarks show Nvidia has no competition. Across a suite of neural network tasks, competitors' chips didn't even come close to Nvidia's GPUs.
Most of MLPerf's tasks are by now well-established neural networks that have been in development for years, such as 3-D U-Net, a program for analyzing volumetric data for purposes such as solid-tumor detection, introduced by Google's DeepMind back in 2016.

In the test to fine-tune Meta's Llama 2 70B, Nvidia took just a minute and a half with a collection of 1,024 of its H100 GPU chips, a mainstream part currently powering AI workloads across the industry. An 8-way Nvidia H100 system, aided by two of Advanced Micro Devices' EPYC processors and assembled by open-source vendor Red Hat, took just over 31 minutes.