
Is the von Neumann bottleneck impeding AI computing?


The von Neumann architecture, which separates compute and memory, is perfect for conventional computing. But it creates a data traffic jam for AI.

For AI computing, whose operations are simple, numerous, and highly predictable, a conventional processor ends up working below its full capacity while it waits for model weights to be shuttled back and forth from memory. About a decade ago, the von Neumann bottleneck wasn't a significant issue, because processors and memory themselves weren't very energy-efficient, so the energy spent moving data between them was comparatively small, said Le Gallo-Bourdeau.

This module brings the speed and bandwidth density of fiber optics to the edge of chips, supercharging their connectivity and sharply reducing model training time and energy costs.
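To make the imbalance concrete, here is a rough, illustrative sketch that is not from the article: a roofline-style estimate for a single matrix-vector product against a layer of model weights. The helper gemv_times and the hardware figures (peak compute, memory bandwidth, fp16 weights) are assumptions chosen only to show the shape of the problem.

# Illustrative roofline-style estimate (assumed figures, not from the article).
PEAK_FLOPS = 100e12        # assumed peak compute throughput: 100 TFLOP/s
MEM_BANDWIDTH_BPS = 2e12   # assumed off-chip memory bandwidth: 2 TB/s
BYTES_PER_WEIGHT = 2       # assumed fp16 weights

def gemv_times(n: int) -> tuple[float, float]:
    """Compute time and memory time (seconds) for one n x n matrix-vector product."""
    flops = 2 * n * n                        # one multiply + one add per weight
    bytes_moved = n * n * BYTES_PER_WEIGHT   # streaming the weights dominates traffic
    return flops / PEAK_FLOPS, bytes_moved / MEM_BANDWIDTH_BPS

t_compute, t_memory = gemv_times(8192)
print(f"compute time: {t_compute * 1e6:.1f} us")   # ~1.3 us of arithmetic
print(f"memory time:  {t_memory * 1e6:.1f} us")    # ~67 us spent moving weights

Under these assumed numbers, moving the weights takes roughly 50 times longer than the arithmetic, so the processor mostly sits idle; that idle time is the von Neumann bottleneck, and shortening or widening the path between compute and memory (for example, by bringing fiber optics to the chip edge) is what narrows it.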

Read more on:

power

decades

AI computing

Related news:

New Quasi-Moon Discovered Orbiting Earth, but It's Been Around for Decades

DARPA wants AI to know when it's being an energy hog

The most powerful Raspberry Pi just launched with serious power and an RGB keyboard