Micron sells out its entire supply of HBM3E memory for AI accelerators | Sold out for all of 2024 and most of 2025


Micron is reaping the benefits of being the first out of the gate with HBM3E memory (HBM3 Gen 2 in Micron-speak), with much of it being used...

On its fiscal Q2 2024 earnings call, Micron Technology CEO Sanjay Mehrotra announced that the company had sold out its entire supply of HBM3E memory for 2024, with most of its 2025 output already allocated. The memory is expected to be used extensively by tech giants such as Meta and Microsoft, which have already deployed hundreds of thousands of AI accelerators from Nvidia. As noted by Tom's Hardware, Micron's first HBM3E products are 24GB 8-Hi stacks with a 1024-bit interface, a 9.2 GT/s data transfer rate, and 1.2 TB/s of peak bandwidth.
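As a quick sanity check, the quoted 1.2 TB/s peak bandwidth follows directly from the other two numbers in the spec: interface width times per-pin transfer rate, divided by eight to convert bits to bytes. A minimal sketch (variable names are illustrative, not Micron's):

```python
# Reproduce the peak-bandwidth figure from the HBM3E specs quoted above.
interface_bits = 1024      # bus width per 8-Hi stack, in bits
transfer_rate_gtps = 9.2   # giga-transfers per second per pin

# bits/s -> bytes/s: divide by 8
bandwidth_gbps = interface_bits * transfer_rate_gtps / 8
print(f"{bandwidth_gbps:.1f} GB/s")  # 1177.6 GB/s, i.e. ~1.2 TB/s
```

The result, 1177.6 GB/s, rounds to the ~1.2 TB/s peak bandwidth Micron advertises per stack.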



Related news:

- Micron Samples 256 GB DDR5-8800 MCR DIMMs: Massive Modules for Massive Servers
- Micron Sells Out Entire HBM3E Supply for 2024, Most of 2025
- Micron's Earnings and Reddit's IPO | Bloomberg Technology