Micron sells out its entire supply of high-bandwidth HBM3E memory for AI accelerators | Sold out for all of 2024 and most of 2025
Micron is reaping the benefits of being the first out of the gate with HBM3E memory (HBM3 Gen 2 in Micron-speak), with much of it being used...
At its Q2 2024 earnings call, Micron Technology CEO Sanjay Mehrotra announced that the company had sold out its entire supply of high-bandwidth HBM3E memory for all of 2024. The memory is expected to be used extensively by tech giants Meta and Microsoft, which have already deployed hundreds of thousands of AI accelerators from Nvidia. As noted by Tom's Hardware, Micron's first HBM3E products are 24GB 8-Hi stacks with a 1024-bit interface, a 9.2 GT/s data transfer rate, and 1.2 TB/s of peak bandwidth.
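The quoted peak bandwidth follows directly from the other two figures: a 1024-bit interface moving 9.2 giga-transfers per second works out to roughly 1.2 TB/s. A quick sketch of that arithmetic (variable names are illustrative, not Micron's):

```python
# Sanity-check the quoted HBM3E figures: a 1024-bit interface at
# 9.2 GT/s should yield roughly the stated 1.2 TB/s peak bandwidth.

INTERFACE_BITS = 1024    # bus width per stack, from the article
TRANSFER_RATE_GTS = 9.2  # giga-transfers per second, from the article

bytes_per_transfer = INTERFACE_BITS / 8                  # 128 bytes per transfer
peak_gb_per_s = bytes_per_transfer * TRANSFER_RATE_GTS   # gigabytes per second
print(f"{peak_gb_per_s / 1000:.2f} TB/s")                # ~1.18 TB/s, which rounds to 1.2
```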