Micron Sells Out Entire HBM3E Supply for 2024, Most of 2025

Being the first company to ship HBM3E memory has its advantages for Micron: the company revealed that it has already sold out its entire supply of the advanced high-bandwidth memory for 2024, and most of its 2025 production has been allocated as well. Micron's HBM3E (or, as the company alternatively calls it, HBM3 Gen2) was one of the first to be qualified for NVIDIA's updated H200/GH200 accelerators, so it looks like the DRAM maker will be a key supplier to the green company.

“Our HBM is sold out for calendar 2024, and the vast majority of our 2025 supply has already been allocated,” Micron chief executive Sanjay Mehrotra said in prepared remarks for the company's earnings call this week. “We continue to expect HBM bit share to equal our overall DRAM bit share sometime in calendar 2025.”

Micron's first HBM3E product is an 8-Hi 24 GB stack with a 1024-bit interface, a data transfer rate of 9.2 GT/s, and a total bandwidth of 1.2 TB/s. NVIDIA's H200 accelerator for artificial intelligence and high-performance computing will use six of these cubes, giving a total of 141 GB of accessible high-bandwidth memory.
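As a quick sanity check on those figures, the per-stack bandwidth follows directly from the interface width and the data rate, and the H200's capacity from six such stacks; the short sketch below walks through that arithmetic (the variable names are illustrative, and note that NVIDIA lists 141 GB accessible rather than the raw 6 × 24 GB = 144 GB).

```python
# Back-of-the-envelope check of the published HBM3E figures (illustrative only).

interface_width_bits = 1024   # per-stack interface width
data_rate_gtps = 9.2          # data transfer rate in GT/s

# Peak bandwidth per stack: width (bits) * rate (GT/s) / 8 bits per byte -> GB/s
per_stack_bandwidth_gbs = interface_width_bits * data_rate_gtps / 8
print(f"Per-stack bandwidth: {per_stack_bandwidth_gbs:.0f} GB/s (~1.2 TB/s)")

stacks_per_h200 = 6
capacity_per_stack_gb = 24    # 8-Hi stack, 24 GB

raw_capacity_gb = stacks_per_h200 * capacity_per_stack_gb
print(f"Raw capacity across six stacks: {raw_capacity_gb} GB (141 GB listed as accessible)")
```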

“We are on track to receive several hundred million dollars in revenue from HBM in fiscal 2024 and expect HBM revenue to be in line with our DRAM and overall gross margin beginning in the fiscal third quarter,” Mehrotra said.

The company has also started sampling its 12-Hi 36 GB stacks, which offer 50% more capacity. These KGSDs will ramp up in 2025 and will be used for next-generation AI products. Meanwhile, it does not look like NVIDIA's B100 and B200 will use the 36 GB HBM3E stacks, at least initially.

Demand for artificial intelligence servers set records last year, and it looks set to remain high this year as well. Some analysts believe that NVIDIA's A100 and H100 processors (as well as their various derivatives) controlled up to 80% of the entire AI processor market in 2023. And while NVIDIA will face tougher competition this year from AMD, AWS, D-Matrix, Intel, Tenstorrent, and other companies, it looks like NVIDIA's H200 will still be the processor of choice for AI training, especially for big players like Meta and Microsoft, which already operate fleets consisting of millions of NVIDIA accelerators. With that in mind, being a primary supplier of HBM3E for NVIDIA's H200 is a big deal for Micron, as it finally enables the company to capture a larger share of the HBM market, which is currently dominated by SK Hynix and Samsung and where Micron controlled only about 10% as of last year.

Meanwhile, because each DRAM device inside an HBM stack has a wide interface, it is physically larger than regular DDR4 or DDR5 ICs. As a result, the HBM3E ramp will affect Micron's bit supply of commodity DRAM, the company said.

“Ramp of HBM production will curb supply growth in non-HBM products,” Mehrotra said. “Industry-wide, HBM3E uses nearly three times the wafer supply as DDR5 to produce a given number of bits in the same technology node.”
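To put that trade ratio in concrete terms, the small sketch below applies the roughly 3:1 figure Mehrotra cites to a hypothetical wafer allocation (the wafer count is a made-up placeholder, not a Micron number): every wafer shifted from DDR5 to HBM3E yields about one third of the bits it would have produced as commodity DRAM.

```python
# Illustrative arithmetic for the ~3:1 HBM3E-to-DDR5 wafer trade ratio described above.
# The wafer count is a hypothetical placeholder, not a figure from Micron.

trade_ratio = 3.0                  # HBM3E needs ~3x the wafers of DDR5 for the same bits
wafers_shifted_to_hbm3e = 10_000   # assumed wafers reallocated from DDR5 to HBM3E

# Normalize output in "DDR5 wafer-equivalents" of bits
ddr5_bits = wafers_shifted_to_hbm3e                  # bits these wafers would yield as DDR5
hbm3e_bits = wafers_shifted_to_hbm3e / trade_ratio   # bits they yield as HBM3E instead

print(f"HBM3E bit output vs. DDR5 on the same wafers: {hbm3e_bits / ddr5_bits:.0%}")
print(f"Commodity-bit shortfall: {ddr5_bits - hbm3e_bits:.0f} wafer-equivalents")
```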
