New MU report
> Yonhap reported Samsung plans to start mass production of HBM4 as early as this month for Nvidia’s next-gen AI platform Vera Rubin.
>
> The report says Samsung has cleared Nvidia qualification and secured orders, while Micron has guided to ramp HBM4 in Q2 2026.
Samsung will reportedly begin mass production of [sixth-generation high-bandwidth memory](https://www.sammobile.com/news/samsung-pushes-hbm3e-chips-ai-servers-while-improving-hbm4-yield/) (HBM4) chips this month. These advanced semiconductor memory chips are said to be used in Nvidia’s next-generation AI accelerator system, called Vera Rubin, which is expected to launch later this year.
A [report](https://en.yna.co.kr/view/AEN20260208001100320) from South Korea claims that Samsung could start shipping HBM4 chips to Nvidia as early as next week, coinciding with the Lunar New Year holiday. This [lines up with an earlier report](https://www.sammobile.com/news/samsung-hbm4-mass-production-start-next-month-nvidia/) about the mass production schedule. Those chips will be used in Nvidia’s Rubin GPUs, which are designed for generative AI servers and hyperscale data centers. Samsung has recently invested heavily to [boost its HBM4 production capacity](https://www.sammobile.com/news/samsung-increase-hbm-production-capacity-drastically/).
Current-generation AI accelerators, such as the AMD MI350 and Nvidia B200, use fifth-generation high-bandwidth memory (HBM3E) chips. Micron and SK Hynix are the top two providers of HBM3E chips, while Samsung trails those firms. However, in the HBM4 segment, Samsung is expected to become the largest supplier.
Confirmation of the news that Nvidia is going with Samsung and SK Hynix and leaving out Micron. Though Nvidia was reportedly only going to source 5–10% of its Vera Rubin HBM4 from Micron anyway, supposedly over performance issues.