Samsung’s AI memory chips pass Nvidia tests
While a supply deal has yet to be signed, one is expected soon, with shipments anticipated by Q4 2024. Samsung’s redesigned HBM3E chips aim to address previous heat and power consumption issues.
Samsung Electronics’ fifth-generation high bandwidth memory (HBM) chips, or HBM3E, have passed Nvidia’s tests for use in its AI processors. This development marks an important milestone for Samsung as it aims to compete with local rival SK Hynix in the advanced memory chip market. While a supply deal for the approved eight-layer HBM3E chips is expected soon, Samsung’s 12-layer version has yet to pass Nvidia’s tests.
HBM chips, first introduced in 2013, are a type of dynamic random access memory (DRAM) designed to save space and reduce power consumption. They are crucial for AI graphics processing units (GPUs), which handle large volumes of data. Since last year, Samsung has been working to address heat and power consumption issues in its HBM3E design to meet Nvidia’s standards.
Despite Samsung’s recent progress, it still trails behind SK Hynix, which is already shipping 12-layer HBM3E chips. Samsung’s shares rose 4.3% on Wednesday, surpassing a 2.4% increase in the broader market. Nvidia’s approval of Samsung’s HBM chips comes amid growing demand for GPUs driven by the generative AI boom, with HBM3E chips expected to become mainstream this year.
Research firm TrendForce predicts HBM3E chips will dominate the market, with SK Hynix estimating an annual growth rate of 82% in HBM chip demand through 2027. Samsung aims for HBM3E chips to account for 60% of its HBM sales by the fourth quarter, a target analysts believe is achievable if Nvidia’s final approval is secured. The HBM market is dominated primarily by SK Hynix, Samsung, and Micron.