A version of Samsung Electronics’ fifth-generation high bandwidth memory (HBM) chips, known as HBM3E, has gained Nvidia’s approval for use in artificial intelligence (AI) processors, according to three people familiar with the situation.
The qualification removes a significant barrier for the world’s largest memory chipmaker, which has been trying to keep up with local rival SK Hynix in the race to offer sophisticated memory chips capable of handling generative AI work.
Samsung and Nvidia have yet to sign a supply agreement for the authorized eight-layer HBM3E chips, but they plan to do so shortly, according to sources, with supplies expected to begin in the fourth quarter of 2024.
The South Korean technology giant’s 12-layer version of HBM3E chips, however, has yet to pass Nvidia’s tests, according to the sources, who declined to be identified because the matter is confidential.
Nvidia declined to comment.
In a statement to Reuters, Samsung stated that product testing was going as planned, and that it was “in the process of optimising its products through collaboration with various customers.” It did not go into additional detail.
HBM is a type of dynamic random access memory (DRAM) standard introduced in 2013 that stacks chips vertically to save space and minimize power consumption. It is an essential component of graphics processing units (GPUs) for AI, assisting in the processing of massive amounts of data generated by complex applications.
Samsung has been attempting to pass Nvidia’s tests for HBM3E and the previous fourth-generation HBM3 models since last year, but has struggled due to heat and power consumption issues, Reuters reported in May, citing sources.
According to people familiar with the situation, the company has since revised its HBM3E design to address those issues.
Following the release of the Reuters article in May, Samsung denied that its chips had failed Nvidia’s tests due to heat and power consumption issues.
“Samsung is still playing catch-up in HBM,” said Dylan Patel, the founder of semiconductor research firm SemiAnalysis.
“While they (will) begin shipping 8-layer HBM3E in Q4, their rival SK Hynix is racing forward shipping (their) 12-layer HBM3E at the same time.”
Samsung Electronics shares closed up 3.0% on Wednesday, outperforming the broader market’s 1.8% gain. SK Hynix finished up 3.4%.
The latest test approval comes after Nvidia recently certified Samsung’s HBM3 chips for use in less advanced processors developed for the Chinese market, as reported by Reuters last month.
Nvidia’s certification of Samsung’s latest HBM chips comes amid surging demand for advanced GPUs driven by the generative AI boom, demand that Nvidia and other AI chipmakers are racing to meet.
HBM3E chips are expected to become the market’s mainstream HBM product this year, with shipments concentrated in the second half, according to research firm TrendForce. SK Hynix, the largest manufacturer, predicts that demand for HBM memory chips in general will grow at an annual rate of 82% through 2027.
Samsung said in July that HBM3E chips would account for 60% of its HBM chip sales by the fourth quarter, a target many analysts believe could be met if Samsung’s latest HBM chips pass Nvidia’s final certification by the third quarter.
Samsung does not disclose revenue breakdowns for individual semiconductor products. According to a Reuters survey of 15 analysts, Samsung’s total DRAM chip revenue for the first six months of this year was estimated at 22.5 trillion won ($16.4 billion), with some estimating that HBM sales may account for about 10% of that.
There are only three major manufacturers of HBM: SK Hynix, Micron (MU.O) and Samsung.
SK Hynix has been Nvidia’s primary HBM chip supplier, and it shipped HBM3E chips to an unidentified customer in late March. Sources have previously said those shipments went to Nvidia.
Micron has also said it will supply Nvidia with HBM3E chips.