CIOInsider India Magazine

Samsung's 8-Layer HBM3E Chip clears Nvidia Use Test

CIO Insider Team | Wednesday, 7 August, 2024

According to reports, a version of Samsung Electronics' fifth-generation high bandwidth memory (HBM) chips, known as HBM3E, has passed Nvidia's tests for use in its artificial intelligence (AI) processors.

This qualification clears a major hurdle for the world's largest memory chip maker, which has struggled to keep up with local rival SK Hynix to supply advanced memory chips that can handle generative AI work.

Samsung and Nvidia have not yet signed a supply agreement for the approved 8-layer HBM3E chip, but sources expect a deal to be reached soon, with supply starting by the fourth quarter of 2024.

However, the Korean technology giant's 12-layer version of the HBM3E chip has not yet passed Nvidia's tests.

HBM is a type of dynamic random access memory (DRAM) standard, first manufactured in 2013, in which chips are stacked vertically to save space and reduce power consumption. It is an important component of the GPUs (graphics processing units) used for AI, helping to process the large amounts of data generated by complex applications.


Samsung has been trying since last year to pass Nvidia's tests for HBM3E and the fourth-generation HBM3 model that preceded it, but has struggled with heat and power consumption issues, sources reported on May 5. The company subsequently revised the HBM3E design to address these issues.

Nvidia's approval of Samsung's latest HBM chip comes amid a surge in demand for sophisticated GPUs created by the generative AI boom, a demand that Nvidia and other AI chipmakers are struggling to meet.

Samsung does not provide a revenue breakdown for specific chip products, but its total DRAM chip sales were estimated at 22.5 trillion won, with some suggesting that about 10 percent of that could come from HBM sales.
