Samsung's new HBM3E memory technology hits 1.2TB/s — you can bet that it will power Nvidia's next AI monster GPU, the GH200

Samsung’s latest memory technology has hit staggering speeds of 9.8Gb/s per pin – equivalent to a total bandwidth of 1.2TB/s per stack – making it more than 50% faster than its predecessor. 
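For context, here's a rough back-of-the-envelope check of how those two figures relate, assuming the 1,024-bit interface that HBM-class stacks typically use (a detail not spelled out in the announcement itself):

9.8 Gb/s per pin × 1,024 pins = 10,035.2 Gb/s, or roughly 1,254 GB/s once converted to bytes – close to the quoted 1.2 TB/s per stack
9.8 Gb/s ÷ 6.4 Gb/s (the top speed of its predecessor) ≈ 1.53, i.e. just over 50% faster than the previous generation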

Samsung’s HBM3E memory, codenamed Shinebolt, is the latest in a series of high-performance memory chips the company has developed for the age of cloud computing and ever-growing demand for computing resources.

Shinebolt is the successor to Icebolt, which comes in capacities of up to 32GB and can reach speeds of up to 6.4Gb/s. These chips were designed specifically to be paired with the best GPUs for AI processing and large language models (LLMs), with the firm ramping up production this year as the emerging industry gathers momentum.

Powering the next generation of AI chips

HBM3E will inevitably find its way into components developed by the likes of Nvidia, and there’s every suggestion it could feature in the GH200, nicknamed Grace Hopper, in light of a recent deal between the two companies.

High-bandwidth memory (HBM) is much faster and more energy efficient than conventional RAM, and uses 3D stacking technology, in which layers of memory chips are stacked vertically on top of one another.

Samsung’s HBM3E stacks more layers than previous iterations through the use of non-conductive film (NCF) technology, which eliminates the gaps between layers in the chip. This maximizes thermal conductivity, allowing the memory to hit much higher speeds and efficiency as a result. 

The chip will power the next generation of AI applications, Samsung claims, as it’ll speed up AI training and inference in data centers and reduce the total cost of ownership (TCO). 

Even more exciting is the prospect that it’ll be included in Nvidia’s next-gen AI chip, the H200. The two companies struck an agreement in September in which Samsung would supply the chipmaker with HBM3 memory units, according to the Korea Economic Daily, with Samsung set to supply roughly 30% of Nvidia’s memory by 2024. 

Should this partnership continue, there’s every possibility HBM3E components will become part of this deal once they enter mass production. 

