World's biggest RAM vendors develop superior memory form factor exclusively for Nvidia, sorry, Intel and AMD
Micron and SK Hynix are producing SOCAMM modules for Nvidia's AI platform

- SOCAMM is a new modular memory form factor exclusive to Nvidia systems
- Micron says SOCAMM offers high bandwidth, low power and a smaller footprint
- SK Hynix plans production of SOCAMM as AI infrastructure demand grows
At the recent Nvidia GTC 2025, memory makers Micron and SK Hynix took the wraps off their respective SOCAMM solutions.
This new modular memory form factor is designed to unlock the full potential of AI platforms and has been developed exclusively for Nvidia’s Grace Blackwell platform.
SOCAMM, or Small Outline Compression Attached Memory Module, is based on LPDDR5X and intended to address growing performance and efficiency demands in AI servers. The form factor reportedly offers higher bandwidth, lower power consumption, and a smaller footprint compared to traditional memory modules such as RDIMMs and MRDIMMs. SOCAMM is specific to Nvidia’s AI architecture and so can’t be used in AMD or Intel systems.
More cost-efficient
Micron announced it will be the first to ship SOCAMM products in volume, and its 128GB SOCAMM modules are designed for the Nvidia GB300 Grace Blackwell Ultra Superchip.
According to the company, the modules deliver more than 2.5 times the bandwidth of RDIMMs while using one-third the power.
The compact 14x90mm design is intended to support efficient server layouts and thermal management.
“AI is driving a paradigm shift in computing, and memory is at the heart of this evolution,” said Raj Narasimhan, senior vice president and general manager of Micron’s Compute and Networking Business Unit.
“Micron’s contributions to the Nvidia Grace Blackwell platform yield performance and power-saving benefits for AI training and inference applications.”
SK Hynix also presented its own low-power SOCAMM solution at GTC 2025 as part of a broader AI memory portfolio.
Unlike Micron, the company didn't go into much detail, but said it is positioning SOCAMM as a key offering for future AI infrastructure and plans to begin mass production “in line with the market’s emergence”.
“We are proud to present our line-up of industry-leading products at GTC 2025,” SK Hynix's President & Head of AI Infra Juseon (Justin) Kim said.
“With a differentiated competitiveness in the AI memory space, we are on track to bring our future as the Full Stack AI Memory Provider forward.”
Wayne Williams is a freelancer writing news for TechRadar Pro. He has been writing about computers, technology, and the web for 30 years. In that time he wrote for most of the UK’s PC magazines, and launched, edited and published a number of them too.