Want your server to access more than 16,000 DIMM slots in one go? This Korean startup claims that its CXL 3.1-based technology can help you scale to more than 16PB of RAM — but it will cost nearly $1 billion

V-color RGB DDR5 O CUDIMM memory sticks lying flat on a surface
(Image credit: V-color)

Ever imagined drawing on up to 16 petabytes of RAM? Well, this startup could be the key to unlocking groundbreaking memory capabilities.

Korean fabless startup Panmnesia unveiled what it described as the world’s first CXL-enabled AI cluster featuring CXL 3.1 switches during the recent 2024 OCP Global Summit.

The solution, according to Panmnesia, has the potential to markedly improve the cost-effectiveness of AI data centers by harnessing Compute Express Link (CXL) technology.

Scalable - but costly

In an announcement, the startup revealed the CXL-enabled AI cluster is built around its main products, the CXL 3.1 switch and CXL 3.1 IP, both of which handle the connections between the CXL memory nodes that store large data sets and the GPU nodes that accelerate machine learning.

Essentially, this will enable enterprises to expand memory capacity by adding memory and CXL devices without having to purchase costly server components.

The cluster can also be scaled to data center levels, the company said, thereby reducing overall costs. The solution also supports connectivity between different types of CXL devices and is able to connect hundreds of devices within a single system.

The cost of such an endeavor could be untenable

While drawing upon 16PB of RAM may seem like overkill, in the age of increasingly demanding AI workloads it’s not exactly out of the question.

In 2023, Samsung revealed it planned to use its 32GB DDR5 DRAM memory die to create a whopping 1TB DRAM module. The motivation behind this move was to help contend with increasingly large AI workloads.

While Samsung has yet to provide a development update, we do know the largest RAM modules Samsung has previously produced were 512GB in size.

First unveiled in 2021, these were aimed at next-generation servers powered by top-of-the-range CPUs (at least by 2021 standards), including AMD’s EPYC ‘Genoa’ and Intel’s Xeon Scalable ‘Sapphire Rapids’ processors.

This is where cost could be a major inhibiting factor for the Panmnesia cluster, however. Pricing on comparable products, such as Dell’s 512GB 370-AHHL memory module, currently stands at just under $2,400 per unit.

That would require significant investment from an enterprise by any standard. If one were to harness Samsung’s top-end 1TB DRAM module, the costs would simply skyrocket, given its expected price last year stood at around $15,000.
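For a rough sense of scale, the figures above can be turned into a back-of-envelope estimate. This is purely illustrative: it counts memory modules only, using the per-unit prices cited in this article, and ignores the cost of CXL switches, servers, networking, and everything else a real deployment would need.

```python
# Back-of-envelope cost of reaching 16PB of RAM, memory modules only.
# Prices are the illustrative figures cited above, not quotes.
PB_IN_GB = 1024 * 1024          # one (binary) petabyte in gigabytes
TARGET_GB = 16 * PB_IN_GB       # 16PB target capacity

def modules_and_cost(module_gb: int, unit_price_usd: int) -> tuple[int, int]:
    """Return (module count, total cost in USD) needed to hit the target."""
    count = TARGET_GB // module_gb
    return count, count * unit_price_usd

# 512GB modules at ~$2,400 each (the Dell 370-AHHL pricing cited above)
print(modules_and_cost(512, 2_400))     # (32768, 78643200)  -> ~$78.6M

# 1TB modules at ~$15,000 each (Samsung's expected pricing)
print(modules_and_cost(1024, 15_000))   # (16384, 245760000) -> ~$245.8M
```

Even on these optimistic module-only numbers, the memory alone runs to tens or hundreds of millions of dollars, which is why the total system cost cited in the headline climbs so much higher.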


News and Analysis Editor, ITPro

Ross Kelly is News & Analysis Editor at ITPro, responsible for leading the brand's news output and in-depth reporting on the latest stories from across the business technology landscape.