Here's why 100TB+ SSDs will play a huge role in ultra-large language models in the near future
Kioxia's AiSAQ solution uses high-capacity drives to handle large datasets

- Kioxia reveals a new project called AiSAQ, which aims to replace RAM with SSDs for AI data processing
- Bigger (read: 100TB+) SSDs could improve RAG at a lower cost than using memory only
- No timeline has been given, but expect Kioxia's rivals to offer similar tech
Large language models often generate plausible but factually incorrect outputs - in other words, they make stuff up. These "hallucinations" can damage reliability in information-critical tasks such as medical diagnosis, legal analysis, financial reporting, and scientific research.
Retrieval-Augmented Generation (RAG) mitigates this issue by integrating external data sources, allowing LLMs to access up-to-date information during generation, which reduces errors and improves contextual accuracy by grounding outputs in current data. Implementing RAG effectively requires substantial memory and storage resources, particularly for large-scale vector data and indices. Traditionally, this data has been stored in DRAM, which, while fast, is both expensive and limited in capacity.
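To make the retrieval step concrete, here is a minimal RAG sketch in Python. Everything in it is an illustrative assumption - the toy corpus, the 384-dimension vectors, and the embed() stub standing in for a real embedding model - rather than anything from Kioxia's implementation:

```python
# Minimal RAG retrieval sketch. Illustrative only: embed() is a
# hypothetical stand-in for a real sentence-embedding model.
import hashlib
import numpy as np

def embed(text: str) -> np.ndarray:
    # Hypothetical embedding: a deterministic random unit vector per text.
    seed = int(hashlib.sha256(text.encode()).hexdigest(), 16) % (2**32)
    v = np.random.default_rng(seed).standard_normal(384)
    return v / np.linalg.norm(v)

corpus = [
    "Kioxia announced AiSAQ at CES.",
    "DiskANN keeps compressed vectors in DRAM and full vectors on disk.",
    "RAG grounds LLM output in retrieved documents.",
]
doc_vecs = np.stack([embed(d) for d in corpus])

def retrieve(query: str, k: int = 2) -> list[str]:
    # Cosine similarity reduces to a dot product on unit vectors.
    scores = doc_vecs @ embed(query)
    return [corpus[i] for i in np.argsort(scores)[::-1][:k]]

# Retrieved passages are prepended to the prompt so the model grounds
# its answer in them instead of relying on parametric memory alone.
context = "\n".join(retrieve("What grounds LLM answers in real data?"))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: ..."
```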
To address these challenges, ServeTheHome reports that at this year’s CES, Japanese memory giant Kioxia introduced AiSAQ (All-in-Storage Approximate Nearest Neighbor Search with Product Quantization), which uses high-capacity SSDs to store vector data and indices. Kioxia claims AiSAQ significantly reduces DRAM usage compared to DiskANN, offering a more cost-effective and scalable approach for supporting large AI models.
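The "product quantization" part is what makes disk-resident search workable: each stored vector is compressed into a few bytes of centroid IDs, so the search structure stays compact while full-precision vectors can sit on the SSD. The sketch below shows the general PQ encoding technique with made-up sizes (128 dimensions, 8 subspaces, 256 centroids each); it is not Kioxia's AiSAQ code:

```python
# Toy product quantization (PQ), the compression used by DiskANN-style
# ANNS. All sizes are assumptions for illustration, not AiSAQ's.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
D, M, K = 128, 8, 256              # vector dim, subspaces, centroids each
sub = D // M                       # each subvector is 16-dimensional
train = rng.standard_normal((10_000, D)).astype(np.float32)

# Train one small codebook per subspace, independently.
codebooks = [
    KMeans(n_clusters=K, n_init=1, random_state=0)
    .fit(train[:, m * sub:(m + 1) * sub]).cluster_centers_
    for m in range(M)
]

def encode(x: np.ndarray) -> np.ndarray:
    # Replace each subvector with the ID of its nearest centroid:
    # 128 float32 values (512 bytes) become 8 one-byte codes.
    return np.array([
        np.argmin(((codebooks[m] - x[m * sub:(m + 1) * sub]) ** 2).sum(axis=1))
        for m in range(M)
    ], dtype=np.uint8)

code = encode(rng.standard_normal(D).astype(np.float32))  # 8 bytes, 64x smaller
```

DiskANN keeps those compact codes in DRAM for fast candidate ranking and fetches full vectors from the SSD only to rerank finalists; judging by the "All-in-Storage" name and Kioxia's DRAM-reduction claim, AiSAQ's pitch is to push even more of that state out to flash.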
More accessible and cost-effective
Shifting to SSD-based storage allows for the handling of larger datasets without the high costs associated with extensive DRAM use.
While accessing data from SSDs introduces slightly higher latency than DRAM, the trade-off brings lower system costs and improved scalability, which can support better model performance and accuracy, since larger datasets provide a richer foundation for learning and inference.
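Some rough arithmetic shows why the economics matter. All figures below are illustrative assumptions (a billion chunks, 768-dimensional float32 embeddings, 32-byte PQ codes), not Kioxia's numbers:

```python
# Back-of-envelope footprint for a large RAG vector store.
# All sizes are illustrative assumptions, not vendor figures.
n_vectors = 1_000_000_000                 # one billion text chunks
dim, bytes_per_float = 768, 4             # e.g. float32 embeddings

raw_tb = n_vectors * dim * bytes_per_float / 1e12
print(f"Full-precision vectors: {raw_tb:.1f} TB")     # ~3.1 TB

# Keeping only compact PQ codes (say 32 bytes/vector) in fast memory:
pq_gb = n_vectors * 32 / 1e9
print(f"Compressed codes: {pq_gb:.0f} GB")            # 32 GB
```

Holding roughly 3TB in DRAM means a large multi-socket server; holding it on SSD is routine, and an all-in-storage design aims to move even the remaining tens of gigabytes of codes off DRAM as well.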
By using high-capacity SSDs, AiSAQ addresses the storage demands of RAG while contributing to the broader goal of making advanced AI technologies more accessible and cost-effective. Kioxia hasn't revealed when it plans to bring AiSAQ to market, but it's safe to bet that rivals like Micron and SK Hynix will have something similar in the works.
ServeTheHome concludes, “Everything is AI these days, and Kioxia is pushing this as well. Realistically, RAG is going to be an important part of many applications, and if there is an application that needs to access lots of data, but it is not used as frequently, this would be a great opportunity for something like Kioxia AiSAQ.”