AI could help turbocharge your SSD’s effective storage by compressing data even further — but don’t delete 7zip just yet
Researchers trained Chinchilla 70B to compress images and audio files far more effectively than conventional algorithms
DeepMind scientists have given compression technology a major upgrade thanks to a large language model (LLM) that has achieved astonishing lossless compression rates with image and audio data.
Using the company’s Chinchilla 70B LLM to drive a compression algorithm, the researchers reduced images to 43.4% and audio files to 16.4% of their original sizes, as detailed in their paper - making it better than some of the best compression software out there.
By contrast, standard image compression algorithm PNG reduces images to 58.5% and FLAC compressors shrink audio to 30.3% of their original file sizes. It means storing so much more on any one of the best SSDs.
Although Chinchilla 70B is trained mainly on text, the researchers achieved these results by leaning on the predictive capabilities of the model, framing the “prediction problem” through the lens of file compression. In other words, they retooled the best qualities of an LLM and found these traits also serve to compress large files.
AI is great at compression – up to a point
The DeepMind researchers showed that due to this equivalence between prediction and compression, any compressor can be used as a conditional generative model – and even the other way around.
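This equivalence rests on a classic result from information theory: a predictor that assigns probability p to the next symbol can, via an entropy coder such as arithmetic coding, encode that symbol in roughly -log2(p) bits, so better prediction directly means shorter output. The rough Python sketch below illustrates the idea with a simple character-frequency model standing in for the LLM's next-token predictor (an assumption for illustration only, not DeepMind's actual setup):

```python
import math
from collections import Counter

def ideal_code_length_bits(message: str, model_probs: dict) -> float:
    """Total bits an ideal entropy coder driven by this model would need:
    the sum of -log2(p) over the symbols of the message."""
    return sum(-math.log2(model_probs[ch]) for ch in message)

message = "abracadabra abracadabra"

# A simple frequency model stands in for the LLM's next-token predictor.
counts = Counter(message)
probs = {ch: n / len(message) for ch, n in counts.items()}

# Baseline: a uniform model over the symbols actually used.
uniform = {ch: 1 / len(counts) for ch in counts}

model_bits = ideal_code_length_bits(message, probs)
uniform_bits = ideal_code_length_bits(message, uniform)

print(f"frequency model: {model_bits:.1f} bits")
print(f"uniform model:   {uniform_bits:.1f} bits")
```

The better model needs fewer bits for the same message, and the same logic scales up: an LLM that predicts the next byte of an image or audio file well can, paired with an entropy coder, compress it well.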
But, the researchers added, the models can only achieve such compression results up to a certain file size, meaning using generative AI as a compression solution may not be practical for everyone.
“We evaluated large pretrained models used as compressors against various standard compressors, and showed they are competitive not only on text but also on modalities they have never been trained on,” the researchers noted.
“We showed that the compression viewpoint provides novel insights on scaling laws since it takes the model size into account, unlike the log-loss objective, which is standard in current language modeling research.”
Due to this scaling limitation, the models used in this research aren’t better than the likes of 7zip once files pass a certain size threshold: the compression gains shrink, and the models are also unlikely to match the speed of conventional compression algorithms.
Keumars Afifi-Sabet is the Technology Editor for Live Science. He has written for a variety of publications including ITPro, The Week Digital and ComputerActive. He has worked as a technology journalist for more than five years, having previously held the role of features editor with ITPro. In that role, he oversaw the commissioning and publishing of long-form content in areas including AI, cyber security, cloud computing and digital transformation.