AMD adds support for super-popular open source AI tool to its most powerful GPU – now imagine what would happen if an AMD APU could run Stable Diffusion out of the box
AMD is taking the fight to Nvidia with open source
AMD is hoping to make AI more accessible to developers and researchers by adding support for PyTorch to its Radeon RX 7900 XTX and Radeon Pro W7900 graphics cards.
Based on the RDNA 3 GPU architecture, these are among AMD's most powerful consumer and workstation GPUs, and they now let users establish a private, cost-effective workflow for machine learning training and inference. Previously, users may have needed to rely on cloud access to compatible GPUs for such AI workloads.
“We are excited to offer the AI community new support for machine learning development using PyTorch built on the AMD Radeon RX 7900 XTX and Radeon Pro W7900 GPUs and the ROCm open software platform,” said Dan Wood, VP of Radeon product management. “This is our first RDNA 3 architecture-based implementation, and we are looking forward to partnering with the community.”
Making the most of ROCm
With the PyTorch machine learning framework now supported on its most powerful graphics cards, AMD is hoping to crack open access to AI workloads for users who lack the means or infrastructure such workloads would otherwise require.
Anybody hoping to take advantage of PyTorch can also use the Radeon Open Compute (ROCm) software stack for GPUs, which spans general-purpose GPU computing, high-performance computing (HPC), and heterogeneous computing.
With AMD ROCm 5.7, users of machines with RDNA 3-based GPUs, as well as CDNA-based AMD Instinct MI series accelerators, can also use PyTorch.
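For readers wanting to try this on a supported Linux machine, a minimal setup might look like the following. This is a sketch, not official AMD guidance: it assumes the PyTorch project's ROCm 5.7 wheel index, and the `HSA_OVERRIDE_GFX_VERSION` line is a commonly reported community workaround for some RDNA 3 cards rather than a documented requirement.

```shell
# Install PyTorch wheels built against ROCm 5.7 (Linux only)
pip3 install --upgrade torch torchvision --index-url https://download.pytorch.org/whl/rocm5.7

# ROCm builds expose the GPU through PyTorch's CUDA API,
# so the usual availability check applies
python3 -c "import torch; print(torch.cuda.is_available())"

# Community-reported workaround for some RDNA 3 cards:
# tell the HSA runtime to treat the GPU as gfx11.0.0
export HSA_OVERRIDE_GFX_VERSION=11.0.0
```

If the availability check prints `True`, training and inference scripts written for CUDA generally run unmodified on the ROCm backend.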
Because ROCm is open source, developers may wish to take it in all kinds of different directions and add support for their own particular AI processing needs. There is, for example, a huge amount of appetite out there to get Stable Diffusion running on AMD accelerated processing units (APUs).
One user, for example, configured a Ryzen 5 4600G APU with 16GB of allocated VRAM so it could run AI workloads – including Stable Diffusion – without too much of a hitch, according to a video they posted on YouTube.
Keumars Afifi-Sabet is the Technology Editor for Live Science. He has written for a variety of publications including ITPro, The Week Digital and ComputerActive. He has worked as a technology journalist for more than five years, having previously held the role of features editor with ITPro. In that role, he oversaw the commissioning and publishing of long-form content in areas including AI, cyber security, cloud computing and digital transformation.