LLM provider backed by MLPerf cofounder bets the barn on mature AMD Instinct MI GPUs, but where are the MI300s?

(Image credit: AMD)

With demand for enterprise-grade large language models (LLMs) surging over the last year or so, Lamini has opened the doors to its LLM Superstation powered by AMD’s Instinct MI GPUs.

The firm claims it has been quietly running LLMs in production on more than 100 AMD Instinct GPUs for the past year, since before ChatGPT launched. With the LLM Superstation, it’s now inviting more potential customers to run their models on its infrastructure.

These platforms are powered by AMD Instinct MI210 and MI250 accelerators rather than the industry-leading Nvidia H100 GPUs, which remain in short supply. By opting for AMD hardware, Lamini quips, businesses “can stop worrying about the 52-week lead time”.

AMD vs Nvidia GPUs for LLMs

Although Nvidia’s GPUs – including the H100 and A100 – are the ones most commonly used to power LLMs such as ChatGPT, AMD’s own hardware is comparable.

For example, the Instinct MI250 offers up to 362 teraflops of computing power for AI workloads, with the MI250X pushing this to 383 teraflops. The Nvidia A100, by contrast, offers up to 312 teraflops, according to TechRadar Pro sister site Tom’s Hardware.

"Using Lamini software, ROCm has achieved software parity with CUDA for LLMs,” said Lamini CTO Greg Diamos, who is also the cofounder of MLPerf. “We chose the Instinct MI250 as the foundation for Lamini because it runs the biggest models that our customers demand and integrates finetuning optimizations. 

“We use the large HBM capacity (128GB) on MI250 to run bigger models with lower software complexity than clusters of A100s."
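
That parity claim rests largely on the fact that the ROCm build of PyTorch exposes the same torch.cuda device API that CUDA code already targets, so existing LLM code can run on an MI250 with little or no change. The snippet below is a minimal sketch of that idea, assuming a ROCm-enabled PyTorch installation and the Hugging Face transformers library; the model name is an illustrative placeholder, not something Lamini has confirmed it uses.

```python
# Minimal sketch: the same PyTorch code path targets both CUDA (Nvidia) and ROCm (AMD),
# because the ROCm build of PyTorch reuses the torch.cuda namespace via HIP.
# Assumes a ROCm-enabled PyTorch install and the Hugging Face transformers library;
# the model name below is an illustrative placeholder.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda" if torch.cuda.is_available() else "cpu"  # "cuda" maps to an MI250 under ROCm
if device == "cuda":
    print("Running on:", torch.cuda.get_device_name(0))

model_name = "meta-llama/Llama-2-7b-hf"  # placeholder model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16).to(device)

inputs = tokenizer("Enterprise LLMs on AMD Instinct:", return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```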

The Lamini LLM Superstation

(Image credit: Lamini)

AMD’s GPUs can, in theory, certainly compete with Nvidia’s. But the real crux is availability, and systems such as Lamini’s LLM Superstation give enterprises the opportunity to start running workloads immediately.

There’s also a question mark, however, over AMD’s next-in-line GPU, the MI300. Businesses can sample the MI300A now, while the MI300X will begin sampling in the coming months.

According to Tom’s Hardware, the MI300X offers up to 192GB of memory, more than double the 80GB of the H100, although we don’t yet know exactly what its compute performance looks like. Nevertheless, it’s expected to be broadly comparable to the H100. What would give Lamini’s LLM Superstation a real boost is offering infrastructure powered by these next-gen GPUs.
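
To put those memory figures in perspective, the back-of-the-envelope sketch below estimates how many GPUs are needed just to hold a model’s weights in 16-bit precision (2 bytes per parameter), ignoring activations, KV cache and optimizer state. The capacities are the 80GB H100, the 128GB MI250 and the 192GB MI300X discussed above; the parameter counts are illustrative, not tied to any specific Lamini customer model.

```python
import math

# Back-of-the-envelope sketch: GPUs needed just to hold a model's weights in
# 16-bit precision (2 bytes per parameter), ignoring activations, KV cache and
# optimizer state. Capacities are the per-GPU memory figures cited in the article.
GPU_MEMORY_GB = {"Nvidia H100 (80GB)": 80, "AMD MI250 (128GB)": 128, "AMD MI300X (192GB)": 192}

def gpus_needed(params_billions: float, memory_gb: int, bytes_per_param: int = 2) -> int:
    weights_gb = params_billions * bytes_per_param  # e.g. 70B params * 2 bytes = 140GB
    return math.ceil(weights_gb / memory_gb)

for size in (7, 70, 180):  # illustrative model sizes, in billions of parameters
    summary = ", ".join(f"{gpu}: {gpus_needed(size, mem)}" for gpu, mem in GPU_MEMORY_GB.items())
    print(f"{size}B parameters -> {summary}")
```

On that crude measure, a single MI300X could hold a 70-billion-parameter model in 16-bit precision where an 80GB card could not, which is why the memory gap matters as much as raw teraflops.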

Keumars Afifi-Sabet
Channel Editor (Technology), Live Science

Keumars Afifi-Sabet is the Technology Editor for Live Science. He has written for a variety of publications including ITPro, The Week Digital and ComputerActive. He has worked as a technology journalist for more than five years, having previously held the role of features editor with ITPro. In that role, he oversaw the commissioning and publishing of long-form content in areas including AI, cyber security, cloud computing and digital transformation.
