In the AI era, the edge is the new cloud
AI is driving computing towards the edge, says Qualcomm
Over the last decade or so, businesses have migrated more and more workloads from on-premises servers to the cloud, in an effort to capitalize on the flexibility and cost savings on offer.
As a result, the global cloud computing market is set to be worth upwards of $250 billion this year, a large proportion of which will fall into the pockets of hyperscalers such as Amazon Web Services, Microsoft Azure and Google Cloud.
However, various signs suggest the tide is beginning to shift in a different direction, with a larger proportion of computing taking place outside centralized datacenters once again.
According to Mike Vildibill, VP & GM of Cloud Edge AI at semiconductor company Qualcomm, the rise of artificial intelligence (AI) will combine with a number of other factors to push computing back towards the edge of the network, where latency is just as important as raw performance.
“The next mega-trend is now underway,” he told TechRadar Pro. “Previously, we saw a lot of computation moving to the cloud, but a yo-yo effect is creating a need for computation closer to the edge, where both the data and consumers of the data reside.”
“There’s still a need for a centralized cloud, but even the hyperscalers recognize that the cloud is coming to the edge. Instead of residing in some far-flung datacenter, it might be in the trunk of your car, at an intersection, or bolted to the side of a building. That’s the future.”
A new direction
Although Qualcomm made its name in the mobile computing space with its Snapdragon line of chips, which continue to compete at the top of the market, the company recently launched a new line of business that is quickly gaining momentum.
The focus is on building high-performance server chips specifically designed to accelerate AI inference, both in the cloud and at the edge. Manufactured on a 7nm process, the company’s latest Cloud AI 100 accelerators lead the market in both performance density and energy efficiency, per MLPerf benchmarks.
For example, Qualcomm’s Cloud AI 100 Edge Development Kit (AEDK) was found to achieve 240 inferences per second per watt (inf/sec/watt) on ResNet-50, a neural network commonly used to benchmark inference performance. For comparison, Nvidia’s AGX Xavier managed 60 inf/sec/watt, a quarter of that figure.
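To make the inf/sec/watt metric concrete, here is a minimal sketch of how such a throughput figure can be measured. It uses PyTorch’s stock ResNet-50 rather than Qualcomm’s MLPerf submission harness, and the batch size and board-power figure are illustrative assumptions, not measured values.

```python
# Minimal sketch of a ResNet-50 throughput measurement, assuming PyTorch and
# torchvision are installed. This is an illustrative timing loop, not an
# MLPerf-compliant benchmark; the power figure is a hypothetical reading.
import time
import torch
from torchvision.models import resnet50

model = resnet50(weights=None).eval()        # untrained weights are fine for timing
batch = torch.randn(8, 3, 224, 224)          # batch of 8 RGB images, 224x224

with torch.no_grad():
    for _ in range(5):                       # warm-up iterations
        model(batch)
    iterations = 50
    start = time.perf_counter()
    for _ in range(iterations):
        model(batch)
    elapsed = time.perf_counter() - start

inferences_per_sec = iterations * batch.shape[0] / elapsed
board_power_watts = 20.0                     # hypothetical board power reading
print(f"{inferences_per_sec:.1f} inf/sec, "
      f"{inferences_per_sec / board_power_watts:.1f} inf/sec/watt")
```

Dividing measured throughput by measured board power is what yields the inf/sec/watt figures quoted above.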
While the company is working with customers to accelerate inference in a datacenter setting with its Cloud AI 100 platform, Vildibill is most enthusiastic about new opportunities at the edge.
The “poster-child” use case for edge computing, he explained, is autonomous driving, whereby a car performs inference on the data pulled from various cameras and sensors to plot a route without the input of a driver.
If an obstruction suddenly appears on the road (say, a child walks out from behind a parked car), a course correction needs to be calculated almost instantaneously, something only edge computing makes possible.
“The laws of physics dictate that data cannot move quickly enough between the car and a cloud datacenter and back again in sufficient time for disaster to be averted,” said Vildibill. “You need to do the processing closer to where the data resides.”
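A rough back-of-the-envelope calculation, using assumed latency figures rather than measured ones, illustrates the point:

```python
# Back-of-the-envelope sketch (illustrative numbers, not figures from Qualcomm):
# how far a car travels while waiting for a decision from the cloud versus
# from an on-board accelerator.
speed_kmh = 100.0
speed_m_per_s = speed_kmh * 1000 / 3600      # roughly 27.8 metres per second

cloud_round_trip_s = 0.100                   # assumed 100 ms to a remote datacenter and back
edge_inference_s = 0.010                     # assumed 10 ms for on-board inference

print(f"Cloud: {speed_m_per_s * cloud_round_trip_s:.1f} m travelled before a decision arrives")
print(f"Edge:  {speed_m_per_s * edge_inference_s:.1f} m travelled before a decision arrives")
# Roughly 2.8 m versus 0.3 m; at emergency-braking distances, that difference matters.
```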
And this is just one of many examples; Qualcomm says its customers are finding various new use cases for inference at the edge, from monitoring shelf stock in retail stores to checking that factory workers are wearing the necessary protective gear. In conjunction with 5G, edge computing is also enabling a new breed of augmented and virtual reality (AR/VR) applications that wouldn’t otherwise be feasible.
The new emphasis on AI accelerators means Qualcomm has found itself dealing with a brand new class of customer, which includes not only the hyperscalers but any organization interested in deploying AI at the edge. And this strategy appears to be paying off.
According to the company’s latest earnings figures, the IoT segment (which houses the Cloud AI 100 platform) took in $5.1 billion in fiscal 2021, up 67% on the previous year. And Vildibill told us Qualcomm’s efforts in the server chip space are only going to “continue ramping up”.
Performance, but not at any cost
It’s not just the shift towards the edge that Qualcomm is interested in, however. It’s the intersection of this new trend and another: the drive towards sustainable computing. With many companies now committing to ever more ambitious carbon pledges, the ability to run workloads in a sustainable manner has become a top priority.
“A very important element of the puzzle is that it’s not computing at any cost; you’ve got to be able to do this processing efficiently, in a sustainable way,” explained Vildibill.
“What Qualcomm is trying to do is drive more effective, powerful and power efficient means of processing at the edge, which will save not just on the energy bill, but on the carbon footprint too.”
As Qualcomm continues to explore opportunities in the server chip market, the firm is aiming to develop an extensive roadmap of products with power efficiency at their heart, Vildibill says. The company will also continue to enhance its software, in a bid to draw even greater energy efficiency from its current Cloud AI 100 product line.
If Qualcomm is able to unseat Nvidia, the historic leader in AI acceleration, with this focus on maximizing performance per watt, the economic opportunity could be massive. And despite the company’s relative inexperience in the space, Vildibill is confident about its prospects.
“An increased focus on sustainability, the explosion of AI and the shift towards edge computing have come together to create the perfect storm. And we believe we’re in a perfect position, at the perfect time, to address the market,” he said.