Taking AI to the edge for smaller, smarter, and more secure applications

A person holding out their hand with a digital AI symbol.
(Image credit: Shutterstock / LookerStudio)

AI continues to spark debate and demonstrate remarkable value for businesses and consumers. As with many emerging technologies, the spotlight often falls on large-scale, infrastructure-heavy, and power-hungry applications. However, as the use of AI grows, large data centers are putting mounting pressure on the grid, making the most intensive applications far less sustainable and affordable.

As a result, demand is soaring for nimbler, product-centric AI tools. Edge AI is leading this trend by bringing data processing closer to, or embedding it within, devices at the so-called tiny edge, so that basic inference tasks can be performed locally. Because raw data no longer has to be sent off to the cloud via data centers, industrial and consumer applications of AI gain significant security improvements, along with better device performance and efficiency at a fraction of the cost of the cloud.

But, as with any new opportunity, there are fresh challenges. Product developers must now consider how to build the right infrastructure and the required expertise to capitalize on the potential of edge.

Marc Dupaquier

Managing Director Artificial Intelligence Solutions at Supermicro.

The importance of local inference

Taking a step back, we can see that AI largely encompasses two fields: machine learning, where systems learn from data, and neural network computation, a class of models designed to mimic how the human brain processes information. These are complementary ways to program machines, training them to perform a task by feeding them relevant data so that outputs are accurate and reliable. These workloads are typically carried out at huge scale, relying on comprehensive data center installations to function.

For smaller industrial use-cases and consumer applications – whether a smart toaster in your kitchen or an autonomous robot on a factory floor – it is not economically (or environmentally) feasible to push the data and analysis required for AI inference to the cloud.

Instead, with edge AI presenting the opportunity of local inference, ultra-low latency, and smaller transmission loads, we can realize massive improvements to cost and power efficiency while building new AI applications. We are already seeing edge AI contribute towards significant productivity improvements for smart buildings, asset tracking, and industrial applications. For example, industrial sensors can be accelerated with edge AI hardware for quicker fault detection, as well as predictive maintenance capabilities that flag when a device’s condition is deteriorating before a fault occurs.
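To make that concrete, here is a minimal Python sketch of the kind of lightweight check an edge-accelerated sensor might run for predictive maintenance. It is an illustration only: the baseline level, alert factor, and sample windows are hypothetical assumptions, and real firmware on a microcontroller would more likely be written in C.

```python
# Minimal sketch: vibration anomaly scoring for predictive maintenance.
# BASELINE_RMS and ALERT_FACTOR are assumed values for illustration only.
import math

BASELINE_RMS = 0.12      # assumed RMS level observed during normal operation
ALERT_FACTOR = 3.0       # flag windows that exceed 3x the baseline

def rms(samples):
    """Root-mean-square energy of one window of accelerometer samples."""
    return math.sqrt(sum(x * x for x in samples) / len(samples))

def check_window(samples):
    """Return True when vibration energy suggests a developing fault."""
    return rms(samples) > ALERT_FACTOR * BASELINE_RMS

# Example usage with synthetic windows standing in for real sensor reads:
normal_window = [0.1 * math.sin(0.2 * i) for i in range(256)]
faulty_window = [0.6 * math.sin(0.2 * i) for i in range(256)]
print(check_window(normal_window))  # False
print(check_window(faulty_window))  # True
```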

Taking this further, the next generation of hardware products designed for edge AI will introduce specific adaptations for AI sub-systems to be part of the security architecture from the start. This is one area in which embedding the edge AI capability within systems comes to the fore.

Embedding intelligence into the product

The next stage in the evolution of embedded systems is introducing edge AI into the device architecture itself, and from there to the “tiny edge”. This refers to tiny, resource-constrained devices that run AI and ML models directly on the edge, including microcontrollers, low-power processors, and embedded sensors, enabling real-time data processing with minimal power consumption and low latency.

A new class of software and hardware is now emerging on the tiny edge, making it possible to execute AI operations on the device itself. By embedding this capability within the architecture from the start, the ‘signal’ itself becomes the data, rather than resources being wasted transforming and transmitting it. For example, tiny edge sensors can gather data from a device’s environment and leverage an in-chip engine to produce a result. In the case of solar farms, sensors within a solar panel can detect nearby arc faults across power management systems; when extreme voltages occur, the system can automatically trigger a shutdown failsafe and prevent an electrical fire.
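As a rough illustration of that failsafe logic, the Python sketch below polls a voltage reading and opens a relay after several consecutive out-of-range samples. The threshold, sample count, and the sensor and relay functions are all hypothetical placeholders; a production implementation would run as firmware tied to the vendor’s in-chip engine.

```python
# Sketch of the shutdown-failsafe loop: read_voltage and open_relay are
# hypothetical stand-ins for a real sensor driver and relay control.
OVERVOLTAGE_LIMIT_V = 600.0   # assumed trip threshold for illustration
TRIP_SAMPLE_COUNT = 3         # require several consecutive violations

def monitor(read_voltage, open_relay):
    """Poll the sensor and open the relay on sustained extreme voltage."""
    violations = 0
    while True:
        v = read_voltage()
        violations = violations + 1 if v > OVERVOLTAGE_LIMIT_V else 0
        if violations >= TRIP_SAMPLE_COUNT:
            open_relay()              # trigger the failsafe and stop monitoring
            return

# Example with canned readings standing in for a live sensor:
readings = iter([420.0, 430.0, 650.0, 700.0, 720.0])
monitor(lambda: next(readings), lambda: print("relay opened, panel isolated"))
```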

With applications like arc fault detection, battery management, and on-device face or object recognition driving growth in this space, the market for microcontrollers capable of supporting AI on the tiny edge is set to grow at a CAGR of over 100% (according to ABI Research). To realize this potential, more work is needed to bridge the gap between the processing capabilities of cloud-based AI and the targeted applications of devices that work at, or effectively are, the edge.

However, as with any new technology: where there is demand, there is a way.

We are already seeing meaningful R&D results focused on this challenge, and tiny AI is starting to become embedded in all kinds of systems – in some cases, consumers already take this technology for granted, literally talking to devices without thinking ‘this is AI’.

Building edge AI infrastructure

To capitalize on this emerging opportunity, product developers must first consider the quality and type of data that goes into edge devices, as this determines the level of processing and the software and hardware required to handle the workload. This is the key difference between typical edge AI, operating on more powerful hardware capable of handling complex algorithms and datasets, and tiny AI, which focuses on running lightweight models that can perform basic inference tasks.

For example, audio and visual information - especially visual - is extremely complex and needs a deep neural architecture to analyze. On the other hand, it is far less demanding to process data from vibrations or electric current measurements recorded over time, so developers can use tiny AI algorithms to do this within a resource-constrained, ultra-low power, low latency device.
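The contrast can be sketched in a few lines of Python: a handful of cheap summary features extracted from a vibration or current trace can feed a tiny classifier, whereas image or audio data would call for a far deeper network. The features and classifier weights below are illustrative assumptions, not a trained model.

```python
# Sketch of the lightweight path: cheap summary features from a time-series
# signal feeding a tiny linear classifier. The weights and bias below are
# illustrative placeholders, not a trained model.
import math

def extract_features(signal):
    """A few inexpensive features suited to a resource-constrained device."""
    n = len(signal)
    rms = math.sqrt(sum(x * x for x in signal) / n)
    peak = max(abs(x) for x in signal)
    zero_crossings = sum(1 for a, b in zip(signal, signal[1:]) if (a < 0) != (b < 0))
    return [rms, peak, zero_crossings / n]

WEIGHTS = [4.0, 2.0, 1.5]   # hypothetical per-feature weights
BIAS = -2.0                 # hypothetical decision offset

def is_fault(signal):
    """Simple normal-vs-fault decision from the extracted features."""
    score = sum(w * f for w, f in zip(WEIGHTS, extract_features(signal))) + BIAS
    return score > 0.0
```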

It is important to consider the class of device and microcontroller unit needed at the development stage, based on the specific computational power requirements. In many cases, less is more, and running a lighter, tiny AI model improves the power efficiency and battery life of a device. With that said, whether dealing with text or audio-visual information, developers must still undertake pre-processing, feeding large quantities of sample data into learning algorithms to train the AI model.
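One common way to carry out that pre-processing and training step is to train a small model offline and then convert it for a microcontroller runtime. The sketch below assumes TensorFlow and its Lite converter, and uses synthetic stand-in data; other toolchains would work just as well.

```python
# Sketch of training a deliberately small model and converting it for a
# microcontroller runtime, assuming TensorFlow/TensorFlow Lite. The data is
# synthetic stand-in sample data, not a real sensor dataset.
import numpy as np
import tensorflow as tf

# Pre-processed feature vectors (e.g. per-window statistics) and labels.
X = np.random.rand(1000, 3).astype("float32")
y = (X[:, 0] > 0.5).astype("float32")            # toy labelling rule

# A small network sized for a resource-constrained target.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

# Apply default size/latency optimizations and write out a .tflite file that a
# microcontroller runtime such as TensorFlow Lite Micro can load.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
with open("tiny_model.tflite", "wb") as f:
    f.write(converter.convert())
```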

What’s on the horizon?

The development of devices that embed AI into the tiny edge is still in its infancy, meaning there’s scope for businesses to experiment, be creative, and figure out exactly what their success factors are. We are at the beginning of a massive wave, which will accelerate digitalization in every aspect of our life.

The use-cases are vast, from intelligent public infrastructure, such as the sensors required for smart, connected cities, to remote patient monitoring through non-invasive wearables in healthcare. Users are able to improve their lives, and ease daily tasks, without even realizing that AI is the key factor.

The demand is there, with edge AI and tiny AI already transforming product development, redefining what counts as a great piece of technology, and enabling more personalized predictive features, security, and contextual awareness. In just a few years, this type of AI is going to become vital to the everyday utility of most technologies – without it, developers will quickly see their innovations become obsolete.

This is an important step forward, but it doesn’t come without challenges. Overcoming these challenges can only happen through a broader ecosystem of development tools and software resources. It’s just a matter of time. The tiny edge is the linchpin through which society will unlock far greater control and usefulness of its data and environment, leading to a smarter AI-driven future.


