Nvidia’s AI market dominance: Can anyone mount a serious challenge?

A hand reaching out to touch a futuristic rendering of an AI processor.
(Image credit: Shutterstock / NicoElNino)

No company has profited from the AI boom to quite the same extent as Nvidia. Best known for the semiconductors that power the data centers behind today's AI services, Nvidia is now the world's most valuable company, with its share price up 181% since the start of the year.

This sheer dominance is largely built on Nvidia's position as the undisputed leader in the AI hardware space. Rivals have struggled to match the company's graphics processing units (GPUs), which underpin the widespread adoption of AI tools and lead the field in both versatility and raw performance.

However, despite this recent success, several challenges threaten to chip away at Nvidia's competitive advantage. The cost of producing these GPUs remains very high, while the accelerating use of AI raises environmental concerns, as the underlying technology is highly energy intensive.

With this in mind, are we likely to see Nvidia continue its meteoric rise in the coming months and years? Or will the tech behemoth’s competitors find a way to bridge the gap?

Dorian Maillard

Vice President at DAI Magister.

Segmenting the market

The AI semiconductor market can be split along two lines: training and inference. Training takes place almost exclusively on GPUs in data centers, while inference can run either on servers or on edge devices. As a result, there are essentially three market segments for organizations seeking to gain a foothold in the industry: data center training, server-based inference and edge inference.
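
To make that training/inference split concrete, the minimal PyTorch sketch below (a toy model, purely illustrative and not tied to any vendor's workload) shows why the two phases have such different hardware profiles: training repeats forward and backward passes with gradient updates, while inference is a single gradient-free forward pass.

```python
# Illustrative only: a toy model "trained" as it would be in a data center,
# then frozen for inference as it might run on a server or edge device.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Training step: forward pass, backward pass, gradient bookkeeping, weight update.
# This is the compute- and memory-hungry phase that favors data-center GPUs.
x, y = torch.randn(64, 16), torch.randint(0, 2, (64,))
optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()

# Inference step: a single forward pass with gradients disabled.
# Far lighter, which is why it can move onto servers or low-power edge hardware.
model.eval()
with torch.no_grad():
    prediction = model(torch.randn(1, 16)).argmax(dim=1)
print(prediction.item())
```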

Edge AI inference is driven by the need for enhanced data security, as reducing reliance on cloud servers minimizes the risk of data breaches. Moreover, edge devices offer real-time data processing, minimal latency and greater autonomy, enhancing overall performance.

Cost savings are another significant factor, as reducing dependence on costly cloud services for AI can result in substantial decreases in total cost of ownership. Furthermore, reduced power consumption and carbon emissions from edge devices align with environmental, social and governance (ESG) goals.

Nvidia’s training market monopoly

In the GPU training sector, Nvidia's dominance is overwhelming: it holds a 98% market share, dwarfing rivals such as Google and Intel. This level of success is unlikely to fade anytime soon, thanks to the performance of Nvidia's semiconductors and the tightly integrated ecosystem the company has established.

Essentially, cutting-edge GPUs and comprehensive software support make Nvidia the go-to choice for many data centers and high-performance computing applications. As a result, any potential rival with designs on challenging Nvidia in this sector faces near-insurmountable barriers to entry.

Leveraging the edge inference gap

It is the edge AI inference market that gives challengers the best chance of gaining a foothold in the AI semiconductor industry.

Perhaps the most noteworthy factor here is that emerging companies are championing the deployment of Neural Processing Units (NPUs), a lower-power, more specialized alternative to GPUs. NPUs are engineered to accelerate the processing of AI tasks, including deep learning and inference. They can process large volumes of data in parallel and swiftly execute complex AI algorithms using specialized on-chip memory for efficient data storage and retrieval.

While GPUs possess greater processing power and versatility, NPUs are smaller, less expensive and more energy efficient. Counterintuitively, NPUs can also outperform GPUs in specific AI tasks due to their specialized architecture.
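
As a rough illustration of how models are prepared for this kind of low-power, specialized silicon, the sketch below uses TensorFlow Lite's post-training int8 quantization as a generic stand-in for vendor-specific NPU toolchains; the saved-model path and the random calibration data are hypothetical placeholders, not any particular company's workflow.

```python
# Illustrative sketch: post-training int8 quantization, the kind of step most
# edge-accelerator toolchains expect before a model runs on low-power silicon.
# "my_vision_model" and the random calibration inputs are hypothetical stand-ins.
import numpy as np
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("my_vision_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]

def representative_data():
    # A handful of sample inputs lets the converter calibrate int8 ranges.
    for _ in range(100):
        yield [np.random.rand(1, 224, 224, 3).astype(np.float32)]

converter.representative_dataset = representative_data
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("model_int8.tflite", "wb") as f:
    f.write(tflite_model)  # roughly 4x smaller than float32, ready for an edge accelerator
```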

In addition, the fabless business model adopted by many NPU startups allows them to focus finite resources on research and development, intellectual property and go-to-market strategy, while leveraging capex-heavy foundries such as TSMC and Samsung for the actual chip fabrication.

By focusing on core competencies in chip architecture and driving the advancement of NPUs, these fabless firms are reshaping the AI semiconductor landscape, positioning themselves as pivotal players driving the next wave of technological advancement.

Who is attracting the most attention?

California-based SiMa.ai has raised an impressive $270 million to date; its platform delivers high-performance, low-power machine learning inference for embedded edge applications. Similarly, Etched, another Californian company, builds AI computing hardware specialized for transformers and designed to radically cut LLM inference costs.

Texas-based Mythic is now valued at over $500 million, offering a chip that delivers desktop-GPU-grade compute at a fraction of the cost without compromising performance or accuracy. Another firm garnering significant interest is Quadric, which develops edge processors for on-device AI computing.

Europe has also seen the emergence of promising companies, despite lacking a cohesive AI semiconductor strategy. Axelera AI, a Netherlands-based company whose AI platform accelerates computer vision on edge devices, announced a €63 million Series B raise in late June 2024, led by EICF, Samsung Catalyst and Innovation Industries.

A cost-effective, energy-efficient alternative

The startups listed above have such growth potential because they thrive in areas where Nvidia is vulnerable. NPU edge devices are a viable alternative to GPUs as they address pressing issues of cost, size and energy consumption, with their applications ranging from industrial IoT to autonomous vehicles.

Unseating Nvidia will be no mean feat, but with larger tech giants like Microsoft, AWS and Google actively seeking to develop or acquire AI chip technologies, market consolidation is on the horizon, and it could disrupt the balance of power.


This article was produced as part of TechRadarPro's Expert Insights channel, where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing, find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro

Dorian Maillard, Vice President at DAI Magister.