NVIDIA’s AI has already changed gaming, and it’s still just heating up
Since the introduction of GeForce RTX 20-Series GPUs and their onboard Tensor Cores, NVIDIA has been pushing the capabilities of AI in gaming forward. More features and more games are taking advantage of AI to render their worlds and improve graphics quality. And with the latest NVIDIA ACE technology, it’s clear NVIDIA plans to make AI even more integral to gaming.
The push into AI kicked off with NVIDIA DLSS (Deep Learning Super Sampling), which lets players enjoy high-fidelity visuals without the frame-rate hit that usually comes alongside them. NVIDIA achieved this by training its model on high-quality, high-resolution frames, so the Tensor Cores can take a low-resolution render of gameplay and reconstruct a higher-resolution image in real time. Now, DLSS multiplies both resolution and frame rate with AI.
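To make the idea concrete, here is a minimal, purely illustrative sketch of how super-resolution training pairs are commonly built: high-quality frames serve as targets, and their downscaled copies stand in for cheap low-resolution renders. This is not NVIDIA’s DLSS pipeline; the directory names and the 4x scale factor are assumptions for illustration.

```python
from pathlib import Path
from PIL import Image

# Conceptual sketch only: building super-resolution training pairs.
# NOT NVIDIA's DLSS pipeline; paths and the 4x factor are illustrative assumptions.
HIGH_RES_DIR = Path("frames/high_res")        # high-quality reference frames (assumed layout)
PAIR_DIR = Path("frames/training_pairs")      # where input/target pairs are written
SCALE = 4                                     # e.g. render at 1/4 resolution, reconstruct full

PAIR_DIR.mkdir(parents=True, exist_ok=True)

for frame_path in sorted(HIGH_RES_DIR.glob("*.png")):
    target = Image.open(frame_path).convert("RGB")
    # The downscaled "input" stands in for a cheap, low-resolution game render.
    low_res = target.resize(
        (target.width // SCALE, target.height // SCALE),
        Image.Resampling.BICUBIC,
    )
    low_res.save(PAIR_DIR / f"{frame_path.stem}_input.png")
    target.save(PAIR_DIR / f"{frame_path.stem}_target.png")
    # A neural upscaler is then trained to map each input back to its target,
    # learning to bridge the gap from low-res gameplay to high-fidelity output.
```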
By the same token, NVIDIA Freestyle and RTX Remix provide tools for improving game visuals with AI. Freestyle lets users apply filters, or even upconvert the rendered output to HDR for non-HDR games, in real time to enhance their experience. Meanwhile, RTX Remix can do for in-game assets what DLSS did for output resolution, letting modders take asset files and intelligently convert them into higher-quality versions to produce quick remasters of classic games.
While DLSS, NVIDIA Freestyle, and RTX Remix all alter the visual experience of games, NVIDIA ACE is set to change the way people interact with games themselves.
NVIDIA ACE puts an AI toolset behind in-game NPCs (non-player characters). Just as the power of large language models has brought chatbots to life in profound ways, NVIDIA ACE can give NPCs a new level of dynamism, letting gamers find more engaging interactions throughout their games.
NVIDIA ACE’s Riva speech and Audio2Face animation models allow seamless interaction between players and NPCs. Riva takes a player’s spoken prompt and transcribes it for the game. The AI-powered NPC responds in natural language, which Audio2Face then processes to sync the in-game character’s facial animation with its speech. Developers can run this whole pipeline, from user input to NPC response, in the cloud or even locally on the user’s PC.
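As a rough sketch of that loop, the snippet below wires three hypothetical stage functions together: speech-to-text (Riva’s role), a language-model reply, and audio-driven facial animation (Audio2Face’s role). The function names and stub bodies are placeholders rather than real Riva or Audio2Face API calls; in practice each stage would wrap the corresponding NVIDIA service, running in the cloud or locally on the player’s PC.

```python
# Illustrative sketch of an ACE-style interaction loop.
# All three stage functions are hypothetical stand-ins, not real NVIDIA APIs.

def transcribe_speech(audio: bytes) -> str:
    """Speech-to-text stage (Riva's role): player's spoken prompt -> text."""
    return "What happened to the old lighthouse keeper?"  # placeholder transcript

def generate_npc_reply(transcript: str, npc_persona: str) -> str:
    """Language-model stage: produce the NPC's reply in natural language."""
    return f"({npc_persona}) They say he vanished the night the lamp went dark..."

def drive_facial_animation(reply_audio: bytes) -> None:
    """Animation stage (Audio2Face's role): sync facial animation to the reply audio."""
    print(f"animating {len(reply_audio)} bytes of NPC speech")

def handle_player_speech(audio: bytes, npc_persona: str = "weary innkeeper") -> str:
    transcript = transcribe_speech(audio)
    reply_text = generate_npc_reply(transcript, npc_persona)
    reply_audio = reply_text.encode("utf-8")  # stand-in for a text-to-speech step
    drive_facial_animation(reply_audio)
    return reply_text

if __name__ == "__main__":
    print(handle_player_speech(b"\x00" * 16000))
```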
NVIDIA NeMo provides pretrained models and frameworks that developers can use to build the language models serving as the brains behind NPCs. Developers can choose from the pretrained Nemotron family of language models and create guardrails (the safety system used to keep a language model on track and appropriate) using NeMo Guardrails. The frameworks can then be used to optimize, fine-tune, and deploy these language models with ease on NVIDIA GPUs in the cloud.
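For the guardrails piece, NeMo Guardrails ships a small Python API for loading a rails configuration and generating guarded responses. The minimal sketch below assumes a local ./npc_guardrails folder containing the usual config.yml (which names the underlying model, such as a Nemotron-family model) and Colang rail definitions; the folder name and its contents are assumptions for illustration.

```python
# Minimal NeMo Guardrails sketch: wrap an NPC's language model with safety rails.
# Assumes ./npc_guardrails holds a config.yml (model choice) plus Colang files
# defining what the NPC may and may not talk about -- folder contents are assumed.
from nemoguardrails import LLMRails, RailsConfig

config = RailsConfig.from_path("./npc_guardrails")  # hypothetical config directory
rails = LLMRails(config)

# Every player prompt is routed through the rails before the NPC answers.
response = rails.generate(messages=[
    {"role": "user", "content": "Tell me about the old lighthouse keeper."}
])
print(response["content"])
```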
As complicated as it might seem, AI has a big role to play in gaming going forward, and NVIDIA ACE is a part of that. To help you make sense of all the latest developments, NVIDIA has introduced the AI Decoded series, which you can check out to stay up to speed on what’s ahead at the intersection of gaming and AI.