
NVIDIA's ACE brings videogame NPCs to life with AI-enhanced dialogue and animation

By The Count - on 31 May 2023, 10:40pm

Traditionally, conversing with non-playable characters (NPCs) in video games has been largely pointless, with NPCs offering only a limited set of pre-scripted lines, if any at all. Game developers tend not to prioritise this aspect, choosing instead to focus on plot-related elements and sparing themselves the resources required to animate NPCs and craft their dialogue. The result is often a lack of depth in the gaming experience, with NPCs that feel unresponsive to the player's actions.

Aiming to address this shortcoming, NVIDIA has introduced its Avatar Cloud Engine (ACE), an innovative tool that leverages generative AI to breathe life into NPCs. ACE holds the potential to transform gaming by enabling natural interactions with NPCs, thereby enhancing the immersiveness of future games.

Jensen Huang, CEO of NVIDIA, presented ACE during Computex, describing it as a unique AI model creation service that can elevate NPCs through AI-driven conversational abilities and animations. A demo video showed the technology in action, with a character engaging in conversation with a ramen shop cook in a cityscape that seems to mirror Cyberpunk 2077's Night City. Even though the dialogue sounds somewhat stilted, the real-time interactive experience on display is quite remarkable.

The company states that the tech demo was built from several technologies that game developers can incorporate to add AI-enabled characters to their future games. First, NVIDIA Riva handled the speech components: automatic speech recognition (ASR) to transcribe the player's spoken words into text, and text-to-speech to voice the NPC's replies. NVIDIA NeMo supplied the large language model (LLM) responsible for tailoring NPC responses to the context of the game environment, guided by developer-defined rules such as language constraints. Lastly, Audio2Face animated the NPC's facial expressions in real time in response to the generated voice. All these components were integrated by Convai, an AI-character startup that emerged from NVIDIA's Inception program, and the final result was rendered in Unreal Engine 5 using its MetaHuman technology.
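To make that flow concrete, here is a minimal sketch of such a pipeline in Python. Every name and interface below (NPCRules, transcribe_speech, generate_reply, and so on) is a hypothetical placeholder for illustration only; the actual Riva, NeMo, and Audio2Face APIs differ.

```python
# Illustrative sketch of an ACE-style NPC conversation pipeline.
# All functions are hypothetical stubs, not NVIDIA's real APIs.
from dataclasses import dataclass

@dataclass
class NPCRules:
    """Developer-set constraints that steer the NPC's replies."""
    persona: str                 # backstory/character fed to the LLM
    banned_topics: tuple = ()    # subjects the NPC must not discuss

def transcribe_speech(audio: bytes) -> str:
    # ASR stage (Riva's role in the demo): player's voice -> text.
    return "What do you recommend tonight?"  # placeholder transcript

def generate_reply(player_text: str, rules: NPCRules) -> str:
    # LLM stage (NeMo's role): context-aware reply, shaped by the rules.
    return f"As the {rules.persona}: try the spicy miso ramen."  # placeholder

def synthesize_speech(reply_text: str) -> bytes:
    # TTS stage (Riva again): reply text -> NPC voice audio.
    return reply_text.encode()  # placeholder "audio"

def animate_face(voice_audio: bytes) -> str:
    # Animation stage (Audio2Face's role): drive facial animation from audio.
    return f"facial animation for {len(voice_audio)} bytes of audio"

def npc_turn(player_audio: bytes, rules: NPCRules):
    """One conversational turn, chaining the four stages in order."""
    text = transcribe_speech(player_audio)
    reply = generate_reply(text, rules)
    voice = synthesize_speech(reply)
    animation = animate_face(voice)
    return voice, animation  # handed to the engine (e.g. UE5) to render

if __name__ == "__main__":
    cook = NPCRules(persona="ramen shop cook", banned_topics=("politics",))
    print(npc_turn(b"\x00\x01", cook))
```

The notable design point is the chaining: each stage's output feeds the next, and for the interaction to feel natural, as in the demo, the whole turn has to complete quickly enough to read as real-time.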

NVIDIA confirms that multiple companies have started integrating this technology into their upcoming games. However, only two games currently in development are named, and each uses just one of these technologies, suggesting that widespread adoption may still be a few years away. The soon-to-be-released S.T.A.L.K.E.R. 2: Heart of Chornobyl plans to use Audio2Face for some of its facial animations, and indie studio Fallen Leaf is employing Audio2Face in its forthcoming third-person adventure game set on Mars.
