Nvidia unveils Avatar Cloud Engine for Games at Computex

By newsmagzines

May 29, 2023


At Computex, Nvidia unveiled its Nvidia Avatar Cloud Engine (ACE) for Games, enabling smarter AI-based non-playable characters (NPCs) for gaming applications.

ACE is a custom AI model foundry service that transforms games by bringing intelligence to NPCs through AI-powered natural language interactions.

Developers of middleware, tools and games can use ACE for Games to build and deploy customized speech, conversation and animation AI models in their software and games, Nvidia announced at the big computer trade show in Taiwan.

“Generative AI has the potential to revolutionize the interactivity players can have with game characters and dramatically increase immersion in games,” said John Spitzer, vice president of developer and performance technology at Nvidia, in a statement. “Building on our expertise in AI and decades of experience working with game developers, NVIDIA is spearheading the use of generative AI in games.”

Pioneering generative AI in games

Building on Nvidia’s Omniverse — a metaverse collaboration platform for engineers — ACE for Games delivers optimized AI foundation models for speech, conversation and character animation.

That includes Nvidia NeMo™ — for building, customizing and deploying language models using proprietary data. The large language models can be customized with lore and character backstories, and protected against counterproductive or unsafe conversations via NeMo Guardrails.

It also includes Nvidia Riva for automatic speech recognition and text-to-speech to enable live speech conversation.

And it has Nvidia Omniverse Audio2Face — for instantly creating expressive facial animation of a game character to match any speech track. Audio2Face features Omniverse connectors for Unreal Engine 5, so developers can add facial animation directly to MetaHuman characters.

Developers can integrate the entire Nvidia ACE for Games solution or use only the components they need.
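To make the division of labor among those components concrete, here is a minimal sketch of the kind of pipeline the article describes — player speech in, animated NPC reply out. The function names and data shapes below are illustrative stand-ins, not the real Nvidia Riva, NeMo, or Audio2Face APIs:

```python
# Hypothetical sketch of an ACE-style NPC pipeline. Every function here is a
# placeholder standing in for a real service; none of these names come from
# Nvidia's actual SDKs.

def speech_to_text(audio: bytes) -> str:
    """Stand-in for Riva automatic speech recognition."""
    # A real implementation would stream microphone audio to an ASR service.
    return audio.decode("utf-8")  # pretend the audio is already a transcript

def generate_reply(player_text: str, backstory: str) -> str:
    """Stand-in for a NeMo language model customized with character lore."""
    # A real implementation would prompt an LLM, with NeMo Guardrails
    # filtering unsafe or off-narrative responses.
    return f"[{backstory}] responds to: {player_text}"

def text_to_speech(reply: str) -> bytes:
    """Stand-in for Riva text-to-speech."""
    return reply.encode("utf-8")

def animate_face(speech_audio: bytes) -> dict:
    """Stand-in for Audio2Face, mapping a speech track to facial animation."""
    return {"frames": len(speech_audio), "audio": speech_audio}

def npc_interaction(player_audio: bytes, backstory: str) -> dict:
    """Chain the stages: ASR -> LLM -> TTS -> facial animation."""
    text = speech_to_text(player_audio)
    reply = generate_reply(text, backstory)
    audio = text_to_speech(reply)
    return animate_face(audio)
```

Because each stage is a separate call, a studio could swap in only the pieces it needs — for example, keeping its own dialogue system and using only the animation stage — which matches the modular integration Nvidia describes.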

Kairos demo offers a peek at the future of games

Nvidia collaborated with Convai, a member of Nvidia’s AI-focused Inception startup program, to showcase how developers will soon be able to use Nvidia ACE for Games to build NPCs. Convai, which is focused on developing cutting-edge conversational AI for virtual game worlds, integrated ACE modules into its end-to-end real-time avatar platform.

In a demo called Kairos, players interact with Jin, the purveyor of a ramen shop. Although he is an NPC, Jin replies to natural language queries realistically and consistently with his narrative backstory — all with the help of generative AI. The demo is rendered in Unreal Engine 5 using the latest ray-tracing features and Nvidia DLSS.

“With Nvidia ACE for Games, Convai’s tools can achieve the latency and quality needed to make AI non-playable characters available to nearly every developer in a cost-efficient way,” said Purnendu Mukherjee, founder and CEO at Convai, in a statement.

The neural networks enabling Nvidia ACE for Games are optimized for different capabilities, with various size, performance and quality trade-offs. The ACE for Games foundry service will help developers fine-tune models for their games, which Nvidia can then deploy via Nvidia DGX Cloud, on GeForce RTX PCs, or on premises for real-time inferencing.

The models are optimized for latency — a critical requirement for immersive, responsive interactions in games.

Generative AI to transform the gaming experience

Game developers and startups are already using Nvidia generative AI technologies for their workflows.

For instance, GSC Game World, one of Europe’s leading game developers, is adopting Audio2Face in its upcoming game, S.T.A.L.K.E.R. 2: Heart of Chornobyl.

And Fallen Leaf, an indie game developer, is using Audio2Face for character facial animation in Fort Solis, a third-person sci-fi thriller that takes place on Mars.

Meanwhile, Charisma.ai, a company enabling virtual characters through AI, is leveraging Audio2Face to power the animation in its conversation engine.
