NVIDIA Wants You To Talk To Side Characters In the Virtual World and Have a Customizable Experience
In layman's terms, the AI-powered technology gives life to non-playable characters in a video game: it recognizes the player's speech, understands natural language, replies with text-to-speech, and animates the character's face and gestures from voice audio.
On 29 May, NVIDIA founder and CEO Jensen Huang took centre stage at the Taipei Nangang Exhibition Center to deliver the keynote that kicked off COMPUTEX, the four-day computer expo in Taiwan. "Our first live event in four years! I haven't given a public speech in four years. Wish me luck," Huang opened, drawing chuckles and applause from the audience. "I have a lot to tell you, very little time."
The 60-year-old business magnate had a lot to share about what he and NVIDIA have done and what they plan to do next. Apart from recounting the company's journey in ray tracing and scene rendering, he unveiled two new products: the GeForce RTX 4060 Ti and the DGX GH200. The DGX GH200 is an AI supercomputer that will help tech companies create successors to ChatGPT; notably, Google, Microsoft and Meta are said to be among its first users.
However, what stole the show (and has notably left game developers worried) was the announcement of the NVIDIA Avatar Cloud Engine (ACE) for Games. In layman's terms, the AI-powered technology gives life to non-playable characters: it is designed to recognize the player's speech, understand natural language, generate spoken responses with text-to-speech, and use that voice audio to animate the character's face and gestures.
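Pieced together from that description, the interaction loop roughly chains speech recognition, a language model conditioned on the character's backstory, text-to-speech, and audio-driven facial animation. The sketch below is purely illustrative: the function names (transcribe, generate_reply, synthesize_speech, animate_face) are hypothetical stand-ins for the kinds of services ACE bundles, not NVIDIA's actual API.

```python
# Illustrative sketch of the NPC interaction loop described above.
# All functions are hypothetical stubs, not NVIDIA's ACE API; a real pipeline
# would call speech, language and animation services in their place.

from dataclasses import dataclass


@dataclass
class NPC:
    name: str
    backstory: str      # the "domain knowledge" the character is given
    history: list[str]  # running conversation, so replies stay consistent


def transcribe(player_audio: bytes) -> str:
    """Speech-recognition stub: player's voice -> text."""
    return "Hey Jin, how are you? What's going on around here?"


def generate_reply(npc: NPC, player_text: str) -> str:
    """Language-model stub: reply conditioned on backstory and history."""
    npc.history.append(f"Player: {player_text}")
    reply = f"{npc.name}: (an unscripted reply, in character, drawing on the backstory)"
    npc.history.append(reply)
    return reply


def synthesize_speech(text: str) -> bytes:
    """Text-to-speech stub: NPC reply text -> voice audio."""
    return text.encode()


def animate_face(npc_audio: bytes) -> str:
    """Audio-driven animation stub: voice audio -> facial animation."""
    return f"facial animation driven by {len(npc_audio)} bytes of audio"


def interact(npc: NPC, player_audio: bytes) -> tuple[bytes, str]:
    """One turn of an unscripted conversation with an AI-powered NPC."""
    player_text = transcribe(player_audio)
    reply_text = generate_reply(npc, player_text)
    reply_audio = synthesize_speech(reply_text)
    animation = animate_face(reply_audio)
    return reply_audio, animation


if __name__ == "__main__":
    jin = NPC(name="Jin", backstory="Ramen shop owner.", history=[])
    audio, animation = interact(jin, player_audio=b"...mic capture...")
    print(jin.history)
    print(animation)
```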
Huang played a video in which the player's character (controlled by someone backstage) interacted with Jin, a ramen shop owner. The ensuing conversation reveals a little of Jin's background and sets up the task ahead. You can clearly see the distressed expression on Jin's face, which is impressive considering that neither the dialogue nor the facial animation was authored in advance.
"None of the conversation was scripted. We gave the AI character 'Jin' a backstory…All you have to do is go up and talk to this character. And because this character has been infused with artificial intelligence and large-language models, it can interact with you and understand the meaning..all of the facial animation completely done by the AI. We have made it possible for all kinds of characters to be generated. They have their own domain knowledge, you can customize it. So, everybody's game is different. And look how wonderful they are and how beautiful they are. This is the future of video gaming."
According to Huang, AI will be a very big part of the future of video games.
"Generative AI has the potential to revolutionize the interactivity players can have with game characters and dramatically increase immersion in games. Building on our expertise in AI and decades of experience working with game developers, NVIDIA is spearheading the use of generative AI in games," said John Spitzer, vice president, developer and performance technology, NVIDIA.
"Although he is an NPC, Jin replies to natural language queries realistically and consistent with the narrative backstory — all with the help of generative AI," said the company in its press release.
To deliver this AI-powered demo, called Kairos, NVIDIA partnered with Convai, a startup building conversational AI for virtual worlds.
According to the official website, NVIDIA's generative AI technologies will also feature in GSC Game World's upcoming STALKER 2: Heart of Chernobyl and in Fort Solis.