Live2D Virtual Human for Chatting based on Unity
C# · Updated Nov 13, 2023
Use the NVIDIA Audio2Face headless server and interact with it through its REST API (e.g., via the Python requests library). Generate animation sequences for Unreal Engine 5, Maya, and MetaHumans.
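A minimal sketch of that request-based interaction, assuming the headless server's commonly used default port and typical endpoint paths; the exact routes and payload keys vary by Audio2Face version, and the server's own Swagger UI is the authoritative reference:

    # Sketch of driving an Audio2Face headless server over HTTP with the Python
    # requests library. The base URL, port, endpoint paths, and payload keys are
    # assumptions -- check the server's Swagger docs for your installed version.
    import requests

    A2F_URL = "http://localhost:8011"  # assumed default headless port

    def load_scene(usd_path: str) -> None:
        # Assumed endpoint for loading a USD scene that contains an A2F pipeline.
        r = requests.post(f"{A2F_URL}/A2F/USD/Load", json={"file_name": usd_path})
        r.raise_for_status()

    def set_track_and_play(player_prim: str, wav_path: str) -> None:
        # Assumed endpoints for pointing the audio player at a WAV file and playing it.
        requests.post(f"{A2F_URL}/A2F/Player/SetTrack",
                      json={"a2f_player": player_prim, "file_name": wav_path}).raise_for_status()
        requests.post(f"{A2F_URL}/A2F/Player/Play",
                      json={"a2f_player": player_prim}).raise_for_status()

    if __name__ == "__main__":
        load_scene("/path/to/scene.usd")
        set_track_and_play("/World/audio2face/Player", "/path/to/speech.wav")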
Audio2Face Avatar with Riva SDK functionality
Web interface to convert text to speech and route it to an Audio2Face streaming player.
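A rough sketch of the text-to-speech-to-Audio2Face routing such a setup performs, assuming the nvidia-riva-client package for TTS and a push_audio_track helper adapted from NVIDIA's audio2face_streaming_utils sample script; the server addresses, voice name, and streaming-player prim path are illustrative assumptions:

    # Convert text to speech with Riva and route the audio to an Audio2Face
    # streaming player. push_audio_track is not a pip-installable API: it is a
    # helper from NVIDIA's audio2face_streaming_utils sample, shown here as an
    # assumed import for illustration.
    import numpy as np
    import riva.client
    from audio2face_streaming_utils import push_audio_track  # assumed sample helper

    def speak(text: str) -> None:
        auth = riva.client.Auth(uri="localhost:50051")   # assumed Riva server address
        tts = riva.client.SpeechSynthesisService(auth)
        resp = tts.synthesize(
            text,
            voice_name="English-US.Female-1",            # assumed voice name
            language_code="en-US",
            sample_rate_hz=44100,
        )
        # Riva returns 16-bit PCM bytes; the streaming player expects float32 samples.
        samples = np.frombuffer(resp.audio, dtype=np.int16).astype(np.float32) / 32768.0
        push_audio_track(
            "localhost:50052",                           # assumed Audio2Face gRPC endpoint
            samples,
            44100,
            "/World/audio2face/PlayerStreaming",         # assumed streaming player prim path
        )

    speak("Hello from the web interface.")

A web front end would simply call something like this speak() routine with user-submitted text; the TTS step and the gRPC push are the core of the routing.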
[Gemini API Developer Competition] Real-time conversation with the digital twin of William Shakespeare.
General Purpose 3D Humanoid Simulator integrated with AI Agents.
NVIDIA Audio2Face blendshape implementation in PyTorch, using LSTM and CNN (simplified NvidiaNet) models.
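For illustration, a minimal PyTorch sketch of an LSTM-based audio-to-blendshape regressor in that spirit; the feature dimension, hidden size, and 52-blendshape output are assumptions for the example, not the repository's actual configuration:

    # Minimal audio-to-blendshape regressor: an LSTM over per-frame audio
    # features (e.g., mel or MFCC windows) predicting blendshape weights per frame.
    import torch
    import torch.nn as nn

    class AudioToBlendshapes(nn.Module):
        def __init__(self, feat_dim: int = 80, hidden: int = 256, num_blendshapes: int = 52):
            super().__init__()
            self.lstm = nn.LSTM(feat_dim, hidden, num_layers=2, batch_first=True)
            self.head = nn.Sequential(
                nn.Linear(hidden, hidden),
                nn.ReLU(),
                nn.Linear(hidden, num_blendshapes),
                nn.Sigmoid(),  # keep blendshape weights in [0, 1]
            )

        def forward(self, feats: torch.Tensor) -> torch.Tensor:
            # feats: (batch, frames, feat_dim) -> (batch, frames, num_blendshapes)
            out, _ = self.lstm(feats)
            return self.head(out)

    model = AudioToBlendshapes()
    dummy = torch.randn(1, 100, 80)   # 100 audio frames of 80-dim features
    weights = model(dummy)            # (1, 100, 52) predicted blendshape weights

Per-frame audio features go in, per-frame blendshape weights come out, and those weights can then be retargeted onto a face rig.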
OmniAvatar combines NVIDIA Omniverse Audio2Face and LLMs to create avatars with lifelike expressions and context-aware responses, making real-time virtual communication more immersive and dynamic.