Adding Lip-Sync to your Character

Learn to add lip sync to your Unity characters using Convai. Improve realism and interactivity.


Last updated 5 months ago


Lip Sync System

Convai sends either visemes or blendshape frames from the back-end, depending on the face model the developer chooses to use. Out of the box, the Convai SDK extracts and parses this data and provides it to the Convai LipSync component, which then relies on the SkinnedMeshRenderer's blendshape effectors and bone effectors to give Convai-powered NPCs realistic lip sync.

Components of LipSync System

Viseme Effector List

This is where the developer tells the Convai SDK how much each value coming from the server affects each index of the blendshape array. To better explain how it works, let's look at a diagram.

Here, the value coming from the server affects the blendshape at index 116 with a 0.2 multiplier and the blendshape at index 114 with a 0.5 multiplier. The engine representation of this looks something like this.

So, you can make your own effector list or use one of the many that we ship with the SDK.
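The mapping described above boils down to a per-index multiplication of the server's viseme value. Here is a minimal sketch of that math in Python; the indices and multipliers are the illustrative ones from the diagram, not values from any shipped preset.

```python
# Sketch of how a viseme effector list maps a single server value onto
# multiple blendshape indices. Indices and multipliers are illustrative.
def apply_viseme(blendshapes, server_value, effectors):
    """effectors: list of (blendshape_index, multiplier) pairs."""
    for index, multiplier in effectors:
        blendshapes[index] = server_value * multiplier
    return blendshapes

# As in the diagram: one viseme value drives index 116 with a 0.2
# multiplier and index 114 with a 0.5 multiplier.
weights = apply_viseme({}, 1.0, [(116, 0.2), (114, 0.5)])
```

In the SDK this weighting is applied to the SkinnedMeshRenderer's blendshape weights rather than a plain dictionary, but the arithmetic is the same.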

How to Create your own Viseme Effector List

Right-click inside the Project panel and go to Create > Convai > Expression > Viseme Skin Effector. This creates a Viseme Effector List scriptable object, in which you can define your own values.

Viseme Bone Effector List

This is where the developer tells the Convai SDK how much each value coming from the server affects the rotation of the bone. To better explain how it works, let's look at a diagram.

Here, the bone's rotation is affected by the values coming from the server multiplied by the values in the effector list. For example, for TH the value affects the bone's rotation with a 0.2 multiplier, and so on. The engine representation of this looks something like this.

So, you can make your own bone effector list or use one of the many that we ship with the SDK.

We use this formula to calculate the rotation:

UpdateJawBoneRotation(
    new Vector3(
        0.0f,
        0.0f,
        -90.0f - CalculateBoneEffect(FacialExpressionData.JawBoneEffector) * 30f
    )
);
UpdateTongueBoneRotation(
    new Vector3(
        0.0f,
        0.0f,
        CalculateBoneEffect(FacialExpressionData.TongueBoneEffector) * 80f - 5f
    )
);
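As a quick numeric check, the formula above can be evaluated outside Unity. In this sketch, `bone_effect` stands in for the result of `CalculateBoneEffect(...)`, which is assumed here to lie in the 0..1 range; this is illustrative math, not SDK code.

```python
# Z-axis rotations from the formula above, with bone_effect assumed 0..1.
def jaw_rotation_z(bone_effect):
    # Sweeps from -90 degrees (effect 0) to -120 degrees (effect 1).
    return -90.0 - bone_effect * 30.0

def tongue_rotation_z(bone_effect):
    # Sweeps from -5 degrees (effect 0) to 75 degrees (effect 1).
    return bone_effect * 80.0 - 5.0

print(jaw_rotation_z(0.0), jaw_rotation_z(1.0))       # -90.0 -120.0
print(tongue_rotation_z(0.0), tongue_rotation_z(1.0))  # -5.0 75.0
```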

How to Create Your Own Viseme Bone Effector List

Right-click inside the Project panel and go to Create > Convai > Expression > Viseme Bone Effector. This creates a Viseme Bone Effector List scriptable object, in which you can define your own values.

Convai Lipsync Component

When you attach this component to your Convai Character, you will see something like this.

Let's go over what these fields are:

  1. Facial Expression Data

    1. Head | Teeth | Tongue

Renderer: The Skinned Mesh Renderer that corresponds to the specified part of the body.

Viseme Effectors List: How the Skinned Mesh Renderer's blendshapes are affected by values coming from the server.

    2. Jaw | Tongue Bone Effector

How much the bone's rotation is affected by values coming from the server.

    3. Jaw | Tongue Bone

Reference to the bone that controls the jaw or tongue, respectively.

  2. Weight Blending Power

The interpolation factor used to blend between two frames in LateUpdate.

  3. Character Emotions

Learn more about character emotions in the Character Emotion section.
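The Weight Blending Power field above can be pictured as a per-update interpolation between the currently displayed frame and the target frame. The sketch below assumes a simple linear interpolation of that kind; the SDK's exact blending curve may differ.

```python
# Sketch of frame-to-frame blending: each update, the displayed weight
# moves toward the target weight by a factor 'blending_power' (0..1),
# which is how an interpolation factor like Weight Blending Power is
# typically used in a LateUpdate loop.
def blend(current, target, blending_power):
    return current + (target - current) * blending_power

w = 0.0
for _ in range(3):          # three update ticks toward a target of 1.0
    w = blend(w, 1.0, 0.5)  # 0.5 -> 0.75 -> 0.875
```

A higher blending power snaps the mouth shapes to their targets more quickly; a lower one produces smoother but laggier transitions.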

Steps to add Lipsync to your Convai Character

  1. Select your Convai-powered character in the hierarchy.

  2. In the Inspector panel, find the ConvaiNPC component; there you will see an Add Component button.

  3. Click on it, select the Convai Lipsync Component, and click Apply.

Alternatively:

  1. Select your Convai-powered character in the hierarchy.

  2. Click on Add Component.

  3. Search for Convai Lipsync.

  4. Select the Convai Lipsync component.

Now you can configure the component to suit your needs or use one of the many presets Convai ships with the SDK.

Your lip-sync component is now ready to use in your application.