Facial Expressions

This section explains how to integrate and handle facial expressions and lip-sync in your web applications using the convai-web-sdk.

Initialization

To enable facial expressions, initialize the ConvaiClient with the necessary parameters. The enableFacialData flag must be set to true for the client to stream facial expression data.

convaiClient.current = new ConvaiClient({
  apiKey: '<apiKey>',
  characterId: '<characterId>',
  enableAudio: true,
  enableFacialData: true,
  faceModel: 3, // OVR lipsync
});

faceModel: 3 (OVR lip-sync) is the standard, actively maintained face model.

Receiving Viseme Data

Retrieve viseme data by registering the response callback shown below. The example handles incoming audio responses and accumulates facial data as it arrives.

const [facialData, setFacialData] = useState([]);
const facialRef = useRef([]);

convaiClient.current.setResponseCallback((response) => {
  if (response.hasAudioResponse()) {
    const audioResponse = response?.getAudioResponse();
    if (audioResponse?.getVisemesData()?.array[0]) {
      // Viseme data for this audio chunk
      const faceData = audioResponse.getVisemesData().array[0];
      // faceData[0] is the "sil" value; it is -2 when a new chunk of audio is received.
      if (faceData[0] !== -2) {
        facialRef.current.push(faceData);
        setFacialData(facialRef.current);
      }
    }
  }
});
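The chunk-boundary check above can be isolated as a pure helper, which makes it easy to test outside the callback. This is a sketch, not SDK API; `SIL_NEW_CHUNK` and `collectVisemeFrames` are illustrative names:

```javascript
// Sentinel value the SDK emits in the "sil" slot at the start of a new audio chunk
const SIL_NEW_CHUNK = -2;

// Keep only real viseme frames, dropping chunk-boundary markers
function collectVisemeFrames(frames) {
  return frames.filter((faceData) => faceData[0] !== SIL_NEW_CHUNK);
}

const incoming = [
  [-2, 0, 0],      // chunk-boundary marker: skipped
  [0.1, 0.5, 0.2], // real viseme frame: kept
  [0.0, 0.9, 0.1], // real viseme frame: kept
];
console.log(collectVisemeFrames(incoming).length); // 2
```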

Modulating Morph Targets

Utilize the useFrame hook from react-three-fiber to modulate morph targets based on the received facial data.

import { OvrToMorph } from 'convai-web-sdk';

const blendShapeRef = useRef([]);
const currentBlendFrame = useRef(0);

useFrame((state, _delta) => {
  // Initiate blendshapes
  if (client?.facialData.length > 0) {
    // OvrToMorph maps OVR viseme data to Reallusion morph targets
    OvrToMorph(client?.facialData[currentBlendFrame.current], blendShapeRef);
  }
  if (currentBlendFrame.current <= blendShapeRef?.current?.length) {
    // Logic to adjust morph targets based on facial data
    currentBlendFrame.current += 1;
  }
});
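For context on what `OvrToMorph` consumes: each viseme frame is an array of 15 weights in the standard OVR lip-sync viseme order. The SDK handles the Reallusion mapping internally; this sketch only illustrates the frame layout with an assumed helper name, `frameToWeights`:

```javascript
// The standard OVR lip-sync 15-viseme set, in stream order
const OVR_VISEMES = [
  'sil', 'PP', 'FF', 'TH', 'DD', 'kk', 'CH', 'SS',
  'nn', 'RR', 'aa', 'E', 'ih', 'oh', 'ou',
];

// Convert one 15-float viseme frame into a { visemeName: weight } map,
// the shape a morph-target update loop can consume
function frameToWeights(faceData) {
  const weights = {};
  OVR_VISEMES.forEach((name, i) => {
    weights[name] = faceData[i] ?? 0;
  });
  return weights;
}

// A frame dominated by "aa" (open mouth) with a touch of "oh"
const frame = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.8, 0, 0, 0.2, 0];
console.log(frameToWeights(frame).aa); // 0.8
```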

In addition to facial expressions, the convai-web-sdk allows developers to modulate bone adjustments for specific facial features. Receive bone adjustments for "Open_Jaw," "Tongue," and "V_Tongue_Out" and apply them to your character as demonstrated below:


// Example: the current viseme drives "Open_Jaw".
// 1.57 rad is the base rotation for a closed jaw.
const jawRotation = new THREE.Euler(0, 0, 1.57);

// Smooth the jaw motion with THREE.MathUtils.lerp();
// `blend` is the index of the jaw-driving value in the current frame.
jawRotation.z = THREE.MathUtils.lerp(
  jawRotation.z,
  1.57 + blendShapeRef.current[currentBlendFrame.current - 1][blend] * 0.2,
  0.8
);
characterRef.current
  .getObjectByName("CC_Base_JawRoot")
  .setRotationFromEuler(jawRotation);

These code examples are specific to Reallusion characters.
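The lerp step can be checked in isolation. Below is a minimal sketch of the same math with `lerp` reimplemented in plain JavaScript (it behaves identically to `THREE.MathUtils.lerp` for these inputs):

```javascript
// Linear interpolation: a + (b - a) * t, same formula as THREE.MathUtils.lerp
function lerp(a, b, t) {
  return a + (b - a) * t;
}

const CLOSED_JAW = 1.57; // base z-rotation (radians) for a closed jaw
let jawZ = CLOSED_JAW;

// A blend value of 0.5 targets 1.57 + 0.5 * 0.2 = 1.67 rad;
// with t = 0.8 the jaw moves 80% of the way there in one step.
const blendValue = 0.5;
jawZ = lerp(jawZ, CLOSED_JAW + blendValue * 0.2, 0.8);

console.log(jawZ.toFixed(3)); // "1.650"
```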

Handling 100fps Animation

Viseme data arrives at 100 frames per second, while render frame rates vary. Implement throttling using lodash to keep morph-target updates in step with the viseme stream; the example below maintains a consistent update rate.

import _ from 'lodash';

const [tick, setTick] = useState(true);
const rafId = useRef(null);

const updateAnimation = () => {
  // Signal useFrame that a new update is ready to be consumed
  setTick(true);
  rafId.current = requestAnimationFrame(throttledUpdate);
};

// Throttle to one update every 10 ms to match the 100fps viseme stream
const throttledUpdate = _.throttle(updateAnimation, 10);

// Start the animation loop when the component mounts
useEffect(() => {
  rafId.current = requestAnimationFrame(throttledUpdate);
  // Clean up the animation loop when the component unmounts
  return () => {
    throttledUpdate.cancel();
    cancelAnimationFrame(rafId.current);
  };
}, []);

// Use the frame hook to update animations
useFrame((state, _delta) => {
  if (tick) {
    // Logic to get viseme data and alter morph targets accordingly
    setTick(false);
  }
});

Note: lodash's throttle is not perfectly precise, so the update rate can drift from the 100fps viseme rate.

Handling 100fps Edge Cases

Because throttling is imprecise, the blend-frame counter can drift away from the viseme stream. Use the renderer clock as the reference and correct for both above- and below-100fps scenarios using the elapsed time.

const [startClock, setStartClock] = useState(false);

useFrame((state, _delta) => {
  // Reset the clock whenever the character is not talking
  if (!startClock || !client?.isTalking) {
    state.clock.elapsedTime = 0;
    if (startClock) setStartClock(false);
  }
  if (client?.isTalking) {
    setStartClock(true);
  }

  // Handle both above and below 100fps scenarios
  if (startClock) {
    const expectedFrame = Math.floor(state.clock.elapsedTime * 100);
    if (Math.abs(expectedFrame - currentBlendFrame.current) > 15) {
      if (expectedFrame - currentBlendFrame.current > 0) {
        // Below 100fps: pad the blendshape buffer to catch up
        for (let i = 0; i < 15; i++) {
          blendShapeRef.current.push(0);
        }
        currentBlendFrame.current += 15;
      } else {
        // Above 100fps: drop buffered frames to fall back in sync
        if (blendShapeRef.current.length > 15) {
          blendShapeRef.current.splice(-15);
          currentBlendFrame.current -= 15;
        }
      }
    }
  }
});
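The resynchronization rule can be expressed as a pure function, which makes the catch-up and fall-back behavior easy to unit-test outside the render loop. This is a sketch; `driftCorrection` and `DRIFT_TOLERANCE` are illustrative names, not SDK API:

```javascript
// How far (in viseme frames) playback may drift before correcting
const DRIFT_TOLERANCE = 15;

// Given elapsed seconds and the current blend-frame index, return the
// correction to apply: positive = pad frames (render is slow),
// negative = drop frames (render is fast), 0 = within tolerance.
function driftCorrection(elapsedSeconds, currentFrame) {
  const expectedFrame = Math.floor(elapsedSeconds * 100); // 100 viseme frames/sec
  const drift = expectedFrame - currentFrame;
  if (Math.abs(drift) <= DRIFT_TOLERANCE) return 0;
  return drift > 0 ? DRIFT_TOLERANCE : -DRIFT_TOLERANCE;
}

console.log(driftCorrection(0.5, 30)); // expected 50, drift 20 → pad 15
console.log(driftCorrection(0.1, 30)); // expected 10, drift -20 → drop 15
```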

