Real-time Lipsync

Integrate real-time facial animation into your React applications.

Supported Formats

ARKit (61 Elements)

Apple's ARKit blendshape format with 52 facial blendshapes + 9 rotation values:

  • 52 standard facial blendshapes (eye, brow, jaw, mouth, cheek, nose)

  • 3 head rotation values (pitch, yaw, roll)

  • 6 eye rotation values (left/right eye gaze)
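For illustration, assuming each frame arrives as a flat 61-element array ordered as listed above (the exact element order is an assumption; check the index mapping for your SDK version), a frame could be split like this:

// Illustrative sketch only: the within-frame ordering is an assumption.
interface ArkitFrame {
  blendshapes: number[];   // 52 facial blendshape weights, typically in [0, 1]
  headRotation: number[];  // 3 values: pitch, yaw, roll
  eyeRotation: number[];   // 6 values: left/right eye gaze
}

function splitArkitFrame(values: number[]): ArkitFrame {
  return {
    blendshapes: values.slice(0, 52),
    headRotation: values.slice(52, 55),
    eyeRotation: values.slice(55, 61),
  };
}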

MetaHuman (251 Elements)

Unreal Engine's MetaHuman format with 251 CTRL_expressions_* blendshapes:

  • Comprehensive facial control (brow, eye, cheek, nose, mouth, jaw)

  • Highly detailed mouth shapes for precise lipsync

  • Industry-standard for high-fidelity character animation
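As a consumption sketch, assuming each frame is a flat 251-element array and that you have the matching list of control names for your SDK version (the name list below is a stand-in, not a confirmed API), values can be keyed by control name for easier lookup:

// Sketch only: MHA_CONTROL_NAMES stands in for the 251 CTRL_expressions_*
// names in the order your SDK emits them; obtain the real list from the SDK.
declare const MHA_CONTROL_NAMES: string[]; // length 251, e.g. 'CTRL_expressions_jawOpen', ...

function keyMetaHumanFrame(values: number[]): Record<string, number> {
  const frame: Record<string, number> = {};
  MHA_CONTROL_NAMES.forEach((name, i) => {
    frame[name] = values[i] ?? 0; // default to 0 if the frame is short
  });
  return frame;
}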

Configuration Options

ConvaiConfig (useConvaiClient)

interface ConvaiConfig {
  // ... other options
  
  /**
   * Enable lipsync/facial animation blendshapes (default: false).
   * When enabled, streams real-time blendshape data at 60fps.
   */
  enableLipsync?: boolean;
  
  /**
   * Blendshape format to receive from server (default: 'mha').
   * 'arkit' - 61 elements (52 blendshapes + 9 rotation values)
   * 'mha' - 251 elements (MetaHuman format)
   */
  blendshapeFormat?: 'arkit' | 'mha';
}

Example with All Options
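
A minimal sketch of a lipsync-enabled config. Import useConvaiClient and ConvaiConfig from the SDK entry point used elsewhere in your project; the connection fields hinted at by "... other options" are only referenced in a comment here, and the hook's return value is assumed to be the client instance referenced in the next section.

// Sketch only: only enableLipsync and blendshapeFormat are shown explicitly;
// fill in your usual connection options (credentials, character selection, etc.).
function Avatar() {
  const config: ConvaiConfig = {
    // ... other options
    enableLipsync: true,        // stream blendshape frames at 60fps
    blendshapeFormat: 'arkit',  // or 'mha' (default, 251-element MetaHuman format)
  };

  const client = useConvaiClient(config);

  // ... render your avatar and drive it from the client's blendshape queue
  return null;
}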

Access Blendshape Queue

The blendshape queue is available on the client instance:
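
As a consumption sketch, assuming the queue is exposed as a client.blendshapeQueue array of frames (an illustrative property name, not a confirmed accessor; check the client API for your SDK version), you could drain one frame per render tick:

// Sketch only: `blendshapeQueue` is an illustrative name. Frames are assumed
// to be numeric arrays in the format selected by `blendshapeFormat`.
function startLipsyncLoop(
  client: { blendshapeQueue: number[][] },
  applyFrame: (frame: number[]) => void
): () => void {
  let rafId = 0;
  const tick = () => {
    const frame = client.blendshapeQueue.shift(); // consume one queued 60fps frame
    if (frame) applyFrame(frame);
    rafId = requestAnimationFrame(tick);
  };
  rafId = requestAnimationFrame(tick);
  return () => cancelAnimationFrame(rafId); // call the returned function to stop
}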
