Convai Latest Documentation

Get Started

A guided overview of the first steps in Convai Playground, including navigation, character creation, testing, and essential global controls.

Introduction

Welcome to Convai Playground!

Your workspace for creating, customizing, and testing AI-powered characters. This page gives you a high-level overview of the core tools and workflows, helping you get productive quickly. Each section below links to a dedicated page where you can dive deeper.

Prerequisites

  • A Convai account.

Core Concepts

  • Character – An AI persona you create and customize with unique personality traits, language, knowledge, and behavior settings.

  • Avatar Studio – A no-code editor where you can design your character’s visual appearance and configure its Avatar Studio Experience, including environment, animations, interaction settings, and more.

Quick Start Flow

  • Dashboard Overview – Learn the layout, view your recent characters and experiences, and see where to create a new character or Convai Simulation Experience. Continue to: Dashboard Overview

  • Creating a New Character – Start building your AI character by naming it, defining its description, choosing a language and voice, and setting its personality. Continue to: Creating a New Character

  • Testing a Character – Test your character in real time using the Chatbox for text and voice interactions, or via Video Call for a more immersive experience. Continue to: Testing a Character

Character Description

Learn how to use the Character Description page in Convai Playground to define your character’s identity, speaking style, and unique traits.

Introduction

The Character Description page is where you define the personality, backstory, and communication style of your AI character. Each character in Convai Playground has its own dedicated Character Description page, ensuring a unique identity that can be refined over time.


Accessing the Character Description Page

You can reach the Character Description page by clicking any Character Card on your Dashboard. This opens the character’s profile, where you can edit and manage its core attributes.

Main Features and Sections

1. Character Name and ID

  • Character’s Name – Editable field for your character’s display name.

  • Character’s ID – A unique identifier for the character, essential for using it in Convai SDKs and API integrations.

  • You can copy the ID to use in your applications.

Support Tip: If you need help from the support team, provide this ID when reporting character-related issues.
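The Character ID is the handle your own code passes to Convai's SDKs and APIs. As a rough sketch of how it might be wired into a request — the endpoint path, header name, and field names below are illustrative placeholders, not the documented API; see the API Reference for the real ones:

```python
# Illustrative sketch only: the URL, header, and field names are assumptions,
# not Convai's documented API. Check the API Reference for the real shapes.

CHARACTER_ID = "11111111-2222-3333-4444-555555555555"  # example value copied from the page

def build_chat_request(api_key: str, character_id: str, user_text: str,
                       session_id: str = "-1") -> dict:
    """Assemble the pieces a chat request would need (hypothetical shape)."""
    return {
        "url": "https://api.convai.com/character/getResponse",  # placeholder path
        "headers": {"CONVAI-API-KEY": api_key},                 # placeholder header name
        "data": {
            "charID": character_id,   # the ID copied from the Playground
            "sessionID": session_id,  # "-1" to start a fresh session (assumed)
            "userText": user_text,
        },
    }

request = build_chat_request("YOUR_API_KEY", CHARACTER_ID, "Hello!")
print(request["data"]["charID"])
```

The point is simply that the copied ID travels with every request, so the same character definition you edit in the Playground is the one your application talks to.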


2. Core Description / Speaking Style / Embodiment

Core Description

  • Add details about your character’s story, personality traits, distinctive features, and any behavioral guidelines.

  • Word limit: 1000 words.

Speaking Style

  • Describe How the Character Speaks – Outline the character’s tone, pace, formality, and speech patterns. Include unique expressions or phrases they commonly use.

  • Sample Dialogues – Provide example sentences showcasing the character’s typical speech style, including signature phrases that reinforce their personality.

Embodiment

  • Currently in development and will be available soon.


Examples

For inspiration, explore Sample Characters in the Dashboard. These examples show how different characters’ Core Descriptions and Speaking Styles are structured.


Conclusion

The Character Description page is the foundation of your AI character’s identity. By clearly defining its personality, voice, and unique traits, you ensure consistent and engaging interactions.

Character Customization

Learn how to refine your AI character’s personality, appearance, knowledge, and behavior to create consistent and engaging interactions.

Introduction

The Character Customization section in Convai Playground is where you transform a basic AI character into a fully realized persona. Here, you’ll define how your character looks, speaks, thinks, remembers, and responds, ensuring a unique and immersive experience for your users. Each page in this section focuses on a specific customization area, allowing you to work step-by-step or revisit any aspect at any time.


Core Customization Areas

  1. Character Description – Define your character’s backstory, personality, and speaking style — the foundation of its identity. Continue to: Character Description

  2. Avatar Section – Open Avatar Studio to design your character’s visual appearance and configure its Avatar Studio Experience, including environment, animations, outfits, lighting, camera angles, and more. Continue to: Avatar Section

  3. Language And Speech – Configure the language, voice, and tone your character uses for communication. Continue to: Language And Speech


Best Practices

  • Start with core identity settings (Character Description, Language and Speech) before moving to advanced customization.

  • Use Update frequently to save progress and avoid losing changes.

  • Test your character regularly to ensure each customization change has the desired effect.


Next Step

Begin with Character Description to establish the personality and tone of your AI character before moving on to appearance, speech, and advanced behaviors.

Avatar Studio

Learn how to access and customize your character’s avatar in Convai Playground using Avatar Studio.

Introduction

The Avatar section lets you design and customize both the visual appearance of your AI character and its dedicated Avatar Studio Experience. All customization is handled through Avatar Studio, a powerful no-code tool where you can adjust the character’s look, clothing, and animations, as well as personalize the interactive environment it appears in.


Testing a Character

Learn the different ways to test your AI character in Convai Playground, including text chat, voice input, and video call with an avatar.

Introduction

Once you have created a character, it’s important to test how it interacts. Convai provides multiple ways to test your characters — from quick text and voice interactions to fully immersive video calls with custom avatars. This ensures you can refine personality, responsiveness, and interaction style before final deployment.


Dashboard Overview

Learn how to navigate and use the Convai Playground Dashboard to manage characters, experiences, and simulations efficiently.

Introduction

The Convai Playground Dashboard is your central hub for managing AI-powered characters and immersive simulation experiences. From here, you can easily create new characters, set up experiences, and access sample characters created by the Convai team.


Interact in Voice Mode (Beta)

Enable real-time, low-latency voice conversations with your Convai character using Voice Mode for natural, hands-free interactions.

Introduction

Voice Mode allows you to have seamless, natural, and low-friction voice conversations with your character. This guide explains how to set up Voice Mode, select the right interaction method, and maintain stable, real-time sessions for a smooth conversational experience.

Welcome

Start here to create, customize, test, and share interactive AI characters with Convai, covering Playground, no-code experiences, plugins and integrations, and API reference.

Welcome to the Official Convai Documentation

Your platform for building, customizing, and deploying intelligent, interactive AI characters across various environments and platforms.

Whether you’re a developer, designer, or creator, this documentation will guide you through every step, from your first login to building fully immersive experiences with Convai’s powerful tools and integrations.

  • Knowledge Bank Provide your character with information and reference materials to answer questions and maintain context. Continue to: Knowledge Bank

  • Personality Traits Adjust behavioral sliders to shape your character’s mannerisms, confidence, empathy, and other interaction styles. Continue to: Personality Traits

  • Core AI Settings Fine-tune advanced AI parameters to influence decision-making, creativity, and responsiveness. Continue to: Core AI Settings

  • State of Mind Define temporary or situational mindsets that influence how your character reacts in specific contexts. Continue to: State of Mind

  • Memory Review your character’s past conversations or enable Long Term Memory to allow recall across sessions. Continue to: Memory

  • Narrative Design Create structured narratives or guided interaction flows for your character to follow. Continue to: Narrative Design

  • External API Connect your character to external systems or APIs to retrieve live data or perform actions. Continue to: External API

  • Publish Share your character with others or embed it into your website. Continue to: Publish

  • Character Description
    Avatar Section
    Language And Speech
    Accessing Avatar Studio from Convai Playground
    • Open any character in Convai Playground.

    • In the left-hand menu, click the Avatar section.

    • This will launch Avatar Studio, where you can customize your character’s appearance and configure its Avatar Studio Experience, including the environment, animations, outfits, lighting, camera angles, and more.


    Next Step: Customize in Avatar Studio

    For a complete guide to using Avatar Studio and its features, see our dedicated documentation:

    Read the Avatar Studio Documentation


    Conclusion

    The Avatar section serves as a quick link to Avatar Studio, where you can customize both your character’s appearance and its Avatar Studio Experience. Whether you’re creating realistic personas or stylized avatars, Avatar Studio provides tools to fine-tune the environment, animations, outfits, lighting, camera angles, and more—ensuring your character’s visual presence matches its personality and role.

  • Global Character Controls Learn about tools available across all character pages, such as Versioning, Update, and Character Settings. Continue to: Global Character Controls

  • Character Versioning Save and switch between different versions of your character for safe experimentation and iteration. Continue to: Character Versioning

  • Dashboard Overview
    Creating a New Character
    Testing a Character
    Testing Options

    1. Quick Test with Chatbox

    If you want to test your character quickly without an avatar visual, you can use the Chatbox for text and voice interactions.

    • From your Dashboard, click on the character you want to test.

    • In the bottom-right Chatbox:

      • Type a message in the text input and press enter.

      • Or click the microphone button, speak, and click the microphone button again when finished to send your voice input.

    • Continue the conversation to evaluate your character’s responses.

    Additional Chatbox Features:

    • Conversation Starters: At the bottom of the Chatbox, you’ll see dynamically generated conversation starters and quick replies, tailored to the ongoing dialogue.

    • Reset Chat: Located at the top-left of the Chatbox, this button restarts the session.

    • Copy Chat: Found directly below the Reset Chat button, it allows you to copy the conversation text.

    • Temporary Username: The bottom-most button lets you set a temporary username.

    • Feedback Buttons: On the right side of each character response, you’ll see thumbs-up and thumbs-down icons to provide positive or negative feedback about the reply.


    2. Video Call

    You can test your character in a video call for both visual and voice interactions. You have two ways to start a video call:

    From the Dashboard:

    • Locate the character’s card.

    • Click the green camera icon to start a video call instantly.


    From the Character Page:

    • Click the character’s card to open its details.

    • In the top-right section, under the character’s thumbnail, click the video call button.

    This method allows you to experience both the character’s visual appearance and voice in real-time, offering a more immersive test environment.


    Conclusion

    Convai Playground’s testing options make it easy to evaluate and refine your characters. Whether you prefer a fast, text-based interaction or a full video call experience with your custom avatar, you can ensure your AI behaves exactly as intended before sharing it with others.

    Navigating the Dashboard

    Main Dashboard View

    When you log in, the Dashboard displays:

    • Recent Characters – Quickly access and edit your most recently used characters.

    • Recent Simulations – View and manage your latest Convai Sim Experiences.

    • Start a New Simulation – Choose from available scene templates such as Airport, Healthcare, Fire Station, Hotel, Police Station, Restaurant, Fitness, Science Lab and more.

    • Sample Characters – Browse pre-made characters created by the Convai team for quick testing and inspiration.


    Creating New Content

    In the top-right corner, you’ll find:

    • Create a new experience – Start building a new Convai Sim Experience from scratch.

    • Create a new character – Design and customize AI-powered characters for your projects.


    Sidebar Navigation

    On the left sidebar, you can access:

    • Dashboard – Return to the Dashboard

    • My Characters – Access your characters.

    • Create Character – Launch the character creation tool directly.

    • My Experiences – Access and edit your simulation experiences.


    Profile and Settings

    In the top-right profile section:

    • Click your profile name to open a dropdown menu with:

      • My Profile – Manage personal account details.

      • Billing – View usage and update payment information.

    • To the left of your profile name, you will find the API Key access button to get your API Key.


    Top Navigation Bar

    From the top navigation menu, you can directly reach:

    • Playground

    • Documentation

    • Videos

    • Plugins

    • Pricing

    • Contact


    Conclusion

    The Convai Playground Dashboard is designed for efficiency, providing quick access to all tools and resources you need to create and manage AI-driven characters and immersive simulations. Whether you are customizing existing assets or building new experiences from scratch, the intuitive layout ensures a smooth workflow.

    Step-by-Step Guide

    1. Open Your Character

    • From your Dashboard, open the character you want to use with Voice Mode.

    • Navigate to Core AI Settings and ensure you’ve selected a Live Model.

    2. Configure Voice Settings

    • Go to Language and Speech and select a voice for your character.

    Note: Choose a voice other than GCP. GCP voices are not compatible with live models.

    • Once configured, click Update to save your changes.

    3. Using Voice Mode

    • After updating your character, a microphone button will appear next to the chat input field.

    • Click this button to enter Voice Mode, allowing you to talk to your character hands-free in real time.

    • When you exit Voice Mode, the conversation transcript will appear in the chat area for review.

    Note: If you remain idle for more than 5 minutes, the voice session automatically disconnects. Simply reconnect to resume your conversation.


    What You’ll Find Here

    Our documentation is divided into several sections so you can easily find what you need:

    1. Convai Playground

    Learn to create, customize, and test your AI characters directly in Convai Playground.

    • Get Started – Basics of navigating the dashboard, creating your first character, and testing interactions.

    • Character Customization – Deep dive into the tools for defining your character’s appearance, voice, knowledge, traits, and more.

    2. No Code Experiences

    Create interactive AI experiences without writing a single line of code.

    • Avatar Studio Experiences – Create and customize your character’s visual identity, including appearance, clothing, accessories, environment, animations, lighting, camera angles, and more, all within an easy-to-use no-code editor.

    • Convai Sim Experiences – Build interactive, large-scale simulation environments where your AI characters engage in realistic scenarios, navigate spaces, and interact with objects.

    • Convai XR Animation Capture App – Capture high-fidelity motion data using your XR device and apply realistic animations to your avatars for more immersive, lifelike performances.

    3. Plugins & Integrations

    Extend your characters into your applications and games.

    • Unity, Unreal Engine, and Web Plugins.

    • Modding Frameworks and Other Integration options.

    • Convai Pixel Streaming Embed capabilities.

    4. API Reference

    In-depth API documentation for advanced customization and integration.


    Before You Begin

    To start building with Convai, you only need:

    • A Convai account – Sign up here if you don’t have one.


    Getting Help

    • If you can’t find what you’re looking for, use the search bar at the top of the documentation to quickly locate relevant topics.

    • For inspiration, check out the Sample Characters available in your Dashboard.

    • If you need further assistance, visit the Convai Developer Forum to connect with the community and get support from the Convai team.

    Creating a New Character

    Learn how to create and customize a new AI character in Convai Playground, including description, avatar, voice, and language settings.

    Introduction

    The Convai Playground allows you to design AI-powered characters with unique personalities, voices, and visual appearances. This guide will walk you through creating a new character, from initial setup to customization of avatar, voice, and languages.

    Step-by-Step Guide

    1. Access the Creation Tool

    • From your Dashboard, click Create a new character in the top-right corner.

    • A new character creation interface will open.

    • In the left menu, only Character Description, Avatar, and Voice And Languages are active initially. Other sections will unlock after the character is created.


    2. Character Description

    • Character’s Name – Enter the name for your character (you can edit this later).

    • Core Description – Write a short background covering the character’s story, personality traits, and distinctive features.

      • Alternatively, click Generate Core Description to create one automatically.


    3. Avatar Customization

    • Click the Avatar tab in the left menu.

    • Select Configure Avatar to customize your character’s visual appearance.

    • Follow the steps in Avatar Studio to adjust facial features, clothing, and other design elements.


    4. Voice and Language Settings

    • Click the Voice And Languages tab.

    • In Language, select one or more languages your character can speak.

    • In Voice, choose from the filtered voice options for your selected languages.


    5. Finalizing Character Creation

    • Once all desired settings are configured, click Create Character at the bottom right.

    • If you skip customization, a random avatar and voice will be assigned automatically.

    • If you don't choose any language, English will be selected by default.


    After Creation

    When the character is created, additional sections in the left menu become available for deeper customization:

    • Character Description

    • Avatar

    • Language and Speech

    • Knowledge Bank

    Each of these features allows you to enhance and refine your character for more natural, intelligent, and engaging interactions. These are covered in separate documentation.


    Conclusion

    The character creation process in Convai Playground is designed for flexibility: you can launch a character in minutes or spend time refining every detail, whether you start with default settings or fully customize the avatar, voice, and description.

    Character Versioning

    Manage multiple versions of a character and switch between them as required.

    Introduction

    In this section, we look into Character Versioning, i.e., maintaining different states of the character. This lets you preserve a previous stable state before trying out further changes. You can experiment without fear of losing an older state of the character, and if you want to discard the current changes and return to a previous version, you can restore that version and continue working from there. We call these saved states Snapshots of the character.

    We will go over the features and how to use them in this section.

    Language And Speech

    Learn how to configure languages, voices, custom pronunciations, and word recognition for your AI character in Convai Playground.

    Introduction

    The Language and Speech section allows you to define the spoken languages, select a voice, and improve pronunciation and recognition for your AI character. With support for multiple languages and voice providers, you can ensure that your character communicates naturally and effectively with your audience.


    Uploading Avatars

    Note: Currently, only Metahuman and Reallusion avatars are supported for upload.

    Lighting Adjustments

    Set the Right Mood with Lighting

    Lighting plays a key role in how your avatar looks and feels within the environment. It affects not only visibility, but also the overall tone and atmosphere of the scene.

    Here’s how you can adjust lighting for your avatar:

    Introduction

    Introduction to Convai's plugins and integrations. Learn how to enhance your projects with AI.

    Convai provides a variety of plugins and integrations to help integrate conversational AI and avatars into your projects.

    Game Engines

    Installation

    Choose an installation method and add the Convai Unity SDK to your project.

    Introduction

    This page helps you choose the right installation method for your workflow. If you’re starting fresh, we recommend UPM for the smoothest update path.

    Metahuman Avatars
    Reallusion Avatars

    Speaking Style (optional) – Define the way your character speaks, including tone and mannerisms.

  • Visibility Settings – Choose who can interact with your character:

    • Public – Available to anyone on x.convai.com.

    • Unlisted (default) – Only accessible via a direct link.

    • Private – Restricted to you and invited users.

  • Click Save Changes when done.

    You can update any attribute later, so there’s no need to finalize all decisions immediately.

    Personality Traits

  • Core AI Settings

  • State of Mind

  • Narrative Design

  • External API

  • Publish

  • Memory

  • Avatar Studio Documentation
    Web Plugin

    Modding Framework

    Unity Plugin
    Unreal Engine
    PlayCanvas Plugin
    Convai Web SDK
    Modding Framework
    Modding Cyberpunk 2077
    Installation methods
    • UPM (Recommended) Best for most teams. Simple versioning and easy upgrades.

    • Unity Asset Store (Coming soon) Ideal for teams that prefer Asset Store distribution and “My Assets” installs.
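What a UPM install ultimately does is record the package as a dependency in your project's Packages/manifest.json. The sketch below models that edit; the package name and git URL are hypothetical placeholders, not the official Convai ones — use the identifier given on the Install via UPM page.

```python
# Sketch of the manifest.json change a UPM install performs.
# "com.convai.sdk" and the git URL below are hypothetical placeholders.
import json

def add_upm_dependency(manifest: dict, name: str, source: str) -> dict:
    """Return a copy of a manifest.json structure with one more dependency."""
    updated = dict(manifest)
    updated["dependencies"] = {**manifest.get("dependencies", {}), name: source}
    return updated

manifest = {"dependencies": {"com.unity.textmeshpro": "3.0.6"}}
manifest = add_upm_dependency(
    manifest,
    "com.convai.sdk",                               # hypothetical package name
    "https://github.com/example/convai-unity.git",  # hypothetical git URL
)
print(json.dumps(manifest, indent=2))
```

Because the dependency lives in manifest.json, upgrading later is a one-line version or URL change, which is why UPM gives the smoothest update path.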

    Next step

    After installation, continue to Setup → Configure API Key.

    Conclusion

    You’ve selected the best installation path for your project. Proceed with the installation method that matches your workflow, then move on to Setup.

    Need help? For questions, please visit the Convai Developer Forum.

    Install via UPM (Recommended)

    Utilities

    Unity Plugin Utilities - Enhance development with Convai's tools and resources.

    Note: The terms Snapshot and Version are used interchangeably in this text; they refer to the same idea: the state and contents that define the character at a specific point in time.


    Overview

    The character versioning option is available at the top right of the character editor, beside the Update button.

    Once you click it, you will see a list of all your previously saved revisions, ordered by date.

    Character Versioning section. There is no snapshot here yet.

    We will go over the steps of creating and maintaining snapshots from scratch in the next section.


    Create a Version

    Let us start with a character that we already have saved. The data you see when you open a character's details is the Current Snapshot of the character. When you interact with the character, you are essentially referring to all the data in this Current Snapshot.

    1. To create a new version, first open the Character Versioning section and click the + Add Snapshot button at the top.

      Let's create our very first snapshot.
    2. A pop-up appears asking you to give your snapshot a name and a description. Note that Snapshot Name is a required field for creating a new version. Once you have filled in the details, click the Submit button.

      We provide a name and a small description.
    3. You can now see the new version in the list of snapshots. What does this version actually represent? It stores all the data related to the character at that point in time: everything from the character description and embodiment to knowledge bank files, narrative design structure, and other details.


    Restoring a Version

    Suppose you have gone ahead and worked on the character further, but you are unhappy with the results and want to go back to the previous version. This is where you can restore an old snapshot to the current state and work with it again. Here are the steps to follow:

    1. To restore a version, open the Character Versioning section and select the snapshot you want to restore. The Restore Version button below becomes active.

      We will be restoring the data from the very first snapshot.
    2. Once you click the Restore Version button, a pop-up asks whether you want to save the current changes as a new snapshot or discard them. You can store your current changes as a test version and refer back to them later.

      Let's directly restore the data in the snapshot to the Current Snapshot.
    3. For now, we are happy to discard the changes, so we click the Restore button. This brings the data from the selected version into the Current Snapshot of the character.

    4. To keep your current progress instead, you can always Cancel, create a new snapshot with your changes, and then perform the restore.


    Delete a Snapshot

    You can also delete a snapshot that you no longer require. To do that, click the 3-dots next to the corresponding snapshot in the Character Versioning list and select Delete Version.

    Click on the 3-dots beside the snapshot to see all the options.

    Some important points to remember

    At any given point you can interact with the Current Snapshot of the character. If you have any publicly available app that utilises the character, your users will only be able to interact with this current version.

    Note: We are currently working on a feature that lets developers deploy a version separate from the Current Snapshot.

    Main Features

    1. Set Language

    • Choose the languages your character can speak and understand.

    • Supports multilingual characters: select between 1 and 4 languages.

    • Default language: English.

    • Over 65 languages are available.

    • Selecting a language will filter the available voices in the Voice section.


    2. Voice Selection

    • The Voice field provides access to over 1,200 voices in total. When you select a language, the available voices are filtered accordingly, so the number of voices varies by language.

    • Supported voice providers:

      • Google Cloud Platform (GCP)

      • Microsoft Azure

      • OpenAI

      • ElevenLabs

    • Custom Voices can be added through ElevenLabs.

      • See the ElevenLabs custom voice setup guide for details.


    3. Add Custom Pronunciation

    Custom pronunciations help your character pronounce specific words correctly, especially unusual or brand-specific terms.

    • To add:

      • Spelled As – The word as it appears in text.

      • Pronounced As – How it should sound, written phonetically in plain English.

    • Example:

      • Spelled As: convai

      • Pronounced As: convey

    • Case-sensitive: Uppercase and lowercase entries can have different pronunciations.

    Note: Custom pronunciation currently supports English only.
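The Spelled As / Pronounced As pairs behave like a case-sensitive lookup table applied to text before speech synthesis. The sketch below models the rules described above; it is illustrative, not Convai's actual implementation, and the uppercase entry is an invented example.

```python
# Model of the custom-pronunciation rules: case-sensitive word substitution.
# This is an illustrative sketch, not Convai's implementation.

pronunciations = {
    "convai": "convey",   # the example entry from the docs
    "Convai": "CON-vye",  # hypothetical: uppercase entries may differ
}

def apply_pronunciations(text: str) -> str:
    """Replace each word that has a custom pronunciation, case-sensitively."""
    return " ".join(pronunciations.get(word, word) for word in text.split())

print(apply_pronunciations("Say convai out loud"))  # "convai" becomes "convey"
```

Because the lookup is case-sensitive, "convai" and "Convai" are matched independently, which is why uppercase and lowercase entries can carry different pronunciations.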


    4. New Word Recognition

    New Word Recognition improves your character’s ability to understand unique or challenging words in speech input.

    • To add:

      • Spelled As – The correct spelling of the word.

      • Pronounced As – The phonetic pronunciation using simple syllables.

    • Example:

      • Spelled As: Ankur

      • Pronounced As: Ahnkur

    Note: New Word Recognition currently supports English only.


    Conclusion

    The Language and Speech settings provide complete control over how your character communicates, from language selection and voice choice to fine-tuning pronunciation and recognition. These tools help ensure your AI character delivers clear, accurate, and engaging interactions for users.

    • Choose a Lighting Preset – Use the dropdown menu to select from several preset lighting setups.

    • Adjust Lighting Power – Use the power level slider to fine-tune the light.

    Tip: Subtle lighting changes can make a big difference in realism — experiment to see what best fits your character and scene.

    Global Character Controls

    A single reference for the shared toolbar controls available across all character pages in Convai Playground, including Versioning, Update, and Character Settings.

    hashtag
    Introduction

    This page explains the shared controls that appear at the top right of every character page in Convai Playground. You will see the same toolbar on Character Description, Avatar, Language and Speech, Knowledge Bank, Personality Traits, Core AI Settings, State of Mind, Embodied Actions, Narrative Design, External API, Publish, and Memory. Understanding these controls helps you work faster and avoid losing changes.


    hashtag
    Where to find these controls

    Look at the top right of any character screen. You will see:

    • Versioning icon

    • Update button

    • Character Settings menu (three dots)

    These controls behave the same way on every page.


    hashtag
    Controls overview

    hashtag
    1. Character Versioning

    Use Versioning to save and switch between alternative definitions of your character.

    • What it does

      • Saves a named snapshot of your character definition so you can test new ideas without losing a preferred setup.

      • Lets you switch to any saved version and continue editing from there.

    For more information, refer to the Character Versioning documentation.

    hashtag
    2. Update button

    Apply your unsaved changes to the character.

    • States

      • Green: there are unsaved edits. Click Update to save.

      • Gray: everything is saved; there are no pending changes.

    hashtag
    3. Character Settings Menu

    Open the three dots menu to access actions that affect the entire character.

    hashtag
    Clone Character

    Create a duplicate so you can branch work safely.

    • What is copied: all character configuration tabs (e.g., Description, Personality, Languages, etc.) are copied, except the Memory tab.

    • What changes: the clone receives a new Character ID.

    • When to use: large experiments, a staging vs. production split, or A/B variants.

    hashtag
    Share Character

    Let others test your character.

    • What it does: Generates a share link so recipients can interact with the character (e.g., Chatbox or video call) without being able to modify it.

    • How it works: Open the dialog to copy a share link, respecting your current visibility setting (Public, Unlisted, or Private).

    hashtag
    Delete Character

    Permanently remove the character from your account.

    circle-exclamation

    Deletion is permanent and cannot be undone.

    • Checklist before deletion

      • Confirm the character is not used in any live experience.

      • Export or copy any content you may need.

    Knowledge Bank

    Learn how to upload, manage, and connect knowledge files to your AI character using Knowledge Bank.

    hashtag
    Introduction

    The Knowledge Bank is where you store and manage information that your AI character can access during conversations. By uploading documents or adding text directly, you can give your character specific domain knowledge, enabling more accurate, relevant, and context-aware responses.

    All files uploaded to your Knowledge Bank are linked to your Convai account and can be connected to any of your characters. This makes it an essential tool for training characters to respond with company-specific, product-specific, or topic-specific information.


    hashtag
    Knowledge Bank Sections

    hashtag
    1. My Documents

    • Displays all files uploaded to your account.

    • Information shown:

      • Name – File name.


    hashtag
    2. Upload Knowledge

    • Upload .txt files from your computer.

    • Currently, only the .txt file format is supported.

    • Once uploaded, files are stored in your account’s Knowledge Bank for use with any character.


    hashtag
    3. Add Knowledge

    • Create a new file by entering plain text directly into the editor.

    • Name the file and save it in .txt format.

    circle-check

    Refresh the page periodically during the learning phase to check whether the file status has changed to “Available.”


    hashtag
    Using the Knowledge Bank with Your Character

    hashtag
    Example

    For this example, we uploaded a file named Employee Onboarding Guide.txt containing a list of first-week onboarding steps.

    hashtag
    Testing Without Connecting the File

    • Open the Chatbox.

    • Ask: “I’m a new hire. What should I do during my first week here?”

    • Result: The character responds using its general personality and AI model knowledge, not the uploaded file.

    hashtag
    Testing by Connecting the File

    • Go to Knowledge Bank → My Documents.

    • Click Connect on the file.

    • In the Chatbox, click Reset Chat (top left) to start a new session.

    Result: This time, the character’s response is based on the exact steps provided in the Employee Onboarding Guide file.
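
    The difference between the two tests can be pictured as a simple retrieval step: when a relevant connected document exists, the answer is grounded in it; otherwise the character falls back to general model knowledge. The Python sketch below illustrates only that idea; the function and keyword matching are hypothetical, not part of the Convai API or its actual retrieval logic:

```python
# Toy sketch of connected-knowledge grounding (hypothetical helper,
# not the Convai API or its real retrieval logic).
def answer(question, connected_docs):
    # Keep only longer words as rough keywords for matching.
    keywords = [w.lower().strip("?.,!") for w in question.split() if len(w) > 3]
    for name, content in connected_docs.items():
        if any(k in content.lower() for k in keywords):
            return f"Based on '{name}': {content}"
    return "General answer from the model's own knowledge."

docs = {"Employee Onboarding Guide.txt": "Week 1: complete HR forms, meet your team."}
print(answer("What should I do during my first week here?", docs))
```

    With an empty `connected_docs` dictionary, the same question falls through to the general-knowledge response, mirroring the "without connecting the file" test above.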


    circle-info

    Always reset the chat session after connecting a new knowledge file so the latest data is used.

    circle-info

    The total storage size for uploaded files depends on your Convai subscription plan. See the subscription plans page for limits.


    hashtag
    Conclusion

    The Knowledge Bank is a powerful way to give your characters precise and reliable information. By connecting domain-specific documents, you ensure that your AI not only has personality but also the expertise to answer questions with accuracy and authority.

    State Of Mind

    Learn how the State of Mind feature visualizes your AI character’s emotional state in real time during conversations.

    hashtag
    Introduction

    The State of Mind section provides a visual representation of your AI character’s current emotional state. This dynamic emotional map helps you understand how your character is responding internally during a conversation, allowing for fine-tuning of its personality and interaction style.


    hashtag
    How It Works

    • The State of Mind interface displays a color-coded emotion wheel.

    • Each segment represents a specific emotion such as Joy, Anger, Trust, Fear, Surprise, Sadness, Disgust, and Anticipation, along with nuanced variations like Serenity, Rage, Admiration, and Amazement.

    • Active emotions — those the character is currently experiencing — are highlighted more brightly on the graph.


    hashtag
    Practical Use Cases

    • Character Testing: Observe real-time emotional responses to verify that the character reacts as intended.

    • Personality Tuning: Adjust personality traits in the Personality Traits section and see how they influence emotional patterns.

    • Storytelling & Roleplay: Ensure emotional consistency in interactive narratives.


    hashtag
    Conclusion

    The State of Mind feature offers valuable insights into your AI character’s emotional behavior. By monitoring these live emotional changes, you can ensure your character responds in a way that aligns with its designed personality and intended use case.

    Memory

    Learn how to use the Memory feature to review past sessions, manage conversation history, and enable long-term memory for your character.

    hashtag
    Introduction

    The Memory section lets you review conversation history for a character and decide whether it should remember information between sessions. Use it to audit interactions, troubleshoot issues, and enable persistent preferences.

    hashtag
    Recent Memory

    This tab lists all previous sessions with your character. For each session, you can view:

    • Date – The date when the session occurred.

    • Time – The session’s start time, shown in the UTC time zone.

    • Session ID – A unique identifier for that session.

    circle-info

    If you experience any issues in a session, the support team may request the Session ID so they can investigate in detail.

    Available Actions:

    • View conversation: Click the downward arrow to expand and see the conversation log for that session.

    • Copy or download: Use the three-dot menu on the right to copy or download the session data.


    hashtag
    Memory Settings

    This tab allows you to enable or disable Long Term Memory.

    When Long Term Memory is enabled, your character can remember preferences, choices, and facts from previous sessions. For example: If you tell your character “My favorite color is blue” in one session, and later in a different session ask “What’s my favorite color?”, the character will respond with “blue.”

    When disabled, the character will not retain information between sessions.
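
    As a mental model, the toggle behaves like a session store that only persists facts when Long Term Memory is on. The class below is a toy Python illustration of that behavior, not how Convai actually stores memory:

```python
# Toy model of the Long Term Memory toggle (illustration only,
# not Convai's storage implementation).
class Character:
    def __init__(self, long_term_memory=False):
        self.long_term_memory = long_term_memory
        self.persistent_facts = {}   # survives across sessions
        self.session_facts = {}      # cleared on every new session

    def start_new_session(self):
        self.session_facts = {}

    def remember(self, key, value):
        self.session_facts[key] = value
        if self.long_term_memory:
            self.persistent_facts[key] = value

    def recall(self, key):
        # Prefer the current session; fall back to persisted facts.
        return self.session_facts.get(key) or self.persistent_facts.get(key)

c = Character(long_term_memory=True)
c.remember("favorite color", "blue")
c.start_new_session()
print(c.recall("favorite color"))  # blue
```

    Starting a new session clears the short-term facts; only the persistent store survives, and only for facts learned while the toggle was on.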


    hashtag
    Conclusion

    The Memory feature provides powerful control over how your character interacts with you over time. Use Recent Memory to inspect and share specific sessions, and adjust Memory Settings to decide whether your character should retain knowledge across conversations.

    Publish

    Learn how to publish and share your Convai Experience with the public, selected users, or embed it on your own website.

    hashtag
    Introduction

    The Publish page allows you to share your fully created and customized Convai Experience with the world or with a selected group of people. From here, you can configure the title, description, thumbnail, and visibility settings for your experience, as well as generate links for sharing.


    hashtag
    Publishing Options and Visibility Settings

    The Details tab contains all the essential settings for publishing your experience:

    • Experience Link – A direct link to your experience for easy sharing.

    • Experience Name – The display name for your published experience.

    • Experience Description – A short summary describing your experience.


    hashtag
    Embed Experience

    The "Embed Experience" tab allows you to embed Convai Experiences directly into your own website. This feature makes it easy to integrate interactive experiences into custom platforms or applications.

    triangle-exclamation

    Convai Pixel Streaming Embed is currently accessible only with the Professional Plan and above.


    hashtag
    Conclusion

    The Publish page provides all the tools you need to control how your experience is shared, whether you want it available to the public, only to select individuals, or embedded directly into your website. By choosing the right visibility settings, you can ensure your experience reaches the right audience in the right way.

    Customizing Your Avatar

    Learn how to visually and behaviorally personalize your Convai avatar using the Avatar Studio configurator.

    hashtag
    Overview

    Once your character is created, you can start customizing how they look, move, and interact using the Avatar Studio.

    hashtag
    What You Can Customize

    Here’s what you can do inside the Avatar Studio:

    • Choose a Sample Avatar Pick from a library of ready-to-use, high-fidelity avatars.

    • Customize Appearance Modify facial features, clothing, hairstyles, and other visual elements to reflect your character’s identity.

    • Upload Your Own Avatars Prefer a unique design? Upload your own 3D avatar models for full control over their look.

    After you finish customizing, simply save and publish your avatar to bring it into your character’s conversations.

    Configure Avatar

    Learn how to choose, customize, or upload avatars in Avatar Studio.

    Once you enter the Avatar Studio, you can define exactly how your avatar looks and behaves. Here’s how to get started:

    hashtag
    Choosing a Sample Avatar

    You can start by selecting a high-quality Sample Avatar from Convai’s library. These are designed to cover a wide range of character types and use cases.

    • Browse the available avatars by scrolling through the list.

    • Click on the one you want to use.

    • That avatar will be instantly applied to your character.

    hashtag
    Creating a Custom Avatar

    If you want something more unique, you can create a custom avatar using the built-in editor.

    1. Go to the “Craft your own” tab.

    2. Click the plus icon (+) to start creating a new custom avatar.

    3. Give your avatar a name.

    This allows for highly personalized avatars that align with your branding and narrative needs.

    hashtag
    Uploading Your Own Avatar

    If you'd like to upload a custom avatar model, please follow our detailed avatar upload guide.

    Face Filter

    Use the Face Filter feature to make your avatar resemble a specific person based on a photo.

    triangle-exclamation

    The Face Filter feature is available only with the Scale plan and above.

    hashtag
    What is Face Filter?

    Face Filter allows you to personalize your avatar’s appearance to look like a specific person using a reference photo.

    hashtag
    How to Use It?

    hashtag
    1. Enable Face Filter

    Toggle on the Face Filter option inside the avatar customization panel.

    hashtag
    2. Upload an Image

    Click “Upload your own image” to add a photo reference.

    hashtag
    3. Apply the Image

    Select the uploaded image by clicking on it. Your avatar’s face will automatically morph to resemble the person in the photo.

    hashtag
    4. Manage Images

    • To delete an image, click on it and select “Delete image”.

    • You can upload multiple images to try different looks.

    With Face Filter, you can achieve even more lifelike, personalized characters — perfect for storytelling, training simulations, or representing real individuals in virtual settings.

    Environment

    Choose from immersive 3D and Solid environments to place your avatar in the right setting.

    hashtag
    Bring Your Character to Life with the Right Setting

    Selecting an environment helps anchor your avatar in a scene that matches your use case — whether it’s professional, playful, or futuristic.

    You can choose from a variety of immersive 3D and Solid environments, including:

    • A sleek, modern office

    • A sci-fi room with futuristic vibes

    • A warm and inviting cozy lounge

    • A minimal and practical kiosk-style backdrop

    These environments serve as the visual context for your avatar’s interactions, making conversations feel more realistic and engaging for your audience.

    circle-check

    Match the environment with the personality or purpose of your character.

    Convai Sim Experiences

    Create AI-powered avatars and deploy them in interactive 3D environments—directly from your browser.

    hashtag
    Introduction

    Convai Sim is a browser-based platform that allows you to instantly create and deploy AI-powered avatars in interactive 3D environments— no downloads, and no complex setup required.

    Designed for creators, educators, and developers, it enables rapid prototyping and deployment of lifelike characters inside rich, responsive scenes.


    hashtag
    What You Can Do

    With Convai Sim, you can:

    • Add one or more AI-powered avatars into a 3D scene

    • Set up real-time interactions using voice or text

    • Deploy avatars with smart navigation and context-aware behaviors


    hashtag
    Who It’s For

    Convai Sim is perfect for:

    • Educators and trainers building interactive simulations or learning environments

    • Storytellers and creators wanting to bring characters to life in immersive scenes

    • Game developers prototyping scenarios and NPC interactions

    Whether you're designing a futuristic training program or a playful game level, Convai Sim makes it easy to bring intelligence and interactivity to 3D worlds.


    hashtag
    Key Features

    • Browser-Based Platform No installations — launch and edit in-browser.

    • Multi-Avatar Support Add and manage multiple intelligent characters in a single scene.

    • High-Quality Visuals Use expressive avatars for rich storytelling and realistic simulation.

    Avatar Customization

    Fine-tune your deployed avatar’s appearance, size, and position within your 3D simulation scene

    hashtag
    Refine Your Avatar for a Perfect Fit

    Once you've placed your avatar into the scene, it's time to customize its model, pose, and placement to match your simulation's tone and context.


    hashtag
    Customizing Your Avatar

    Follow these steps to adjust your avatar visually using built-in tools:

    hashtag
    1. Select and Open the Character Tools

    • Click directly on your avatar in the scene.

    • This will open the transform tools.

    hashtag
    2. Customize Position, Rotation, and Scale

    You can adjust your avatar’s placement and appearance using either visual tools or precise numeric fields:

    hashtag
    Option A – Use Transform Gizmos

    • Move Tool: Drag the avatar along the XYZ axes.

    • Rotate Tool: Use the blue ring to turn the avatar’s facing direction.

    • Scale Tool: Resize the avatar by dragging the top cube handle.

    hashtag
    Option B – Use the Edit & Publish Panel

    • When the avatar is selected, the Edit & Publish panel appears.

    • Manually enter values for:

      • Position

    This option is ideal when you need precise alignment, consistency across avatars, or exact placement within complex scenes.


    With both intuitive drag-and-drop controls and precision inputs, customizing your avatar's presence in the scene is flexible and efficient.

    Next up: Let’s bring your avatar to life with tour-guide behaviors and intelligent interactions!

    Convai XR Animation Capture App Setup

    Learn how to install and connect the Convai XR Animation Capture App on your Meta Quest headset to start recording avatar animations in VR.

    hashtag
    Requirements

    Before you begin, make sure you have the following:

    • Meta Quest 2 / 3 / Pro

    • A registered Convai account

    • A stable internet connection


    hashtag
    Installation Steps

    hashtag
    Step 1: Install the App on your Quest device

    1. Put on your Meta Quest headset.

    2. Open the Meta Quest Store.

    3. Search for "Convai Animation Capture".

    4. Download and install the app.


    hashtag
    Step 2: Log In to Your Convai Account

    1. Launch the Convai Animation Capture app on your headset.

    2. When prompted, log in to your Convai account.


    hashtag
    You're Ready to Animate!

    After completing the steps above, your setup is complete. You can now begin recording animations directly in VR, which your AI avatars can intelligently perform in simulations, scenes, or guided experiences within Convai Sim.

    Getting Started

    Get the Convai Unity SDK installed, configured, and verified with a quick conversation test.

    hashtag
    Introduction

    The Convai Unity SDK lets you bring real-time conversational AI into your Unity projects—ideal for NPC dialogue, voice interactions, and interactive character experiences.

    This Getting Started section is designed to guide you from a fresh Unity project to a successful “first conversation” using either a sample scene or your own custom scene.

    hashtag
    Overview

    Use these pages depending on where you are in the process:

    • Installation

      • UPM (Recommended) — easiest updates and dependency management

      • Unity Asset Store — Asset Store based distribution workflow (Coming soon)

    hashtag
    What’s next

    If this is your first time installing Convai in Unity:

    1. Start with Install via UPM

    2. Continue to Configure API Key

    3. Validate with Sample Scenes or Custom Scene Setup

    hashtag
    Conclusion

    You now have a clear path to install and validate the Convai Unity SDK. Start with Installation, then move to Setup, and finish with a quick test conversation.

    circle-info

    Need help? For questions, please reach out to the Convai support team.

    Install via UPM (Recommended)

    Install the Convai Unity SDK via the Unity Package Manager using the package name.

    hashtag
    Introduction

    UPM installation is the recommended approach because it’s easy to maintain, update, and keep consistent across a team.

    hashtag
    Prerequisites

    • A Unity project opened in the Unity Editor

    hashtag
    Step-by-step

    1

    hashtag
    Open Package Manager

    In Unity, go to Window → Package Manager.


    hashtag
    Troubleshooting

    • Console errors after install

      • Confirm you are using a supported Unity version.

    hashtag
    Conclusion

    You’ve installed the Convai Unity SDK via UPM and confirmed the editor compiled successfully. Next, go to Setup → Configure API Key.

    circle-info

    Need help? For questions, please reach out to the Convai support team.

    Setup

    Configure credentials and choose how you want to test Convai: samples or your own scene.

    hashtag
    Introduction

    After installation, you’ll configure your Convai API key and then validate the integration using either:

    • Convai’s Sample Scenes, or

    • Your own Custom Scene Setup

    hashtag
    Overview

    • Configure API Key (required)

    • Import & Run Sample Scenes (fastest validation)

    • Custom Scene Setup (integrate into your scene)

    • Add Chat UI (optional, text input + transcript)

    • Add Lip Sync to Your Character (optional, real-time facial animation)

    hashtag
    Recommended path

    1. Configure API Key

    2. Import and run a sample scene

    3. (Optional) Set up a custom scene

    4. (Optional) Add Chat UI

    hashtag
    Conclusion

    You’re ready to configure your project and run your first conversation test. Start with Configure API Key.

    circle-info

    Need help? For questions, please reach out to the Convai support team.

    Disable Assembly Validation

    If you ever get an error that looks like this, disable Assembly Version Validation in Project Settings > Player > Other Settings.

    Assembly 'Assets/Convai/Plugins/Grpc.Core.Api/lib/net45/Grpc.Core.Api.dll' will not be loaded due to errors: 
    Grpc.Core.Api references strong named System.Memory Assembly references: 4.0.1.1 Found in project: 4.0.1.2.

    Ensure that Assembly Validation is disabled in Project Settings > Player > Other Settings.

    Restarting the Unity project after unchecking the box should fix the issue.

    Animations have Facial Blendshapes

    Resolve facial blendshape issues in Unity animations with Convai. Improve character realism.

    If a character's lip-sync is either not visible or very faint, it could be a result of the character's animations overriding the blendshape changes made by the script. We recommend deleting the relevant blendshape curves in the animation dopesheet.

    The blendshapes live in the CC_Base_Body object's Skinned Mesh Renderer; these are the entries to delete.

    Default Animations Incompatibility

    Fix default animation incompatibilities in Unity with Convai. Ensure smooth AI character animations.

    If the default animations that ship with the animator look broken (for example, the hands intersect the body), the wrong animation avatar is likely selected.

    You can easily fix that by heading to the character's Animator component and assigning the correct avatar to the Avatar field.

    For male avatars
    For female avatars

    The correct animation will look something like this. The hands should not intersect the body.

    Adding Scene Reference and Point-At Crosshairs

    You can point at Interactable Objects and Characters and ask your characters about them.

    To enable this, simply drag and drop the Convai Crosshair Canvas prefab into the scene.

    Pre-Requisites

    Review the prerequisites for integrating Convai with Unity. Ensure seamless setup and functionality.

    hashtag
    Unity Version

    circle-exclamation

    The Convai Unity SDK requires Unity 2022.3.x or later.

    circle-info

    You should have Git installed locally on your system.

    hashtag
    Skills and Knowledge

    Before integrating the Convai SDK, you should be comfortable with the following:

    1. Importing Packages: Know how to import external packages into a Unity project.

    2. Unity Editor: Be proficient in navigating the Unity Editor interface.

    3. Animations: Understand how to add and handle animations for assets.

    Having these skills will ensure a smooth integration and optimal use of the Convai Unity SDK in your projects.

    Personality Traits

    Learn how to customize your AI character’s personality using presets or manual trait adjustments.

    hashtag
    Introduction

    The Personality Traits section defines how your AI character behaves, interacts, and responds during conversations. By adjusting personality parameters, you can align the character’s behavior with its intended role, making interactions more engaging and consistent.


    Avatar Studio Experiences

    Create intelligent 3D AI avatars directly in your browser — no downloads, no code, fully customizable.

    hashtag
    Introduction

    Convai’s Avatar Studio is a user-friendly platform that allows anyone to create intelligent, high-quality 3D conversational avatars — right from your web browser.


    Animation & Expression Settings

    Customize your avatar’s expressiveness with facial and body animations, emotions, and intelligent actions.

    hashtag
    Make Your Avatar Come Alive

    Convai’s Avatar Studio lets you fine-tune how expressive your avatar is — from subtle facial expressions to full-body gestures and smart actions.

    hashtag

    Publishing an Experience

    Learn how to publish and share your customized avatar experience for use across web, kiosks, apps, and more.

    hashtag
    Ready to Share Your Experience with the World?

    Once your character and avatar setup is complete, you can publish your experience directly from the Convai Character Creator dashboard.

    circle-check

    Creating Your AI Simulation with Convai Sim

    Bring your Convai characters to life by placing them into 3D interactive environments using Convai Sim

    hashtag
    Introduction

    Now that you’ve created a Convai character, it’s time to place them into a 3D simulation. With Convai Sim, you can bring characters to life inside immersive environments—fully interactive and embodied in high-quality avatars.


    Publishing an Experience

    Learn how to finalize and publish your AI simulation or tour guide experience created with Convai Sim

    hashtag
    Make Your Experience Live

    Once you’ve finished building your AI simulation or virtual tour, Convai Sim makes it easy to publish and share your experience across platforms.

    Convai XR Animation Capture App

    Capture animations in VR using your Meta Quest and animate AI avatars—no mocap suit required.

    hashtag
    Introduction

    The Convai XR Animation Capture app allows you to record high-quality animations directly in virtual reality using a Meta Quest headset. These animations can be uploaded to your Convai account and used seamlessly across platforms like Unity, Unreal Engine, or within no-code tools like Avatar Studio and Convai Sim.


    Adding Your Recorded Animations to AI Avatars Inside Unity

    Learn how to import animations recorded in VR and apply them to your AI avatars in Unity.

    hashtag
    Overview

    Bring your Convai avatars to life inside Unity by integrating animations recorded via the Convai XR Animation Capture App. This guide walks you through importing those animations and attaching them to AI-powered characters in your Unity project.

    Downloads

    Download Convai tools for Unity. Access the latest plugins and updates for AI integration.

    Version
    Features
    Download Link

    Creating a Convai Powered Scene from Template

    This guide will help you create a Unity scene with Convai Essentials already present, so you can get started with our plugin quickly.

    hashtag
    Step 1) Open the New Scene window

    You can open the New Scene window in two ways: press Ctrl + N on Windows or Cmd + N on Mac, or navigate to File -> New Scene.

    Player Data Container

    All the information that Convai SDK needs from the player to work properly

    This is a ScriptableObject that is created automatically when you enter Play mode in the Editor with the Convai SDK installed, in a scene where the Convai Base Scene Essentials prefab is present.

    Default Player Name

    You can provide a default name of your players.

    Player Name

    The current name of your player. If you use our settings panel, it is kept updated automatically out of the box; if you are using custom logic, it is your responsibility to keep it updated, as our transcript UI uses this name in the UI.

    Speaker ID

    Unity Plugin (Beta) Overview

    Discover the all-new Convai Unity Plugin Beta — redesigned from the ground up for faster, more immersive, and hands-free AI character experiences in Unity.

    hashtag
    Introduction

    The Convai Unity Plugin (Beta) marks a major leap forward in how developers can bring conversational AI to life inside Unity. Built entirely from the ground up with a new backend and plugin infrastructure, this release delivers a faster, lighter, and more powerful experience for real-time character interactions.

    Every aspect of the plugin has been re-engineered based on extensive developer feedback from our previous version — focusing on performance, ease of use, and seamless integration with modern Unity workflows.


    Language Support

    Convai offers comprehensive transcript and voice support for a wide range of languages. To facilitate seamless integration, our Unity plugin comes with a custom TextMeshPro (TMP) package, which includes essential fonts and required settings for major languages.

    circle-exclamation

    This requires TMP Essentials to be pre-installed, which can be done through Window > TextMeshPro > Import TMP Essential Resources or through the prompt shown when starting the project.

    Microphone Permission Issues

    Resolve microphone permission issues in Unity with Convai. Ensure smooth voice interactions.

    If you see the microphone indicator turn on in the top-left corner but no user transcript appears in the chat UI, and the character's response doesn't seem coherent with what you said, it is likely that the game or Unity is not accessing the correct microphone or does not have sufficient microphone permissions. To fix this, please follow along.

    Narrative Design Keys

    This guide shows how to dynamically pass variables to the Narrative Design section and triggers.

    We will create a simple scenario where the character welcomes the player and asks them about their evening or morning based on the player's time of day.

    hashtag
    Step 1

    Activate the Narrative Design for your character in the Playground. Then, create a new Section.
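
    The logic we want the section to express can be sketched as follows. The `{time_of_day}` placeholder syntax and the helper functions here are assumptions for illustration only, not necessarily how Convai's Narrative Design keys are written:

```python
# Sketch of the scenario: pick a greeting variable from the player's
# local time and substitute it into a section template.
# The {time_of_day} placeholder syntax is illustrative only.
from datetime import datetime

def time_of_day(hour):
    # Treat 5:00-11:59 as morning; everything else as evening.
    return "morning" if 5 <= hour < 12 else "evening"

def fill_section(template, variables):
    return template.format(**variables)

template = "Welcome, traveler! How is your {time_of_day} going?"
print(fill_section(template, {"time_of_day": time_of_day(datetime.now().hour)}))
```

    The key point is that the variable is computed on the player's side at runtime and substituted into the section text before the character speaks it.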

    Jaw Bone in Avatar is not Free

    Fix jaw bone issues in Unity avatars with Convai. Ensure smooth lip sync and animations.

    If the Lip Sync does not seem to cause any facial animations, even after removing all blendshapes from animations, then the following steps should help resolve the issue.

    circle-info

    This is a known issue in Reallusion CC4 characters.

    Select the Character and head to the Animator component.

    Dynamic Information Context

    The Dynamic Information feature enables you to pass variables to NPCs in real time, allowing them to react dynamically to changes in the game environment. This can include the player’s current health, inventory items, or contextual world information, greatly enhancing interactivity and immersion.

    hashtag
    Step-by-Step Guide to Setting Up Dynamic Config

    First, add the Dynamic Info Controller component to your Convai NPC.
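
    To picture what this component feeds the NPC, think of it as rebuilding a small context string from live game state whenever that state changes. The sketch below is language-agnostic pseudostructure in Python with made-up field names; the actual component is a C# MonoBehaviour in the Unity SDK:

```python
# Illustration of dynamic context: rebuild a context string from live
# game state so the NPC always sees current values. The field names
# (health, inventory, location) are made up for this example.
def build_dynamic_context(state):
    return (
        f"Player health: {state['health']}/100. "
        f"Inventory: {', '.join(state['inventory']) or 'empty'}. "
        f"Location: {state['location']}."
    )

state = {"health": 72, "inventory": ["torch", "rope"], "location": "east tower"}
print(build_dynamic_context(state))
# Player health: 72/100. Inventory: torch, rope. Location: east tower.
```

    Regenerating this string each time the state changes is what lets the NPC react to the player's current health, items, or surroundings instead of a stale snapshot.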

    Unity Plugin

    Integrate advanced conversational AI to create intelligent, interactive NPCs for your games.

    hashtag
    Overview

    Convai's Unity SDK provides you with all the tools you need to integrate conversational AI into your Unity projects. Convai offers specialized NLP-based services to build intelligent NPCs for your games and virtual worlds. Our platform is designed to seamlessly integrate with your game development workflow, enhancing the interactivity and depth of your virtual environments.

    Configure API Key

    Add your Convai API key in Unity to enable the SDK.

    hashtag
    Introduction

    The SDK needs your Convai API key to authenticate requests and enable character conversations.

    hashtag

    Limitations of WebGL Plugin

    Understand the limitations of the WebGL plugin for Unity with Convai. Optimize your development.

    hashtag
    Size Constraints

    iOS browsers impose strict limitations on the size of WebGL builds. These constraints are primarily due to:

    • Memory Limits: iOS devices have limited available memory for web applications, which can affect the performance and feasibility of running large WebGL builds.

    Building For Supported Platforms

With Convai's Unity SDK, you can build your application for several platforms, including Windows, macOS, and Android. We also support these platforms:

    Easily position characters using drag-and-drop scene editing
  • Instantly publish your experience for testing or deployment

  • Run everything directly in your browser.

  • Enterprises creating training, onboarding, or customer-facing virtual flows
  • Tourism and museum teams looking for guided, avatar-led experiences

  • Instant Deployment Launch scenes immediately and preview interactions with one click.
  • Intelligent Navigation Characters move contextually, ideal for tour guide or training scenarios.

  • Interactive Scene Editing Easily arrange avatars and elements using drag-and-drop tools.

  • Versatile Use Cases Perfect for education, training, tourism, gaming, and more.

  • Programming in C#: Have a basic experience programming Unity scripts in C#.
  • Script Integration: Be capable of adding scripts to a game object.

  • Building and Deployment: Know how to build and deploy an application to your chosen platform.

  • hashtag
    What’s New

    This Beta introduces a wide range of improvements designed to make AI character integration smoother, faster, and more natural than ever before:

    • Hands-Free Conversations — Enjoy uninterrupted, natural dialogue without manual push-to-talk inputs.

    • Low Response Time — Experience significantly reduced latency for more fluid and realistic exchanges.

    • Voice Activity Detection (VAD) — Automatically detect when a user is speaking, creating smoother conversational flow.

    • New Convai Plugin Architecture — Optimized for scalability, extensibility, and future updates.

    • Lightweight Package Size — The plugin dynamically fetches cloud resources as needed, keeping your project lean.

    Together, these updates make building intelligent, interactive worlds with Convai characters easier and more efficient than ever before.


    hashtag
    Beta Release

    This is the Beta release of the new Convai Unity Plugin. We’ll be rolling out frequent updates to improve stability, performance, and feature coverage as we move toward the full release.

    Your feedback plays a critical role in shaping this development. We encourage you to share your thoughts, experiences, and suggestions directly on the Convai Developer Forumarrow-up-right.


    hashtag
    Conclusion

    The Convai Unity Plugin (Beta) represents the next evolution of AI-driven interactivity in Unity — blending natural voice, low-latency responses, and seamless integration into one unified framework. Start exploring, experiment with new features, and help us shape the future of interactive AI experiences.

    Browser Storage Quotas: Safari and other iOS browsers restrict the amount of data that can be stored locally. This includes caching and Indexed DB, which are often used to store assets for WebGL builds.

    hashtag
    Key Limitations

    • Maximum Downloadable Asset Size: iOS browsers may restrict the size of individual downloadable assets. Large assets might fail to load, causing the application to break.

    • Total Build Size: The total size of all assets combined should ideally be kept under 50-100 MB for smooth performance. Exceeding this limit can lead to crashes or extremely slow loading times.

    • Memory Usage: iOS devices typically have less RAM available compared to desktop environments. High memory usage by WebGL builds can result in frequent browser crashes.

    hashtag
    Browser Compatibility

    • Safari: The default browser on iOS, Safari, is generally the best option for WebGL builds, but it still has significant limitations compared to other desktop browsers.

    (Optional) Add Lip Sync to Your Character

    Configure API Keychevron-right
    Import & Run Sample Sceneschevron-right
    Custom Scene Setupchevron-right
    Add Chat UI (Transcript UI)chevron-right
    Convai Developer Forumarrow-up-right

    Augmented Reality (Android/iOS)

  • WebGL

  • Universal macOS applications

  • iOS/iPadOS
    Virtual Reality
    Mixed Reality
    hashtag
Check the Notification System page to learn more about various microphone-related issues
    The microphone icon on the top left is on, indicating that mic is listening
    Select proper microphone device from the drop down list
    Click on Record to make sure microphone is recording properly
    Press Stop to listen to the audio recorded through the selected mic
    The snapshot appears in the list
    ElevenLabs Voice Integration Documentationarrow-up-right

    Emotions change dynamically based on the context, tone, and content of the conversation.

    Typical uses
    • Keep a stable production version while experimenting with a new Core Description or personality.

    • Prepare variations for different audiences or channels.

  • Good practice

    • Give versions clear names and short notes such as “v1.2 retail tone” or “v2.0 multi language test”.

    • Save a version before major edits or before handing the character to a teammate.

  • : everything is already saved.
  • Important

    • If you refresh or navigate away while the button is green, your unsaved changes will be lost.

    • Click Update after edits on any tab, then proceed to testing.

  • Consider cloning for archival instead of deleting.
    Character Versioning
    Size – File size.
  • Status – Indicates if the file is available.

  • Actions available:

    • Connect – Attach the file to a character.

    • Disconnect – Remove the file from a character.

    • Edit – Modify the file content.

    • Download – Save the file locally.

    • Delete – Remove the file permanently.

  • Ask the same question again.
    Pricingarrow-up-right
    Thumbnail – By default, this uses the selected environment’s image, but you can upload a custom thumbnail.
  • Visibility Settings – Controls who can see and interact with your experience:

    • Public – Your experience can be discovered and accessed by anyone on x.convai.com.

    • Unlisted – Only users with the direct link can access the experience.

    • Private – The experience is restricted to invited users only.

      • When set to Private, the Share Privately button becomes active. Here you can:

        • Enter the email address of the person you wish to invite.

  • Set Up Intelligent Animations Configure gestures like waving, nodding, reacting, and thinking — all triggered contextually during conversations.

  • Select a Virtual Environment Place your avatar in immersive digital scenes that match the tone and use case of your experience.

  • Adjust Interaction Behavior Fine-tune how your avatar communicates — such as their speaking style, tone, and user engagement preferences.

  • Device & Branding Adaptation Customize how the avatar interface behaves across different devices and align it with your brand’s visual identity.

  • Customize features such as:
    • Facial features

    • Hairstyle & hair color

    • Skin tone & texture

    • Outfits & accessories

    • Age appearance

  • Under the “Brand” section:

    • Use sliders to apply your logo on supported clothing items.

    • Make sure “Display logo on cloth” is enabled under Interface Settings.

  • Uploading Avatarschevron-right
    Rotation
  • Scale

  • Select the app and click Install.
    Sign up herearrow-up-right

    Setup

    • Configure API Key

    • Import & Run Sample Scenes

    • Custom Scene Setup

    • Add Chat UI (Transcript UI)

    • Add Lip Sync to Your Character

    Optionally add Chat UI
  • Optionally add Lip Sync for real-time facial animation

  • Convai Developer Forumarrow-up-right
    hashtag
    Add the package from Git URL
    • Click the + button (top-left).

    • Select Install package by name

    • Copy the package name below and paste it into the Package Name field. Then click Install.

      • com.convai.convai-sdk-for-unity

    3

    hashtag
    Verify the installation

    • Wait for Unity to finish importing and compiling.

    • Open Console (if needed): Ctrl + Shift + C (Windows) / Cmd + Shift + C (macOS)

    • Expected result: No errors in Console, and a Convai menu appears in the top toolbar.

    Convai Developer Forumarrow-up-right
    hashtag
    Preset Personality Styles

    At the top of the page, you’ll find a dropdown menu containing predefined personality presets:

    • Adventurous Thinker

    • Friendly Optimist

    • Harmonious Empath

    • Analytical Perfectionist

    • Curious Mediator

    • Energetic Dreamer

    • Social Adventurer

    • Compassionate Idealist

    Selecting a preset automatically adjusts the character’s personality traits to match the chosen style.


    hashtag
    Customizing Personality Traits

    If you prefer full control, you can manually adjust the vertical sliders for each personality dimension:

    1. Openness

      • High value: Likes exploring and trying new things.

      • Low value: Prefers stability and routine.

    2. Meticulousness

      • High value: Pays great attention to detail.

      • Low value: More relaxed and spontaneous.

    3. Extraversion

      • High value: Outgoing and sociable.

      • Low value: Reserved and introverted.

    4. Agreeableness

      • High value: Cooperative and empathetic.

      • Low value: More competitive and independent.

    5. Sensitivity

      • High value: Highly emotional and expressive.

      • Low value: Rarely emotional or reserved.

    Each slider ranges from 0 to 4, allowing precise adjustments to match your character’s personality profile.


    hashtag
    Visual Personality Map

    Below the sliders, a radar chart displays a visual representation of the character’s personality. This helps you see how each trait contributes to the overall personality balance.


    hashtag
    Conclusion

    The Personality Traits section gives you the flexibility to either choose from predefined styles or fine-tune individual traits to create a personality that matches your vision. By combining these settings with your character’s description and voice, you can create truly distinctive AI personas.

    hashtag
    What You Can Do

    With a simple interface, you can design and deploy fully interactive avatars that:

    • Speak and respond via voice and text

    • Perform intelligent animations

    • Adapt to different virtual environments

    • Are fully customizable

    • Optionally use vision-based input to "see" the user and react with natural, personalized responses, enhancing realism


    hashtag
    Who It’s For

    Avatar Studio is perfect for:

    • Creators and developers building digital characters

    • Educators creating engaging learning experiences

    • Brands looking to enhance digital events

    • Game designers needing lifelike NPCs

    • Anyone interested in AI-powered interactive storytelling

    Whether you're creating an NPC for a game or a digital host for a virtual event, Convai’s Avatar Studio helps you bring your characters to life—quickly and easily.


    hashtag
    Key Features

    • Conversational AI Avatars engage in natural, human-like voice or text conversations.

    • High-Quality Metahuman NPCs Realistic 3D Metahuman avatars with high-fidelity lip-sync, natural eye-blinking, and intelligent animations.

    • Runs Entirely on the Browser No downloads, installations, or GPU power needed — just open and start creating.

    • Intelligent Actions and Animations Avatars react with gestures such as waving, thinking, and expressing emotions based on the conversation context.

    • Proactive & Agentic AI Characters Characters can initiate conversations and act autonomously in response to their environment.

    • Vision-Based Interaction Avatars can perceive users via camera input and respond with contextually appropriate and human-like reactions.

    • High-Quality Backgrounds Choose from immersive environments to place and enhance your characters.

    • Total Customization Fully personalize the avatar’s appearance, voice, actions, environments, branding and more.

    Facial & Body Animation

    Use sliders to define the intensity of animations:

    • Facial Animation

      • Range: -1 to 1

      • Lower values result in minimal expressiveness, while higher values make your avatar more emotionally responsive.

    • Body Animation

      • Range: Low to High

      • A low setting keeps the avatar more static, while high adds dynamic hand and body movements for livelier interaction.

    hashtag
    Initial Facial Expression

    • Camera Focus Toggle Enable or disable eye contact with the user by toggling camera focus.

    You can define how your avatar appears at the start of an interaction:

    • Enable or Lock Expressions Use toggles to either:

      • Allow expressions to change during conversation

      • Lock the avatar into a specific expression

    • Select an expression from the dropdown:

      • Joy

      • Trust

      • Fear

      • Surprise

    hashtag
    Custom Actions

    Give your avatar intelligent behaviors during interactions — like waving hello or thinking.

    How to Add a Custom Action:

    1. Click “Add a new action”.

    2. Toggle Eye Focus on or off.

    3. Click “Select animation”.

    4. Choose from available animations (e.g., Wave Animation for greeting).

    5. Name your action (e.g., Waves Cheerfully).

    6. Click “Preview Animation” to test how it looks.

    Before You Begin: Make sure your avatar is saved and the character is created before navigating to the Publish tab.

    hashtag
    Publishing Steps

    hashtag
    1. Go to the Publish tab inside your character’s dashboard.

    hashtag
    2. Finalizing Your Experience

    Fill in the necessary details to define and present your simulation:

    • Experience Name e.g., Virtual Tour of the Fire Station

    • Experience Description e.g., Get a deeper look and understanding of the inner workings of a fire station with your virtual tour guide Lina!

    • Thumbnail (Optional) Upload an image to visually represent your experience.

    hashtag
    3. Choose Visibility Settings

    Select how and with whom the experience should be shared:

    • Public

      • Visible to everyone

      • Accessible on x.convai.comarrow-up-right

    • Private

      • Only visible to you and invited users

    • Unlisted

      • Not listed publicly, but can be accessed via a direct link

    • Embed on Your Site (Enterprise-only)

      • Publish your experience directly to your own website

    triangle-exclamation

    Convai Pixel Streaming Embed is currently accessible only with the Enterprise plan. To learn how to embed an avatar into your own platform, check out the Embedding Documentation.


    hashtag
    After Publishing

    Once published, your experience is ready to be deployed on:

    • Websites

    • Applications

    • Kiosk systems

    • Any supported digital platform


    Convai Pixel Streaming Embedchevron-right
    hashtag
    Getting Started

    Follow this step-by-step guide to launch your first AI-powered simulation using Convai Sim.

    hashtag
    1. Access the Playground

    Go to convai.comarrow-up-right and log into your account. Navigate to the Playground section from the dashboard.

    hashtag
    2. Create a New Experience

    Click on “Create a new experience” to begin setting up your simulation.

    hashtag
    3. Choose an Environment

    Select an environment that fits your use case (e.g., office, museum, sci-fi room). Then click “Start Experience” to enter the Convai Sim.

    hashtag
    4. Explore the Scene

    Once the scene loads:

    • Use WASD keys to move around.

    • Use your mouse to look around the environment — just like in a first-person game.

    hashtag
    5. Add an Avatar

    Click the top-left icon to open the avatar menu. Then:

    • Click “Add Avatar”.

    • A hologram will appear — place it at the desired location in the scene.

    hashtag
    6. Select Your Character and Avatar

    You’ll be prompted to:

    • Choose your previously created Convai character.

    • Select a Metahuman avatar to visually represent that character.

    hashtag
    7. Deploy the Character

    Click “Deploy Character” to spawn the avatar into the environment. The avatar will now be active and ready to interact.

    hashtag
    8. Add More Avatars (Optional)

    Repeat the process to add multiple characters into the same scene and create more dynamic simulations.


    hashtag
    Summary

    You’ve now:

    • Created a Convai character

    • Selected a 3D environment

    • Embodied your character in a lifelike avatar

    • Brought them into an interactive simulation

    With multi-avatar support, you can quickly build rich, AI-driven experiences—from training simulations and virtual tours to interactive stories and games.

    Next, we’ll explore how to customize your avatars and scenes using the available tools.

    hashtag
    Publishing Steps

    hashtag
    1. Finalizing Your Experience

    Fill in the necessary details to define and present your simulation:

    • Experience Name e.g., Virtual Tour of the Fire Station

    • Experience Description e.g., Get a deeper look and understanding of the inner workings of a fire station with your virtual tour guide Lina!

    • Thumbnail (Optional) Upload an image to visually represent your experience.

    hashtag
    2. Choose Visibility Settings

    Select how and with whom the experience should be shared:

    • Public

      • Visible to everyone

      • Accessible on x.convai.comarrow-up-right

    • Private

      • Only visible to you and invited users

    • Unlisted

      • Not listed publicly, but can be accessed via a direct link

    • Embed on Your Site (Enterprise-only)

      • Publish your experience directly to your own website

    triangle-exclamation

    Convai Pixel Streaming Embed is currently accessible only with the Enterprise plan. To learn how to embed an avatar into your own platform, check out the Embedding Documentation.


    hashtag
    What Happens After Publishing?

    Once published, your experience becomes:

    • Accessible to your intended audience

    • Ready for interaction via web, kiosk, or internal use

    • Shareable as a training tool, educational demo, or digital showcase

    Whether you're running a public-facing simulation or a private module for internal teams, Convai Sim gives you complete control over how your AI-driven experience is distributed.

    hashtag
    What You Can Do

    With the Convai XR Animation Capture App, you can:

    • Record natural animations in VR using your Meta Quest

    • Upload animations directly to your Convai account

    • Assign these animations to AI avatars, which perform them intelligently during conversation

    • Use animations in:

      • Unity

      • Unreal Engine

      • Convai Sim

    • Build custom gesture libraries and animation sets → All without the need for mocap suits or external trackers


    hashtag
    Who It’s For

    This app is ideal for:

    • Developers & creators building immersive and interactive characters

    • Educators & trainers crafting virtual learning environments

    • Game designers enhancing NPC realism in Unity or Unreal

    • Brands & marketers creating engaging virtual hosts with Avatar Studio

    • Storytellers & world-builders designing no-code simulations with Convai Sim

    Whether you’re building a virtual assistant, NPC, tour guide, or performer — XR Animation Capture helps you bring your AI characters to life with natural, human motion.


    hashtag
    Key Features

    • VR-Based Animation

      Record gestures, motions, and actions naturally with your Meta Quest headset.

    • Direct Upload to Convai

      Animations are automatically synced to your Convai account—no manual transfer needed.

    • Cross-Platform Support

      Use animations in Unity, Unreal Engine, Convai Sim, and Avatar Studio — no extra setup required.

    • AI-Driven Animation Triggers

      Let your avatars perform animations intelligently based on dialogue and context.

    • No Mocap Suit Needed

      Capture high-quality animation using just your VR headset — no external trackers or suits required.

    • Works with No-Code Tools

      Deploy intelligent, animated avatars directly in browser-based platforms like Avatar Studio and Convai Sim.

    hashtag
    How to Add Recorded Animations to AI Avatars in Unity

    hashtag
    Step 1: Set Up Unity & Convai

    Before importing animations, ensure your Unity project is correctly set up with Convai:

    • Install the Convai Unity SDK.

    • Retrieve your API key from the Convai Playground.

    • Add the API key to your project settings.

    circle-check

    Need help setting up the SDK? Check out our Unity SDK Documentationarrow-up-right or follow the videoarrow-up-right walkthrough above.


    hashtag
    Step 2: Import the Animation

    1. Go to the Convai Dashboard and navigate to the Server Animations tab.

    2. Locate the animations you recorded in VR.

    3. Click Import and select a location within your Unity project's Assets/ directory.

    circle-exclamation

    The files must be placed inside the Unity project folder for them to be detected and used properly.


    hashtag
    Step 3: Apply the Animation to a Character

    1. In Unity, drag your AI character model into the scene hierarchy.

    2. Adjust the character’s position if needed.

    3. Open or create an Animator Controller.

    4. Drag the imported animation clip into the Animator Controller.

    5. If the animation should repeat, enable Loop Time in the Animation settings.


    hashtag
    Step 4: Test the Scene

    1. Run your Unity scene.

    2. Start a conversation with the avatar or trigger the assigned action.

    3. Watch your AI avatar perform the recorded animation in real-time!


    hashtag
    Done!

    You’ve now successfully connected a custom VR-recorded animation to an AI-powered avatar in Unity. Repeat the process to add more animations and create rich, expressive characters in your simulations or games.

    circle-info

There are some limitations to the WebGL version of the plugin. To learn about them, please see Limitations of WebGL Plugin

    Unity Verified Solution

This is the Long-Term Support version of our core plugin. It contains all the necessary tools for adding conversational AI to your characters.

    Download here.arrow-up-right

    WebGL

    This plugin version should be used if you need to build for WebGL. Please ensure that Git is installed on your computer prior to proceeding.

    hashtag
    Step 2) Select Convai Scene Template

There will be many scene templates depending on your project, but in this guide we are interested in the Convai Scene Template, so select it and click the Create button.

    Screenshot showing how to create a scene from convai scene template

    hashtag
    Step 3) Save Created Scene

You can now save the newly created scene at your desired location in the project by pressing Ctrl + S on Windows or Cmd + S on Mac, or by navigating to File -> Save Scene

    Screenshot showing how to save the scene

This will open the Save Scene window. Choose your desired location; for this demo we will save it inside the Demo folder, but you can save it anywhere in the Assets directory.

Give your scene a name and then click the Save Scene button

    Screenshot showing save location of the new Convai powered scene

Now you can import your Convai characters or your custom characters by following our complete guides

    Screenshot showing process of opening up New Scene Window in Unity
    Importing Ready Player Me (RPM) Characterschevron-right
    Importing Custom Characterschevron-right

Speaker ID for the player. Note that the Speaker ID is directly linked to your API key, so each API key should have a unique Speaker ID associated with it. If the Boolean below is set to true, we create the Speaker ID when it is not found in the Player Prefs.

    Create Speaker ID If Not Found

This Boolean tells the SDK whether it should create a unique Speaker ID for that Player Name if one is not found in the Player Prefs.

    hashtag
    Buttons

    hashtag
    Reset Data

Clears the Player Name and Speaker ID fields.

    hashtag
    Copy Data

Copies the data to the system clipboard so you can paste it anywhere for debugging purposes.

    hashtag
    Player Pref Settings Button

1. Load: Loads the Player Name and associated Speaker ID from the Player Prefs

2. Save: Saves the Player Name and associated Speaker ID to the Player Prefs

3. Delete: Deletes the Player Name and associated Speaker ID from the Player Prefs
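
Conceptually, these buttons behave like ordinary Unity PlayerPrefs calls. The sketch below is illustrative only; the key names (`Convai_PlayerName`, `Convai_SpeakerID`) are assumptions for this example, not the SDK's actual keys.

```csharp
using UnityEngine;

// Illustrative mapping of the Load/Save/Delete buttons onto plain
// Unity PlayerPrefs calls. Key names are assumptions, not the SDK's.
public static class PlayerDataPrefs
{
    private const string NameKey = "Convai_PlayerName";
    private const string SpeakerIdKey = "Convai_SpeakerID";

    public static void Save(string playerName, string speakerId)
    {
        PlayerPrefs.SetString(NameKey, playerName);
        PlayerPrefs.SetString(SpeakerIdKey, speakerId);
        PlayerPrefs.Save(); // flush to disk immediately
    }

    public static (string name, string speakerId) Load() =>
        (PlayerPrefs.GetString(NameKey, ""), PlayerPrefs.GetString(SpeakerIdKey, ""));

    public static void Delete()
    {
        PlayerPrefs.DeleteKey(NameKey);
        PlayerPrefs.DeleteKey(SpeakerIdKey);
    }
}
```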

    hashtag
    How to maintain the Player Data

    Convai provides a pre-made component which you can add to any GameObject to make the PlayerDataContainer work out of the box.

Choose an existing GameObject or create a new one in the scene, add the ConvaiPlayerDataHandler component to it, and it should start working.

    hashtag
    Optional Step

You can also create the required Scriptable Object by going to Assets > Convai > Resources, right-clicking in the Project panel, navigating to Create > Convai > Player Data, and naming it ConvaiPlayerDataSO

    triangle-exclamation

Make sure you name the created Scriptable Object exactly ConvaiPlayerDataSO, as our system looks for this exact name

    hashtag
    Setup

    To implement these language-specific features in your project:

    1. Navigate to the Convai Setup Window within Unity.

    2. Locate the Package Management section.

    3. Click on the "Convai Custom TMP Package" button.

Once installed, just import the character that requires the language support and talk with it; the font will automatically render in the transcript.

    For now, we provide fonts for these languages:

    • Arabic

    • Japanese

    • Korean

    • Chinese

    hashtag
    RTL Support

We also provide support for Right-to-Left languages such as Urdu, Persian, and Arabic through our Chat UIs. For example, if you talk with an Arabic character, or if the character's name is in Arabic, the text will automatically use Unity's RTL feature to render the transcript properly.

    TMP Importer (will appear automatically if TMP Essentials are not imported)
    TMP Essentials Manual Import Process
    hashtag
    Step 2

    In the Objective section of the new Section, add the following text:

    The time of day currently is {TimeOfDay}. Welcome the player and ask him how his {TimeOfDay} is going.

    circle-exclamation

    Notice that any string placed between curly brackets becomes a variable. In this case, we are adding the time of day as a variable. From Unity, we can pass either the word "Morning" or "Evening," and the character will respond accordingly.

    hashtag
    Step 3

Now, let’s go back to Unity and make the necessary adjustments. Click on your NPC.

    Click the Add Component button and add the Narrative Design Key Controller Component.

    hashtag
    Step 4

    In the Name field, enter TimeOfDay. In the Value field, specify the corresponding value for that variable, which could be Morning, Evening, or anything else you choose.
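
If you prefer to set the value from code rather than the inspector, a minimal sketch might look like this. The `UpdateKey` method name here is a hypothetical placeholder; check the Narrative Design Key Controller component added in Step 3 for its actual API.

```csharp
using System;
using UnityEngine;

// Illustrative sketch: derive "Morning" or "Evening" from the system
// clock and pass it as the TimeOfDay narrative design key at startup.
// UpdateKey is a hypothetical method name for this example.
public class TimeOfDayKeySetter : MonoBehaviour
{
    [SerializeField] private NarrativeDesignKeyController keyController; // added in Step 3

    private void Start()
    {
        string timeOfDay = DateTime.Now.Hour < 12 ? "Morning" : "Evening";
        keyController.UpdateKey("TimeOfDay", timeOfDay); // hypothetical API
    }
}
```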

    That’s it! Now let’s test it out. 🎉😎

    Click the Avatar Field once to select the character's avatar in the Project window.

    Select the Avatar and click Configure Avatar.

    Select the Head option in the Mapping tab.

    Select the Jaw Mapping and set it to None.

    Finally scroll down and click Apply.

    This will free the avatar's jaw mapping and allow the script to manipulate the Jaw bones.

    Create a new script or use an existing script to define a variable that will store a reference to the Dynamic Info Controller Component you added to your NPC.

    hashtag
    Example: Passing Player Health to the NPC

    • Initialize the Dynamic Info: In the script’s Start method, call the SetDynamicInfo method on the Dynamic Info Controller reference. This will set the dynamic information that the NPC will use. In this example, we’ll initialize the Player’s health as a dynamic variable.

    • Updating the Dynamic Info: Whenever you need to update the NPC with new information (such as a change in Player Health), call the SetDynamicInfo method on the Dynamic Info Controller.

    hashtag
    Sample Scenario

    • At the start of the game, we set the Player’s health to 100 and send this information to the NPC as the initial value.

    • Then, when the player takes damage (simulated here by pressing the "K" key), we reduce the Player’s health and update the Dynamic Info in real time so that the NPC remains aware of the Player's current health status.
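
The sample scenario above can be sketched as a small MonoBehaviour. `SetDynamicInfo` is the method described in this guide; the controller type name and the exact string format are assumptions made for illustration.

```csharp
using UnityEngine;

// Sketch of the sample scenario: initialize the Player's health at 100,
// then press "K" to simulate damage and push the new value to the NPC.
// ConvaiDynamicInfoController is an assumed type name for the
// Dynamic Info Controller component added to the NPC.
public class PlayerHealthReporter : MonoBehaviour
{
    [SerializeField] private ConvaiDynamicInfoController dynamicInfoController;
    private int playerHealth = 100;

    private void Start()
    {
        // Initial value sent to the NPC at the start of the game.
        dynamicInfoController.SetDynamicInfo($"Player health is {playerHealth}/100.");
    }

    private void Update()
    {
        // Simulated damage: press "K" to lose 10 health.
        if (Input.GetKeyDown(KeyCode.K))
        {
            playerHealth = Mathf.Max(0, playerHealth - 10);
            dynamicInfoController.SetDynamicInfo($"Player health is {playerHealth}/100.");
        }
    }
}
```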

    hashtag
    Example Conversation

    Below, we provide a sample conversation showcasing how the NPC can react based on the dynamic health information of the Player. By dynamically updating the Player's health, NPCs can deliver responses that feel personalized and relevant to the current gameplay.

    hashtag
    In summary

    Add the Dynamic Info Controller to your NPC. Use SetDynamicInfo to initialize the dynamic variable at the start, and call SetDynamicInfo again whenever updates are needed.

    This feature provides a powerful tool for creating NPC interactions that respond in real-time to the state of the game world, creating a more immersive experience for the player.

    hashtag
    Key Features
    • Conversational AI: Leverage advanced NLP capabilities to create NPCs that can understand and respond to player input in natural, engaging ways.

    • Intelligent NPCs: Build characters with dynamic dialogue and behaviors that adapt to player actions and the game world.

    • Easy Integration: Our SDK is designed for quick and simple integration with your Unity projects, allowing you to focus on creating compelling gameplay experiences.

    • Cross-Engine Support: In addition to Unity, Convai supports other popular game engines, ensuring broad compatibility and flexibility for your development needs.

    circle-info

    Download the Convai Unity SDK from Unity Asset Storearrow-up-right

    This is the Core version of the plugin. It has a sample scene for anyone to get started. This version of the plugin only contains the basic Convai scripts and Character Downloader.

    Visit convai.comarrow-up-right for more information and support.

    hashtag
    Quick Setup Tutorial

    Prerequisites
    • A Convai account

    • API key from your Convai dashboardarrow-up-right. (Where to find your API key)arrow-up-right

    hashtag
    Step-by-step

    1

    hashtag
    Open the Convai Account window in Unity

    In the Unity top menu, go to Convai → Account.

    2

    hashtag
    Copy your API Key from Convai

    • Open Convai in your browser and sign in.

    • Locate your API Key in the dashboard/profile settings.

    3

    hashtag
    Paste and update the key in Unity

• Paste the key into the API Key field.

• Click Update API Key.

• Expected result: Account details and usage information refresh successfully.

    hashtag
    Troubleshooting

    • Account info doesn’t update

      • Confirm the API key is correct (no extra spaces).

      • Check internet connection and retry.

    • You can’t find your API key

      • Use the linked dashboard/profile documentation section.

    hashtag
    Conclusion

    Your project is now authenticated with Convai. Next, validate the integration using Sample Scenes or continue with Custom Scene Setup.

    circle-info

    Need help? For questions, please visit the Convai Developer Forumarrow-up-right.

    Interface Configuration

    Tailor the visual and functional interface of your avatar experience to match your device, context, and brand needs.

    hashtag
    Customize Your Avatar Experience Interface

    Convai’s Avatar Studio provides a variety of settings to adapt the interface layout, interaction mode, and branding for different platforms and use cases.

    hashtag
    Screen Resolution Presets

    Choose the layout that best fits your deployment:

• Desktop

• Tablet

• Mobile

• Kiosk

    This ensures optimal visual presentation across different screen types.


    hashtag
    Chatbox Settings

    Enable or customize the chat interface as needed:

• Chatbox Type – Select your preferred chatbox style from available templates.

• Disable Chat Interface – Use the toggle to hide the chatbox completely if not needed.

• Push-to-Talk Mode – Enable push-to-talk using the toggle for voice-activated interactions.


    hashtag
    Character Vision Through Webcam

    Let your avatar “see” the user and respond accordingly using webcam input.

    • Enable or disable vision-based input with a toggle.

    • Position the webcam within your interface layout.

    • Adjust the webcam display size using a slider for optimal placement.


    hashtag
    Camera Settings

    Control how the avatar scene is viewed by the user.

    • Field of View (FOV): Adjust using the FOV slider (left = narrower, right = wider view)

• Pan Camera:

  • Up/Down with “Pan Up/Down” slider

  • Left/Right with “Pan Left/Right” slider

• Zoom and Tilt: Adjust using their respective sliders for precise framing.


    hashtag
    Branding Options

    Integrate your brand identity directly into the avatar experience:

• Display Logo – Toggle “Display Logo” to enable branding elements.

• Upload Your Logo – Click “Upload your brand logo” to add it to your scene.

• Manage Logo Display

  • Click the logo to place it in the experience

  • Adjust its position and size using provided controls

• Logo on Clothing – Enable “Display logo on cloth” to embed your logo onto the avatar’s clothing. (Only available for specific clothing items)

These configuration tools ensure that your avatar interface not only works smoothly across platforms but also aligns with your project’s style, interaction needs, and brand.

    Experience Settings

    Control idle session handling, welcome interactions, microphone behavior, and input processing timing to fine-tune your avatar experience.

    hashtag
    Final Touches Before Deployment

    These settings define how your avatar behaves during live interaction and how the experience is sustained or terminated based on user activity.


    hashtag
    AFK Timeout

    Set an AFK (Away-From-Keyboard) timeout to manage idle sessions and conserve your pixel-streaming minutes.

    • Open the dropdown menu under AFK Timeout.

    • Select a suitable timeout duration (e.g., 1 min, 5 mins, 10 mins).

    • The session will automatically end based on your selection if there’s no user activity.

    circle-check

    To optimize your account's usage, set an AFK timeout to avoid unnecessary streaming consumption.


    hashtag
    Welcome Message

    Greet users as soon as they enter the experience with customizable welcome interactions.

    Standard Welcome

    • Toggle “Welcome Message” ON.

    • Enable a custom welcome prompt (e.g., “Welcome the user and introduce yourself”).

    • Click “Test Welcome Message” to preview it.

    Vision-Based Welcome

    Make the greeting more dynamic by enabling vision awareness:

    • Toggle “Vision-Based Welcome Message” ON.

    • Add a custom prompt (e.g., “A person approaches—welcome them and make a comment about their attire or an object they’re holding”).

    • Use “Test Message” to preview how the avatar responds based on webcam input.


    hashtag
    User Microphone Settings

    Select the input style that fits your use case:

• Hands-Free Mode (WIP) – Avatar listens continuously.

• Push-to-Talk Mode – Activate microphone input only when the assigned key is pressed.

      • You can assign a custom key for push-to-talk functionality.


    hashtag
    Processing Frequency

    Control how often the avatar processes and reacts to input, allowing it to act more proactively.

    • Open the dropdown menu for processing frequency.

    • Choose a time interval for the avatar to periodically evaluate multimodal inputs (voice, text, vision).

    • This enables agentic behavior — where the avatar can initiate interaction based on user presence or signals.

    triangle-exclamation

    After making changes:

    • Click “Save Changes”

• If you are creating an Avatar for this character for the first time, press the "Create Character" button before proceeding to the publishing step.

    Importing Custom Characters

    Follow these instructions to set up your imported character with Custom Model with Convai.

To import your custom characters into your Convai-powered Unity project, you first need to bring your model into the project. The model needs at least two animations: one for Talking and one for Idle.

    hashtag
    Prerequisites

To set up your custom character with Convai, you will need your character model and two animations: Idle and Talking.

Create an animator controller containing the two animations. You should also add a 'Talk' Boolean parameter so that you can trigger the talking animation. This is the bare minimum animator setup that you need. Here is a YouTube tutorial on how to set up an animator controllerarrow-up-right.
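The 'Talk' Boolean can then be driven from code. A minimal sketch (everything except the 'Talk' parameter name is illustrative):

```csharp
using UnityEngine;

// Minimal sketch: toggle the animator's 'Talk' Boolean when the character
// starts or stops speaking. Class and method names are illustrative.
[RequireComponent(typeof(Animator))]
public class TalkAnimationDriver : MonoBehaviour
{
    private static readonly int TalkParam = Animator.StringToHash("Talk");
    private Animator _animator;

    private void Awake() => _animator = GetComponent<Animator>();

    // Hook this up to your speech start/stop callbacks.
    public void SetTalking(bool isTalking) => _animator.SetBool(TalkParam, isTalking);
}
```

Hashing the parameter name once with `Animator.StringToHash` avoids repeated string lookups every time the state changes.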

    hashtag
    Step 1: Add Animator to your custom character

Select your character in the Hierarchy and add an Animator component.

The Convai plugin ships with two pre-made animator controllers; you can use one of these or assign your own custom controller, whichever fits your needs. For this demo, we are going with the Feminine NPC Animator.

    hashtag
    Step 2: Adding a Trigger Volume

With your custom character selected, add a collision shape of your choice. For this demo, we are going with a Capsule Collider.

We will make this collider a trigger by enabling the Is Trigger option in the Inspector panel.

Adjust the Center, Radius, and Height of the collider so that it fits your character.

    hashtag
    Step 3: Add ConvaiNPC Component

With your custom character selected, add the ConvaiNPC component. Your GameObject should now look like this:

    circle-info

We assume that nothing other than the pre-instructed components was added; your GameObject's component list may differ.

Copy your character's ID and name from the Convai Playground and paste them here.

    Now your Custom Character is all set to work with Convai Plugin.

    Compatibility

    Check Convai plugin compatibility with Unity. Ensure smooth integration with your development tools.

    hashtag
    Unity Version

    The minimum supported Unity version is 2022.3.x. Earlier versions may not be compatible.


    hashtag
    Supported Platforms

Tested platforms, scripting backends, and API levels:

• Android – IL2CPP – .NET 4.x+

• iOS – IL2CPP – .NET 4.x+

• Windows – MONO – .NET Standard 2.1 or .NET 4.x+; IL2CPP – .NET 4.x+

• MacOS – MONO – .NET Standard 2.1 or .NET 4.x+

Unity Version – API Level:

• 2022.3 or Higher – .NET Standard 2.1 or .NET Framework

    Import & Run Sample Scenes

    Import Convai sample content and run a scene to test a conversation immediately.

    hashtag
    Introduction

    Sample scenes are the fastest way to verify your installation and API setup end-to-end.

    hashtag
    Prerequisites

    • Convai SDK installed

    • API key configured successfully

    hashtag
    Step-by-step

    1

    hashtag
    Open Package Manager

    In Unity, go to Window → Package Manager.

2

hashtag
Find the Convai package

• Select In Project (left panel).

• Click Convai SDK (or the installed Convai package).

3

hashtag
Import Samples

• Open the Samples section in the package details.

• Click Import next to a sample (example: <SAMPLE_NAME>).

• Expected result: A Samples folder appears under Assets, containing the imported sample content.

4

hashtag
Open the sample scene

• Navigate to:

  • Assets/Samples/Convai SDK for Unity/x.x.x/<SAMPLE_NAME>/Scenes

• Open the scene:

  • <SCENE_NAME>

5

hashtag
Run the conversation test

• Click Play.

• Speak using your microphone or type into the Chat UI input field.

• Expected result: The character responds. Microphone conversation is hands-free (no push-to-talk required).

    hashtag
    Troubleshooting

    • Sample doesn’t appear after import

      • Confirm you imported the sample and check the Assets/Samples folder.

• No voice input detected

  • Check OS microphone permissions for Unity.

  • Confirm the correct microphone device is selected.

• No response from character

  • Confirm API key is set and valid.

  • Check Console for authentication/network errors.
    hashtag
    Conclusion

    You’ve successfully imported a sample scene and verified a working conversation. Next, you can integrate Convai into your own scene via Custom Scene Setup.

    circle-info

Need help? For questions, please visit the Convai Developer Forumarrow-up-right.

    Pre-built UI Prefabs

    Convai UI Prefabs - Utilize ready-to-use UI elements for Convai integration.

We provide several UI options out of the box to display the character's and the user's transcripts, which you can use with the Convai Plugin. You can use and customize these prefabs.

The ConvaiNPC and ConvaiGRPCAPI scripts look for GameObjects with the Convai Chat UI Handler component and send any transcripts to that script so they can be displayed on screen.

    hashtag
    Types of UI

    hashtag
    ChatBox

    Prefab Name: Convai Transcript Canvas - Chat

Both the user's and the character's transcripts are displayed one after the other in a scrollable chat box.

    hashtag
    Subtitle

    Prefab Name: Convai Transcript Canvas - Subtitle

The user and character transcripts are displayed at the bottom of the screen, like subtitles.

    hashtag
    Question-Answer

    Prefab Name: Convai Transcript Canvas - QA

The user's transcript is displayed at the top, whereas the character's transcript is displayed at the bottom.

    hashtag
    Mobile Optimised UI Styles

    Prefab Name: Convai Transcript Canvas - Mobile Subtitle

Identical to the Subtitle UI. Includes a button that can be pressed and held for the user to speak. Ideal for portrait screen orientation.

    Prefab Name: Convai Transcript Canvas - Mobile QA

    Prefab Name: Convai Transcript Canvas - Mobile Chat

    hashtag
    Functions to Know

    Compatibility & Requirements

    Supported Unity versions, render pipelines, and target platforms for the Convai Unity SDK.

    hashtag
    At a Glance

    • Minimum Unity Version: 2023.1

    • Recommended Unity Version: Unity 6

    • Render Pipelines: Built-in / URP / HDRP (all supported)

    • Samples: URP Focused Sample Scenes


• Unity Version – 2023.1+ (Recommended: Unity 6)

• Render Pipeline – Built-in / URP / HDRP (Sample scenes are primarily URP)

• Sample Content – URP Focused (Your project can still use Built-in or HDRP)

Platforms:

• Windows – Editor & Builds

• macOS – Editor & Builds

• Linux

• Android – Phone/Tablet, Meta Quest Devices, Android AR/VR/MR Devices

• iOS – iPhone / iPad

• WebGL

    hashtag
    Conclusion

    This page summarizes the supported Unity versions and platforms for the Convai Unity SDK. If you’re on Unity 2023.1+ (ideally Unity 6) and targeting one of the supported platforms above, you’re ready to proceed with Getting Started → Installation and Setup.

    circle-info

Need help? For questions, please visit the Convai Developer Forumarrow-up-right.

    Creating Animations for AI Avatars

    Capture lifelike animations using your Meta Quest headset to bring your AI avatars to life—no mocap suit required.

    hashtag
    Overview

    Using the Convai XR Animation Capture app on your Meta Quest headset, you can create custom animations for your AI characters by simply acting them out in VR. These animations help your avatars express themselves naturally during conversations—whether in Unity, Unreal, Avatar Studio, or Convai Sim.

    circle-exclamation

Haven’t set up the app yet? Head over to the Convai XR Animation Capture App Setup Guide before continuing.


    hashtag
    Recording Animations in VR

    hashtag
    Step 1: Review Existing Animations (Optional)

    When you launch the app, you’ll see your animation dashboard. From here, you can:

    • View previously recorded animations

    • Replay them to see how they look

    • Delete any that you no longer need


    hashtag
    Step 2: Start Recording

    1. In the app, click “Start Recording”.

    2. A five-second countdown will start.

    3. Begin performing your animation.


    hashtag
    Step 3: Stop & Review

    1. Once you're done, click “Stop”.

    2. You can review the recorded animation by pressing the "Replay Animation" button.

      • This helps you decide whether to save, redo, or discard.


    hashtag
    Step 4: Name & Save

    1. Enter a clear and descriptive name (e.g., Wave Greeting, Points Left).

    2. Click “Save & Upload”.

    The animation is now uploaded to your Convai dashboard, ready to be:

    • Assigned to AI avatars

    • Used across Unity, Unreal, Avatar Studio, or Convai Sim


    hashtag
    Keep Building Your Animation Library

    Record and save multiple animations to populate your library. These can be reused across projects, allowing your avatars to intelligently perform gestures during conversations—making your virtual experiences more engaging and realistic.

    Add Chat UI (Transcript UI)

    Add a ready-made chat UI prefab to enable text input and conversation transcripts.

    hashtag
    Introduction

    Chat UI is optional, but it’s extremely useful for debugging, testing without voice, and demonstrating text-based conversations.

    hashtag
    Prerequisites

    • Convai SDK installed

    • A scene with Convai setup

    hashtag
    Step-by-step

    1

    hashtag
    Locate the Transcript UI prefab

• In the Project window search bar, search: TranscriptUI_Chat

  • Make sure your search scope includes In Packages or All.

• Alternatively, you can use the package path:

  • Packages/com.convai.convai-sdk-for-unity/Prefabs/TranscriptUI/TranscriptUI_Chat.prefab

2

hashtag
Add the prefab to your scene

• Drag and drop TranscriptUI_Chat.prefab into the scene.

• Expected result: The UI appears in the Hierarchy and is visible in Game view (Play Mode).

3

hashtag
Ensure an Event System exists

• If your scene does not have an EventSystem:

  • Right-click in Hierarchy → UI → Event System

• Expected result: An EventSystem exists and UI input works in Play Mode.

    hashtag
    Troubleshooting

    • UI doesn’t respond to clicks/typing

      • Confirm there is exactly one EventSystem in the scene.

• Prefab not found

  • Confirm the package is installed correctly.

  • Confirm your Project window search includes “In Packages” or “All”.
    hashtag
    Conclusion

    You’ve added the Transcript UI to your scene, enabling text input and readable conversation logs. You can now test conversations via keyboard or microphone.

    circle-info

Need help? For questions, please visit the Convai Developer Forumarrow-up-right.

    Missing Newtonsoft Json

    Fix missing Newtonsoft JSON issues in Unity with Convai. Resolve integration problems efficiently.

    Our plugin has various scripts and dependencies that use Newtonsoft Json. If Newtonsoft Json is missing from the plugin, it could lead to a large number of errors as shown below:

Ensure that Newtonsoft.Json is present in your packages. Go to your project folder.

Then navigate to the Packages folder. In the Packages folder, click on manifest.json. A JSON file containing the project dependencies should open.

Add the Newtonsoft Json package at the top of the dependencies block:

        "com.unity.nuget.newtonsoft-json": "3.2.1",

The final manifest.json should look like this:

    {  
        "dependencies": {
            "com.unity.nuget.newtonsoft-json": "3.2.1", 
            "com.unity.animation.rigging": "1.1.1",
            "com.unity.ide.rider": "3.0.16",
            "com.unity.ide.visualstudio": "2.0.16",
            "com.unity.ide.vscode": "1.2.5",
            "com.unity.test-framework": "1.1.33",
            "com.unity.textmeshpro": "3.0.6",
            "com.unity.timeline": "1.6.4",
            .
            .
            .
        }
    }

    Character Emotion

    In this guide, we learn about character emotion coming from server

Convai characters emit emotions when they interact with the player, and these emotions help make the character more human-like. We are starting to implement a system that you, as a developer, can use to make your game more interactive using character emotions.

Whenever the character responds to the user, we send back a list of emotions to the SDK, which looks something like this:

• Sadness

• Disgust

• Anger

• Anticipation

• And more...

For v0 of this system, we will only be sending the emotions. In the future, we will apply the facial expressions corresponding to each emotion, which will make the character more interactive.
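As a purely illustrative sketch of how you might react to these emotions in gameplay code (the actual payload type delivered by the SDK may differ from the string list assumed here):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative only: assumes the SDK surfaces emotions as a list of strings.
public static class EmotionReactions
{
    public static void Handle(IEnumerable<string> emotions)
    {
        foreach (string emotion in emotions) // e.g. "Sadness", "Anger"
        {
            Debug.Log($"Character emotion: {emotion}");
            // Drive gameplay here, e.g. swap facial blend shapes or ambient music.
        }
    }
}
```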

    Mindview

    Learn how to use the Mindview feature to review the actual prompts to the LLM for your current or previous sessions and interactions.

    triangle-exclamation

    This feature is available only on the Professional Plan and above.

    hashtag
    Introduction

    Narrative Design

    Build goal‑oriented conversation flows using sections, decisions, and triggers that move the story forward without rigid dialogue trees.

    hashtag
    Introduction

Narrative Design lets you guide a character with high‑level objectives while keeping conversations flexible. Instead of hard-coding a tree of lines, you define goals and decision points, then allow the character to respond dynamically. This approach works for many domains such as games, learning and training simulations, tourism, retail assistants, and customer support kiosks. You can read more about the considerations behind Narrative Design here.arrow-up-right


    Tour Guide

    Turn your AI avatar into an interactive tour guide using Convai Sim’s built-in tour planning tools

    hashtag
    Introduction

    In this guide, you'll learn how to bring your AI avatars to life by turning them into dynamic tour guides within immersive 3D environments.

    We’ll walk you through how to:

    Input Management

    Input Management - Efficiently handle input for Convai's Unity plugin integration.

    circle-check

    Make sure that Active Input Handling in

    "Project Settings > Player" is set to Both or Input System Package (New).

    Settings Panel

    Settings Panel - Customize settings using Convai's Unity plugin utilities.

    Settings Panel consists of two main sections.

    • Audio Settings

    • Interface Settings

    Adding NPC to NPC Conversation

    This guide will walk you through setting up the NPC to NPC conversation feature in the Convai SDK.

    hashtag
    Step 1: Setting up Convai NPC

    1. Go to your Convai NPCs:

    Setting Up Unity Plugin

    Follow these instructions to setup the Unity Plugin into your project.

    circle-info

    The file structure belongs to the Core version of the plugin downloaded from the documentation.

    hashtag
    Setting up Unity Plugin



    SendCharacterText

    A public function that sends a string of text to be displayed as character transcript along with the name of the character who said it.

    SendUserText

    A public function that sends a string of text to be displayed as user transcript.
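For example, assuming you hold a reference to the Convai Chat UI Handler component (exact parameter names and order are assumptions in this sketch):

```csharp
// Illustrative usage sketch; signatures may differ from the actual component.
chatUIHandler.SendCharacterText("Aria", "Hello! How can I help you today?");
chatUIHandler.SendUserText("What items do you sell?");
```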






    The Mindview section provides visibility into the prompt that was sent to the model to generate your character’s response. It’s a powerful tool for:
    • Understanding how your character processes context.

    • Improving your Character Description, Knowledge Bank, and Language Settings.

    • Troubleshooting unexpected or inconsistent responses.


    hashtag
    Accessing Mindview

    You can open the Mindview tab directly from the left navigation menu of the Convai Playground.

    When first opening it, you’ll be asked to select a conversation or interaction from the Memory tab. Alternatively, you can start a new conversation — Mindview will automatically display the data for the latest message.

    To access Mindview for a previous interaction:

    1. Navigate to the Memory tab.

    2. Expand the desired session.

    3. Click the Mindview icon next to any message to open its corresponding prompt view.


    hashtag
    Understanding the Mindview Interface

    Once opened, you’ll see a structured view of how the model interpreted and responded to an input.

    hashtag
    Header Information

    At the top of the screen, the following details are displayed:

    • Session ID – Identifies which session the interaction belongs to.

    • Model Name – Shows the LLM used to generate the response.

    • User Query – Displays the exact message or query that initiated this prompt.

    hashtag
    Main Prompt Section

    This is the core of Mindview. It shows the entire chain of messages (System, Assistant, and User) that formed the complete prompt sent to the model.

    Each section provides insight into how the model understands the character’s context and instructions before producing a response.


    hashtag
    What Influences the Main Prompt

    The main prompt displayed in Mindview is dynamically constructed using multiple aspects of your character and session:

    Source
    Description

    Character Description

    Defines the character’s backstory and core context. Appears within <back-story> ... </back-story> tags.

    Language and Speech

    Includes the allowed languages and relevant speech configuration.

    Personality Traits

    Controls the conversational tone, emotion, and formality level of the character.

    Narrative Design


    hashtag
    Use Cases

    • Debug and refine how your character’s prompt is constructed.

    • Identify missing or conflicting information within the character setup.

    • Validate that the right Knowledge Bank, Personality Traits, and Narrative Design data are being included in responses.


    hashtag
    Conclusion

    The Mindview tab gives creators deep transparency into the inner workings of Convai’s character response generation. By analyzing prompts and understanding how context is layered, you can fine-tune your characters for more consistent, accurate, and personality-aligned interactions.

    hashtag
    Videos

    Watch this series of videos to learn how to create a Narrative Design Graph in the Convai Playground. The demo features a Tour Guide scenario, showing step-by-step how to design, connect, and implement your own Narrative Design flow.


    hashtag
    Accessing Narrative Design

    Open your character in the Convai Playground and select 'Narrative Design' from the left sidebar. You will see a graph editor where you can connect the flow using nodes.


    hashtag
    Narrative Graph

    A narrative graph is made of four building blocks:

    hashtag
    Sections

    A Section contains:

    • Objectives – The goal the character aims to achieve in this part of the narrative. Example: A virtual tour guide’s objective could be to welcome the user and ask if they want to begin the tour.

    • Decisions – Choices based on user interaction that direct the character to different sections. Example: If the user says “yes” to a tour, the next section might start the tour route; if “no,” the character might offer alternative information.

    circle-exclamation

    Ensure decisions are clear and unambiguous; otherwise, the intended section may not be triggered.

    Each Section has a unique ID.


    hashtag
    Triggers

    A trigger is a simple signal from your application indicating that a certain condition has been met. When fired, triggers advance the graph to the next connected section.

    Each Trigger has a unique ID.

    Examples

    • Location Based (Spatial): your app detects the user entered a zone and fires the trigger associated with that Section.

    • Time Based: a timer in your app expires and fires the trigger.

    • Event Based: an in‑app event occurs such as “safety demo completed” and you fire the trigger.
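A location-based trigger can be fired from a Unity trigger volume. This is a hypothetical sketch: the exact Convai call for firing a Narrative Design trigger is not shown in this guide, so `InvokeTrigger` and the component types below are assumptions, not the verified API.

```csharp
using UnityEngine;

// Hypothetical sketch: fire a Narrative Design trigger when the player
// walks into a zone. Replace InvokeTrigger with the actual SDK call.
public class TourZoneTrigger : MonoBehaviour
{
    [SerializeField] private string triggerId = "<TRIGGER_ID>"; // from your Narrative Graph
    [SerializeField] private ConvaiNPC npc;                     // the guide character (assumed type)

    private void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Player")) return;
        // Advance the graph to the Section connected to this trigger.
        npc.InvokeTrigger(triggerId); // hypothetical method name
    }
}
```

The same pattern applies to time-based and event-based triggers: your app detects the condition, then fires the trigger by its unique ID.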


    hashtag
    Example Scenarios

    To better understand how Narrative Design works in practice, here are two example characters you can explore directly in Convai Playground. Open each link, navigate to the Narrative Design tab, and review how the graph is structured with Sections and Triggers.

    hashtag
    Factory Tour Guide – View Characterarrow-up-right

    A training simulation scenario set in a manufacturing facility. This character uses location-based triggers (e.g., entering the conveyor belt area or assembly line) to guide users through the workspace, explain safety protocols, and progress the tour. Ideal for industrial training and onboarding simulations.

    hashtag
    Real Estate Home Tour Guide – View Characterarrow-up-right

    A real estate simulation where the character guides potential buyers through different rooms in a property. Similar to the factory example, it uses location-based triggers — for instance, when the user enters a specific room (e.g., kitchen, bathroom, bedroom), the corresponding Section in the Narrative Graph is triggered. This allows the character to dynamically adapt its dialogue to the user’s movement through the property. Useful for virtual property tours, sales presentations, and customer onboarding.


    hashtag
    Syntax Instructions

    These special characters can be added to nodes in your Narrative Design graph to control specific outcomes and behavior.

    Special Characters
    Example
    Use

    <speak>

    <speak> I'll say this exact line! </speak>

    Forces the character to respond exactly with the phrase inside the tags, without paraphrasing or adding extra context.

    *

    Forces an immediate transition to the next node, bypassing further decision checks.

    here.arrow-up-right
    Set up a tour-guide simulation
  • Define interaction behaviors

  • Add tour points

  • Manage the tour flow

  • Preview and publish the experience

hashtag
Step by Step Guide

    hashtag
    1. Setting Up Tour Guide Mode

    1. Select your avatar to open the Edit & Publish menu.

    2. Locate the Tour Planner Settings section.

    3. Set up your Tour Prompts – these define what the avatar says at the start and end of the tour.

    Example Prompts:

    • Welcome Prompt: “Hi there! Ready to explore the fire station?” → Greet the user, introduce yourself, and invite them to start the tour.

    • End Prompt: “That’s the end of the tour. Hope you had fun!” → Ask if the user has any questions, answer them, then say goodbye.

    hashtag
    2. Defining Behavior

    Choose how your avatar initiates the tour:

    • Wait for Player: Avatar stays still and waits until the user approaches.

    • Engage on Sight: Avatar detects the user visually and initiates conversation.

    • Max (Timed Engagement): Avatar starts interacting after a set period of user inactivity.

    hashtag
    3. Adding Tour Points

    1. Click “Add Tour Point” — a green gizmo will appear in the scene.

    2. Use the XYZ axes or click the flag icon to position the tour marker.

    3. Enter a Tour Point Name (e.g., “Fire Truck”).

    4. Add an Objective describing what the avatar will explain or do at this point (e.g., “Describe the fire truck and its role in emergencies.”).

    5. Repeat this process to build a full tour path.

    hashtag
    4. Managing the Tour

    • To remove a tour point, click the gizmo and hit the X icon.

    • Under User Elements, click “Set User Starting Point” to define where the player begins.

    • Click “Save Narrative Graph” to save your tour configuration.

    • Use Preview to test the experience.

    • Click Publish when you're ready to share your tour.

    hashtag
    Summary

    Convai Sim’s Tour Guide Mode transforms your AI avatar into an interactive, narrative-driven host—ideal for:

    • Education & virtual field trips

    • Employee onboarding

    • Training simulations

    • Museum or product walkthroughs

    Once your tour is complete, you’re just one click away from publishing it across web, kiosks, and other platforms.

    Our recommendation is Both. This way, you can use both the new and old input systems. Using the old input system can be faster when creating inputs for testing purposes.

    hashtag
    How to Change the Talk Button or Any Input?

    1. Double click on the "Controls" asset in your project tab.

2. You can set up multiple control schemes for different devices here; currently we provide them for PC (Keyboard & Mouse) and Gamepad. For mobile, we provide a joystick and buttons, which are mapped to Gamepad controls for functionality, but you can directly add a Touchscreen scheme and use its features to trigger an Input Action. You can also add your own control scheme to support a different device by clicking on "Add Control Scheme".

3. Find the Input Action you want to change in the above window. If you want to add a new Input Action, refer to the other section in this documentation. In this case, we selected "Talk Key Action" to change the talk button. Click on "T [Keyboard]". In the Binding Properties window, click on the "T [Keyboard]" button in the Path field.

4. Press the "Listen" button in the top left of the opened window. If you prefer, you can choose your desired input from the categories below.

5. Press the key you want to assign and it will be reflected in the control asset.

    hashtag
    How to Add a New Input Action?

    1. First, go to the Controls asset mentioned above and use the add button to create an Input Action. For this example, we will call it Interact and bind it to the [E] key.

    2. Then, click on the <No Binding> item to set up the binding for this action. As before, you can use the Listen button (it has a UI bug on Windows but works) or select the key from the dropdown. After selecting the binding (we will select the [E] key), don't forget to click Save Asset in the top menu.

    3. You will now get an error saying ConvaiInputManager does not implement OnInteract, so we need to implement it. Open the "ConvaiInputManager.cs" script to do so ("Convai/Scripts/Runtime/Core/ConvaiInputManager.cs").

    4. Your IDE might offer to implement the missing members. If it doesn't, you can write the OnInteract function manually, as shown in the last figure. It receives a callback context that tells you whether the input started, was performed, or was cancelled in that frame, which you can use for different purposes. That's it: the error should be gone and you are good to go!
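    As a reference, a minimal OnInteract implementation might look like the sketch below. The method belongs inside ConvaiInputManager.cs; the class name here is illustrative, and the exact generated interface depends on your Controls asset, so treat this as a sketch rather than the SDK's exact code.

    ```csharp
    using UnityEngine;
    using UnityEngine.InputSystem;

    // Illustrative sketch: the body of the missing OnInteract member.
    // In your project this method goes inside ConvaiInputManager.cs.
    public class InteractSketch : MonoBehaviour
    {
        public void OnInteract(InputAction.CallbackContext context)
        {
            if (context.started)
                Debug.Log("Interact: key pressed this frame");   // input began
            else if (context.performed)
                Debug.Log("Interact: action performed");          // action completed
            else if (context.canceled)
                Debug.Log("Interact: key released");              // input ended
        }
    }
    ```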

    hashtag
    Audio Settings

    hashtag
    Microphone Settings

    The Microphone Settings section is primarily for troubleshooting and testing the microphone when using the Convai plugin.

    • In the Input section, you can view the microphones connected to your computer and select the desired one.

    • In the Test Input field, you can record your voice using the selected microphone in the Input section. After clicking Stop, you can listen to the recorded voice and observe the sound levels.

    hashtag
    Interface Settings

    hashtag
    Appearance

    The first setting that greets us here is the Appearance setting.

    In the Appearance section, you can switch between Transcript UI designs.

    There are three Transcript UI options:

    • ChatBox

    • QuestionAnswer

    • Subtitle

    Upon selecting a UI from the dropdown menu, you can preview it briefly.

    hashtag
    Display Name

    The second section in Interface Settings is the Display Name section. This section allows you to change how the user's name appears in the Transcript UI.

    hashtag
    Notifications Checkmark

    The last section in Interface Settings is the Notifications Checkmark.

    Convai sometimes displays notifications on the screen to inform the user. If you want to disable these notifications, you can click the checkbox here. (If the box is green, it's active; if empty, it's inactive.)

    For more information about notifications, you can refer to this link.

    On the PC platform, you can open the Settings Panel by pressing F10. For mobile platforms, you need to press the Settings button in the UI designs.

    Select the NPCs you want to include in the conversation.

  • Enable Group NPC Controller:

    • Click on the Group NPC Controller checkbox in the inspector panel.

    • Click Apply Changes to add the group NPC controller script.

  • Create or Find the Speech Bubble Prefab:

    • Create a new speech bubble prefab or use the one provided in the Prefabs folder.

  • Attach Required Components:

    • Add the speech bubble prefab and the player transform (optional, defaults to the main camera if not provided).

    • Set the conversation distance threshold variable (set it to zero to disable this feature, meaning NPC to NPC conversations will always happen regardless of the player’s distance).

  • Add Relevant Components:

    • Add components like lip sync, eye and head tracking, character blinking, etc., to the Convai NPC.

  • hashtag
    Step 2: Setting up NPC Manager

    1. Create an NPC To NPC Manager GameObject:

      • Add an empty GameObject and rename it to NPC to NPC Manager (optional).

    2. Add the NPC2NPC Conversation Manager Script:

      • Attach the NPC2NPCConversationManager script to the GameObject.

    3. Configure the NPC Group List:

      • In the NPC Group List, click on the + icon to add a new list element.

      • Add the NPCs you want to include in the group conversation.

    4. Post configuration of NPCs

      • Bring the NPCs close together

      • Play the scene to make sure everything is working as intended.

    By following these steps you can set up and manage NPC to NPC conversations in your Convai-powered application. For further customization and integration, refer to the complete implementation code and adjust it as needed for your specific use case.
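    For reference, the conversation distance threshold described above behaves like a simple distance check. A minimal sketch follows; the field and method names are illustrative, not the exact ones in NPC2NPCConversationManager:

    ```csharp
    using UnityEngine;

    // Illustrative sketch of the conversation-distance gate.
    public class GroupConversationGateSketch : MonoBehaviour
    {
        public Transform player;                          // optional; falls back to the main camera
        public float conversationDistanceThreshold = 5f;  // 0 disables the check entirely

        // Returns true when the group is allowed to converse.
        public bool CanConverse(Vector3 groupCenter)
        {
            if (conversationDistanceThreshold == 0f)
                return true; // feature disabled: NPC-to-NPC conversations always happen

            Transform reference = player != null ? player : Camera.main.transform;
            return Vector3.Distance(reference.position, groupCenter) <= conversationDistanceThreshold;
        }
    }
    ```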

    In the Menu Bar, go to Convai > API Key Setup.

    Go to convai.comarrow-up-right, and sign in to your Convai account. Signing in will redirect you to the Dashboard. From the dashboard, grab your API key.

    Enter the API Key and click begin.

    This will create an APIKey asset in the resources folder. This contains your API Key.

    Open the demo scene by going to Convai > Demo > Scenes > Full Features

    Click the Convai NPC Amelia and add the Character ID (or you can keep the default character ID). You can get the character ID for your custom character from this page. Now you can converse with the character. The script is set up so that you have to go near the character for them to hear you.

    Now you can test out the Convai Demo Scene and talk to the character present there. Her name is Amelia and she loves hiking!

    You can open the Convai NPC Script to replicate or build on the script to create new NPCs.

    circle-exclamation

    Try to extend the ConvaiNPC.cs script instead of directly modifying it to maintain compatibility with other scripts

    Core AI Settings

    Learn how to configure moderation, foundation model selection, and temperature for your AI character

    hashtag
    Introduction

    The Core AI Settings section defines the foundational behavior of your AI character by controlling safety filters, the underlying language model, and the creativity level of its responses. These settings have a significant impact on how your character interacts with users, balancing safety, accuracy, and creativity.


    hashtag
    Main Features

    hashtag
    1. Enable Moderation Filter

    • This setting allows you to filter out potentially harmful content, including hate speech, profanity, or inappropriate language. You can turn the moderation filter on or off using the toggle located at the top of the page. By default, this setting is enabled.

    circle-exclamation

    Disabling the Moderation Filter makes some foundation models unavailable.

    circle-exclamation

    Features like Narrative Design and Multilingual support will not work when moderation is disabled.


    hashtag
    2. Select Foundation Model

    Choose from a variety of Large Language Models (LLMs) from leading providers:

    • OpenAI

    • Anthropic

    • Google

    • Llama

    circle-info

    Model availability depends on whether the Moderation Filter is enabled.


    hashtag
    Supported LLMs

    Below is a list of Large Language Models (LLMs) available in the Convai Playground under Core AI Settings. Flagship models are the providers’ top-tier, most capable models, but their usage is subject to the Flagship Interaction Cap based on your plan.

    Flagship LLM Interaction Cap: this is the limit on the number of interactions you can perform using Flagship LLMs.

    Example: In the Indie Dev plan, you have a total monthly quota of 3000 interactions, but the Flagship LLM Interaction Cap is 1500. If you use GPT-4.1, your Flagship LLM quota will be exhausted after 1500 interactions, and you will need to switch to a non-Flagship LLM for the remaining 1500 interactions.


    hashtag
    OpenAI

    • GPT-4.1

    • GPT-4o

    • GPT-4.1-mini

    • GPT-4.1-nano

    • GPT-4o-mini

    hashtag
    Anthropic

    • Claude-Opus-4.1

    • Claude-Opus-4

    • Claude-4-Sonnet

    • Claude-3.7-Sonnet

    hashtag
    Google

    • Gemini-2.5-Flash

    • Gemini-2.5-Flash-Lite

    • Gemini-2.0-Flash

    • Gemma-3n-e4b

    • Gemma-3n-e2b

    hashtag
    Llama

    • Llama-4-Maverick

    • Llama-4-Scout

    • Llama-3.3-70B

    hashtag
    3. Temperature Control

    • Function: Adjusts the randomness and creativity in the AI’s responses.

    • Slider Range: 0.0 (most deterministic) to 1.0 (most creative).

    • Low (0.0–0.3): Deterministic, consistent. Best for factual Q&A and compliance-critical interactions.

    • Medium (0.4–0.7): Balanced accuracy and creativity. Best for conversational agents and customer support.

    • High (0.8–1.0): Diverse, creative, sometimes unpredictable. Best for storytelling, brainstorming, and roleplay.
    circle-info

    Lower temperature sharpens the probability distribution for more predictable word choices.

    Higher temperature flattens the distribution, allowing less likely words to appear more frequently.
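    In sampling terms, the temperature $T$ rescales each token's score (logit) $z_i$ before the softmax that produces word probabilities:

    $$P(w_i) = \frac{\exp(z_i / T)}{\sum_j \exp(z_j / T)}$$

    With small $T$, the highest-scoring word dominates (a sharper distribution); with $T$ near 1, lower-scoring words keep meaningful probability (a flatter distribution).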


    hashtag
    Conclusion

    The Core AI Settings give you precise control over your character’s foundation model, safety filters, and response style. By adjusting these parameters, you can create an AI that balances safety, reliability, and creativity to suit your specific application.

    Metahuman Avatars

    Upload custom Metahuman characters from Unreal Engine to Avatar Studio using the Convai Asset Uploader.

    hashtag
    Introduction

    This guide walks you through uploading custom Metahuman avatars to Avatar Studio using the Convai Asset Uploader. You'll generate a new project tailored for Metahumans, import your Metahuman asset, configure it, and finally upload it using Convai’s built-in tools.

    hashtag
    Prerequisites

    Before you begin:

    • Create your project using the Convai Asset Uploader, and answer Y when asked if you’re using a Metahuman.

    • Ensure you have a downloadable Metahuman available via Quixel Bridge.


    hashtag
    Step-by-Step Guide

    hashtag
    1. Open the Project

    Navigate to the folder where your project was created. Double-click the YourProjectName.uproject file to open it in Unreal Engine.


    hashtag
    2. Add a Metahuman via Quixel Bridge

    • Go to Window > Quixel Bridge.

    • In Bridge, select Metahumans from the left-hand menu.

    • Pick a Metahuman and click:

      • Download (if not already downloaded)

      • Then Add to include it in your project.


    hashtag
    3. Locate and Open the Character Blueprint

    After importing:

    • Go to Content/Metahumans/<CharacterName>/.

    • Open the Blueprint: BP_<CharacterName>

      ⏳ This may take some time to load.


    hashtag
    4. Fix Compile Errors

    If you see compile errors:

    • In the bottom-right, click Enable Missing under any Missing Plugins or Missing Project Settings notices.

    • Click Restart Now when prompted.

    • Reopen the Blueprint and ensure it compiles successfully.


    hashtag
    5. Prepare the Asset for Upload

    1. Locate the folder: Plugins/<random code> Content/ (e.g., Plugins/AHK3LNKVC7FZA3I5JG3V Content/)

    2. Move the entire Content/Metahumans/ folder into this directory. Use Move Here to complete the action. The final structure should mirror what’s shown in the screenshot.

    circle-info

    This folder determines what gets packaged and uploaded. Make sure everything is placed correctly.


    hashtag
    6. Open the Asset Uploader Tool

    • Navigate to Content/Editor/AssetUploader.

    • Right-click on AssetUploader and select Run Editor Utility Widget.


    hashtag
    7. Select the Character Asset

    • Navigate to the Plugins/<random code> Content/Metahumans/<CharacterName>/ directory.

    • Select the BP_<CharacterName> Blueprint.

    • Then, in the Asset Uploader window, click Pick Asset.


    hashtag
    8. Capture a Thumbnail

    • In the Asset Uploader window, click Capture Thumbnail to generate a preview image for your avatar.


    hashtag
    9. Verify Functionality Before Upload

    1. Drag BP_<CharacterName> into the Level.

    2. Select the character and locate BP_ConvaiChatbotComponent in the Details panel.

    3. Input a test Character ID.

    4. Press Play and confirm:

      • Animations are working

      • Lip sync is functional

      • Character behaves as expected


    hashtag
    10. Upload the Avatar

    • In the Asset Uploader, click Create Asset.

    • This will:

      • Package the avatar for Win64

      • Upload it to Avatar Studio

    Monitor the Output Log:

    • Look for Package completed

    • Then wait for Uploaded Asset

    circle-exclamation

    If there’s an error during packaging, check the logs and share them on the Convai Developer Forumarrow-up-right for support.

    circle-info

    To delete a previously uploaded asset, open AssetUploader and click Delete.


    hashtag
    Accessing the Avatar

    1. Go to Avatar Studioarrow-up-right

    2. Open the Upload Your Custom Avatar section

    3. Your Metahuman will appear, ready for use.


    hashtag
    Summary

    Using the Convai Asset Uploader, uploading custom Metahuman avatars is quick and reliable. With proper setup and a few clicks, your characters are live in Avatar Studio and ready for real-time AI interaction.

    Reallusion Avatars

    Upload custom Reallusion characters from Unreal Engine to Avatar Studio using the Convai Asset Uploader.

    hashtag
    Introduction

    This guide explains how to prepare and upload Reallusion-based avatars using the Convai Asset Uploader. You’ll import your Reallusion character and animations, apply Convai’s animation and lipsync systems, and then upload your avatar to Avatar Studio using the built-in AssetUploader tool.


    hashtag
    Prerequisites

    Make sure you have the following ready:

    • A project created with the Convai Asset Uploader, where you answered N to “Are you using a Metahuman?”

    • A custom Reallusion character exported and ready for import


    hashtag
    Step-by-Step Guide

    hashtag
    1. Open the Project

    Navigate to your project directory and open the .uproject file to launch it in Unreal Engine.


    hashtag
    2. Import Reallusion Character & Animations

    Follow this video tutorialarrow-up-right to import your Reallusion assets:

    • [00:00 – 07:25]: Import your character and animations

    • [07:50 – 08:20]: Create a new Blueprint Class for your character


    hashtag
    3. Connect Convai Animations

    Now we’ll bind the correct animation logic to your character.

    We’ve already added the necessary Animation Blueprint for you:

    • Go to Content/ConvaiReallusion/

    • Locate and assign the ConvaiReallusion Animation Blueprint to your character’s Skeletal Mesh

    circle-info

    This blueprint ensures that your Reallusion character plays proper idle/talking animations in sync with Convai interactions.

    • Refer to the tutorialarrow-up-right for this step: [10:12 – 12:48]


    hashtag
    4. Add FaceSync for Lipsync

    To enable lipsync:

    • Add the FaceSync component to your character’s Blueprint

    • See how in the same videoarrow-up-right: [12:48 – 12:56]


    hashtag
    5. Set Correct Rotation

    Reallusion characters typically face the wrong direction by default. Fix this by:

    • Opening the character Blueprint

    • Selecting the SkeletalMesh component

    • Setting the Z Rotation to -90 in the Details panel


    hashtag
    6. Prepare Files for Upload

    1. Go to: Plugins/<random code> Content/ (e.g., Plugins/AHK3LNKVC7FZA3I5JG3V Content/)

    2. Drag and Move both of the following folders into this directory:

      • Your character’s folder (containing the Blueprint and animations)

      • Content/ConvaiReallusion/ (contains the animation blueprint)

    The final structure should mirror what’s shown in the screenshot.

    circle-info

    This folder determines what gets packaged and uploaded. Make sure everything is placed correctly.


    hashtag
    7. Open the AssetUploader Tool

    • Navigate to Content/Editor/AssetUploader

    • Right-click and select Run Editor Utility Widget


    hashtag
    8. Select the Character Asset

    • Navigate to Plugins/<random code> Content/YourCharacterFolder/

    • Select your character’s Blueprint Class

    • Then, in the Asset Uploader window, click Pick Asset


    hashtag
    9. Capture a Thumbnail

    Click Capture Thumbnail to create a preview image that will appear in Avatar Studio.


    hashtag
    10. Verify Before Upload

    Before uploading, do a quick functional test:

    1. Drag the character into your Level

    2. Select it and locate the BP_ConvaiChatbotComponent in the Details panel

    3. Paste in a test Character ID

    4. Press Play and verify:

      • Animation is working

      • Lipsync is functioning

      • Character is correctly positioned and oriented


    hashtag
    11. Upload the Avatar

    • In the Asset Uploader, click Create Asset

    • This triggers:

      • Packaging the asset for Win64

      • Uploading to Avatar Studio

    Monitor the Output Log:

    • Wait for Package completed

    • Then look for Uploaded Asset

    circle-exclamation

    If there’s an error during packaging, check the logs and share them on the Convai Developer Forumarrow-up-right for support.

    circle-info

    To delete a previously uploaded asset, open AssetUploader and click Delete.


    hashtag
    Accessing the Avatar

    1. Visit Avatar Studioarrow-up-right

    2. Go to Upload Your Custom Avatar

    3. Your Reallusion character will now be available for selection and use


    hashtag
    Summary

    Using the Convai Asset Uploader, uploading Reallusion avatars is quick and reliable. With proper setup and a few clicks, your characters are live in Avatar Studio and ready for real-time AI interaction.

    Managing sessionID Locally

    Session ID Management - Manage unique session IDs for Convai Unity integration.

    In a typical application integrating with the Convai API, maintaining a consistent session ID across different sessions is crucial for providing a seamless user experience. This documentation outlines the best practices for storing and retrieving session IDs using Unity's PlayerPrefs, including detailed steps and example scripts.

    hashtag
    Importance of Session IDs

    A session ID uniquely identifies a session between the client and the Convai server. Storing the session ID locally ensures that the same session ID is used across different sessions, which helps in maintaining context and continuity in interactions.

    hashtag
    Storing Session IDs

    When initializing a session, if a session ID is not available locally, it should be fetched from the server and then stored locally for future use. Here's how you can achieve this:

    1. Fetch and Store Session ID: When initializing a session, check if a session ID is stored locally. If not, fetch a new session ID from the server and store it using PlayerPrefs.

    hashtag
    Retrieving Session IDs

    When initializing your application, retrieve the stored session ID to ensure continuity in user interactions.

    hashtag
    Example Class for Session Management

    The following example class demonstrates how to manage session IDs using PlayerPrefs in a Unity project:
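    A minimal sketch of such a class, assuming a hypothetical FetchSessionIDFromServerAsync helper in place of the real Convai session request:

    ```csharp
    using System.Threading.Tasks;
    using UnityEngine;

    // Sketch: persists one session ID per character ID using PlayerPrefs.
    public class SessionIDManagerSketch
    {
        // Hypothetical server call - replace with your actual Convai session request.
        private Task<string> FetchSessionIDFromServerAsync(string characterID)
            => Task.FromResult("-1"); // "-1" conventionally asks the server to start a new session

        public async Task<string> InitializeSessionIDAsync(string characterID)
        {
            // Reuse the stored session ID if one exists for this character.
            string sessionID = PlayerPrefs.GetString(characterID, string.Empty);
            if (string.IsNullOrEmpty(sessionID))
            {
                sessionID = await FetchSessionIDFromServerAsync(characterID);
                PlayerPrefs.SetString(characterID, sessionID); // store for future sessions
                PlayerPrefs.Save();
            }
            return sessionID;
        }
    }
    ```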

    hashtag
    Detailed Steps for Session Management

    1. Initialize Session: Call InitializeSessionIDAsync to check if a session ID is stored. If not, fetch and store it.

    2. Store Session ID: Use PlayerPrefs.SetString(characterID, sessionID) to store the session ID locally.

    3. Retrieve Session ID: Use PlayerPrefs.GetString(characterID, string.Empty) to retrieve the stored session ID.

    hashtag
    Best Practices

    • Error Handling: Ensure proper error handling when fetching and storing session IDs.

    • Security: Consider encrypting sensitive information stored in PlayerPrefs.

    • Performance: Use asynchronous methods to avoid blocking the main thread when fetching session IDs.

    Adding Narrative Design to your Character

    Follow this step-by-step guide to incorporate Narrative Design into your Convai-powered characters. Open your project, and let's begin!

    hashtag
    Convai Playground

    hashtag
    Step 1: Select your Character in which you want to enable Narrative Design

    circle-info

    For this demo, we are using Seraphine Whisperwind, you can select whatever character you want to enable Narrative Design.

    hashtag
    Step 2: Open Narrative Design in Convai Playground

    Select the Narrative Design option from the side panel and create your narrative design

    circle-info

    For more information on how to create a narrative design in the Convai Playground, please refer to the following YouTube video series

    For this sample we have created the following Narrative design

    You are all set to bring your character from Convai Playground to Unity. Let's hop over to Unity to continue the guide.

    hashtag
    Unity Setup

    hashtag
    Step 1: Add the Narrative Design Manager Component

    hashtag
    Using Add Components Button in Convai NPC (Recommended Way)

    hashtag
    1: Select your Convai Character in the scene and look for ConvaiNPC component in the inspector panel. Click on Add Components button

    hashtag
    2: Select Narrative Design Manager checkbox and then click on Apply Changes button

    hashtag
    Using Unity Inspector

    hashtag
    1: Select your Convai Character and find Add Component button in the inspector panel

    hashtag
    2: Search for Narrative Design Manager in the search box and select it

    hashtag
    Step 2: Setup the Narrative Design Component

    After adding the Narrative Design Component, you will be able to see the following component.

    triangle-exclamation

    This component assumes that the API key is set up correctly, so ensure that it is; otherwise an error will be thrown.

    circle-info

    After adding, the component will retrieve the sections for the character ID taken from the ConvaiNPC. Please wait a moment, depending on your network speed.

    circle-exclamation

    The following section events are for the character used in the demo; you will see section events corresponding to your own character in which Narrative Design is enabled.

    hashtag
    Getting to know the Narrative Design Component

    Expanding a section event, you will see two Unity events you can subscribe to: one is triggered when the section starts, and the other when the section ends.

    hashtag
    Getting to know about Section Triggers

    Section triggers are a way to directly invoke a section in your narrative design, and can be used to jump to a different section.

    hashtag
    Step 1: Select the game object you want to act as a trigger. In this example, we selected a simple cube, but it's up to your imagination.

    circle-info

    Make sure that the game object you have chosen as a trigger has a collider attached to it.

    hashtag
    Step 2: Add the Narrative Design Trigger component from the Add Component menu by searching for it

    hashtag
    Step 3: Make the collider a trigger.

    hashtag
    Step 4: Assign your Convai NPC to Convai NPC field

    Now you can select from the "Trigger" dropdown which trigger should be invoked when the player enters this trigger box.

    We have also added a way to invoke this trigger manually; you can use the InvokeSelectedTrigger function to invoke the trigger from anywhere.

    hashtag
    Invoke Trigger from any script

    You can use this code block as a reference to invoke the trigger from anywhere
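    A minimal sketch of such a script follows. It assumes the component added earlier is a class named NarrativeDesignTrigger exposing InvokeSelectedTrigger; check the exact names in your SDK version.

    ```csharp
    using UnityEngine;

    // Sketch: invoking a Narrative Design trigger from another script.
    // NarrativeDesignTrigger is the component added in Step 2 above;
    // verify the class and method names against your installed SDK.
    public class TriggerInvokerSketch : MonoBehaviour
    {
        [SerializeField] private NarrativeDesignTrigger narrativeDesignTrigger;

        // Call this from anywhere (a button, another trigger, game logic, etc.).
        public void FireTrigger()
        {
            narrativeDesignTrigger.InvokeSelectedTrigger();
        }
    }
    ```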

    Long Term Memory

    Learn how to enable a character to retain conversation history across multiple sessions

    Long-Term Memory (LTM) enables the persistent storage of conversational history with NPCs, allowing players to seamlessly continue interactions from where they previously left off, even across multiple sessions. This feature significantly enhances the realism of NPCs, aligning with our goal of creating more immersive and lifelike characters within your game.

    circle-info

    Prerequisite: Have a project with Convai SDK version 3.1.0 or higher. If you don't have it, check this documentation

    Setting Up Unity Plugin

    hashtag
    Steps to get LTM working

    1. Select your Convai Character

    2. Add the Long-Term Memory Component onto your character

    3. Make sure that Long Term Memory is enabled for that character

    Long Term Memory should now be working for your character.

    hashtag
    Components of the LTM System

    hashtag
    Convai Long Term Memory Component

    This component enables or disables LTM right from the Unity Editor.

    Toggling Long Term Memory

    1) Click the button provided in the component

    2) It will take some time to update, and after that the new status of the LTM should be visible in the inspector.

    circle-exclamation

    Since enabling or disabling Long-Term Memory (LTM) for a character is a global action that impacts all players interacting with that character, we strongly recommend against toggling the LTM status at runtime. This functionality should be managed exclusively by developers or designers through the editor to ensure consistent gameplay experiences.

    hashtag
    Troubleshooting

    triangle-exclamation

    Grpc.Core.RpcException: Status(StatusCode=InvalidArgument, Detail="Cannot find speaker with id: 99fbef96-5ecb-11ef-93ce-42010a7be011.")

    If you encounter this error, ensure that the SpeakerID was created using the same API key currently in use. If you're uncertain about the API key used, you can reset the SpeakerID and PlayerName by accessing the ConvaiPlayerDataSO file located in Assets > Convai > Resources, allowing you to start the process anew.

    hashtag
    Management of Speaker ID(s)

    It is essential for developers to efficiently manage the Speaker ID(s) generated using their API key, as the number of IDs that can be created is limited and dependent on the subscription tier. Proper management ensures optimal usage of resources and prevents potential disruptions in the application's functionality.

    hashtag
    Speaker ID limits per API key are as follows

    Tier
    Limit

    You can view all the Speaker ID(s) associated with a specific API key by accessing the Convai Window within your Unity project. This feature provides a comprehensive list of IDs, allowing for easier management and monitoring.

    triangle-exclamation

    Ensure that the API key is correctly entered; otherwise, the feature will not function as expected. Accurate API key input is critical for accessing and managing Speaker ID(s) through the Convai Window in Unity.

    Head over to Long Term Memory Section

    If the message "No Speaker ID(s) Found" appears, there is no need to proceed with this guide. However, if a Speaker ID list is displayed, it's advisable to delete any ID(s) that are no longer in use or needed to optimize your available resources.

    Adding Lip-Sync to your Character

    Learn to add lip sync to your Unity characters using Convai. Improve realism and interactivity.

    hashtag
    Lip Sync System

    Convai sends Visemes or Blend Shape Frames from the back-end, depending on the face model the developer chooses to use. Out of the box, the Convai SDK extracts and parses this data and provides it to the Convai LipSync Component. The component then relies on the SkinnedMeshRenderer's Blendshape Effectors and Bone Effectors to give Convai-powered NPCs realistic lip sync.

    hashtag
    Components of LipSync System

    hashtag
    Viseme Effector List

    This is where the developer tells the Convai SDK how much each value coming from the server affects each index of the Blendshape array. To better explain how it works, let's look at a diagram.

    Here, the value coming from the server affects the blendshape at index 116 with a 0.2 multiplier and the blendshape at index 114 with a 0.5 multiplier. The engine representation of this would look something like this.

    So, you can make your own Effector list or use one of the many that we ship in the SDK.
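    Conceptually, each effector entry pairs a blendshape index with a multiplier, and the incoming server value is scaled and applied per entry. A rough sketch of the idea (not the SDK's actual implementation):

    ```csharp
    using UnityEngine;

    // Sketch of how a viseme effector entry could drive blendshape weights.
    public class VisemeEffectorSketch : MonoBehaviour
    {
        [System.Serializable]
        public struct Effector
        {
            public int blendshapeIndex; // e.g. 116
            public float multiplier;    // e.g. 0.2f
        }

        public SkinnedMeshRenderer skinnedMesh;
        public Effector[] effectors;

        // visemeValue is the 0..1 weight for one viseme coming from the server.
        public void Apply(float visemeValue)
        {
            foreach (var e in effectors)
            {
                // Unity blendshape weights are typically expressed on a 0-100 scale.
                skinnedMesh.SetBlendShapeWeight(e.blendshapeIndex, visemeValue * e.multiplier * 100f);
            }
        }
    }
    ```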

    How to Create your own Viseme Effector List

    Right-click inside the Project panel and head over to Create > Convai > Expression > Viseme Skin Effector. This creates a Viseme Effector List Scriptable Object in which you can define your own values.

    hashtag
    Viseme Bone Effector List

    This is where the developer tells the Convai SDK how much each value coming from the server affects the rotation of the bone. To better explain how it works, let's look at a diagram.

    Here, the bone's rotation is affected by the values coming from the server multiplied by the values in the effector list. For example, the TH value affects the bone's rotation with a 0.2 multiplier, and so on. The engine representation of this would look something like this.

    So, you can make your own Bone Effector list or use one of the many that we ship in the SDK.

    We use this formula to calculate the rotation

    How to Create Your Own Viseme Bone Effector List

    Right-click inside the Project panel and head over to Create > Convai > Expression > Viseme Bone Effector. This creates a Viseme Bone Effector List Scriptable Object in which you can define your own values.

    hashtag
    Convai Lipsync Component

    When you attach this component to your Convai Character, you will see something like this.

    Let's learn what these fields are.

    1. Facial Expression Data

      1. Head | Teeth | Tongue

        1. Renderer: Skin Mesh Renderer which corresponds to that specified part of the body

    hashtag
    Steps to add Lipsync to your Convai Character

    1. Select your Convai-powered character in the hierarchy.

    2. In the Inspector panel, find the ConvaiNPC component; there you will see the Add Components button.

    3. Click on it, select Convai Lipsync Component, and click Apply.

    Now you can configure the component with your own settings or use one of the many presets Convai ships with the SDK.

    Your lip sync component is now ready to use in your application.

    Migration Guide

    Convai Plugin 3.3.4 to 4.0.0

    This guide explains how to migrate a Unity project from the old Convai SDK to the latest Convai SDK.

    circle-exclamation

    Important: Back Up Your Project

    Before you begin, create a full backup of your Unity project to avoid accidental data loss.

    1

    hashtag
    Remove the old Convai SDK

    1. Open your Unity project.

    2. In the Project window, go to Assets

    2

    hashtag
    Install the latest Convai SDK

    Install the newest SDK using one of the following:

    3

    hashtag
    Set up API key

    hashtag

    4

    hashtag
    Update scene setup

    Update these key objects in your scene:

    5

    hashtag
    Lip Sync setup (optional)

    If your character is humanoid and uses facial lip movement:

    Custom Scene Setup

    Add the Convai Manager, set up a player, and connect characters to Convai.

    hashtag
    Introduction

    This guide shows how to integrate Convai into your own Unity scene by adding the Convai Manager, creating a Convai Player, and configuring Convai Characters.

    hashtag
    Prerequisites

    • Convai SDK installed

    • API key configured successfully

    • Your scene opened in Unity

    hashtag
    Step-by-step

    1

    hashtag
    Add the Convai Manager

    • In Unity top menu, go to GameObject → Convai → Setup Required Components or Right-click in the Hierarchy → Convai → Setup Required Components

    hashtag
    Troubleshooting

    • Validation fails

      • Confirm that a Convai Manager object exists in the scene.

      • Ensure you added Convai Player Component to a player object.

    hashtag
    Conclusion

    You’ve integrated Convai into your custom scene, validated the setup, and confirmed characters can respond. Next, optionally add Chat UI to support text input and transcripts.

    circle-info

    Need help? For questions, please visit the Convai Developer Forumarrow-up-right.

    Importing Ready Player Me (RPM) Characters

    This guide walks you through the process of importing Ready Player Me (RPM) characters into a Convai-powered Unity project, configuring them, and integrating Convai NPC components.

    hashtag
    Introduction

    Ready Player Me (RPM) allows users to create and customize 3D avatars easily. By integrating RPM characters into Convai's Unity SDK, you can bring dynamic NPCs to life with advanced AI-driven interactions. This guide covers the step-by-step process to set up RPM characters in your Unity project with Convai.

    Building for iOS/iPadOS

    This guide will walk you through the process of installing Convai-powered Unity applications on iOS and iPadOS devices.

    hashtag
    Prerequisites

    Before you begin, make sure you have the following:

    • Unity 2022.3 or later

    macOS Permission Issues

    macOS security permission issue with custom DLLs in Unity and Mac Configuration in build settings

    hashtag
    Allowing the grpc_csharp_ext.bundle dll file in macOS

    Using external DLLs in Unity on MacOS can lead to security permission issues due to Apple's strict security measures. Here's a step-by-step guide to resolving this common problem.

    WebGL
    Oculus
    Download (if not already downloaded)
  • Then Add to include it in your project.

  • Use Move Here to complete the action.
  • The final structure should mirror what’s shown in the screenshot.

  • Press Play and confirm:

    • Animations are working

    • Lip sync is functional

    • Character behaves as expected

    Upload it to Avatar Studio
    Convai Asset Uploader
  • Content/ConvaiReallusion/ (contains the animation blueprint)

  • The final structure should mirror what’s shown in the screenshot.

  • Press Play and verify:

    • Animation is working

    • Lipsync is functioning

    • Character is correctly positioned and oriented

    Uploading to Avatar Studio
    Convai Asset Uploader

    Incorporates objectives or context from active Narrative Design sections into the user’s input.

    Knowledge Bank

    Adds relevant external knowledge to improve factual accuracy or domain-specific responses.

    Long-Term Memory

    Injects persistent information learned across sessions, when applicable.

    GPT-4o-mini

    Gemma-3n-e2b

    Diverse, creative, sometimes unpredictable

    Storytelling, brainstorming, roleplay

    GPT-4.1

    GPT-4o

    GPT-4.1-mini

    GPT-4.1-nano

    Claude-Opus-4.1

    Claude-Opus-4

    Claude-4-Sonnet

    Claude-3.7-Sonnet

    Gemini-2.5-Flash

    Gemini-2.5-Flash-Lite

    Gemini-2.0-Flash

    Gemma-3n-e4b

    Llama-4-Maverick

    Llama-4-Scout

    Llama-3.3-70B

    Low (0.0–0.3)

    Deterministic, consistent

    Factual Q&A, compliance-critical interactions

    Medium (0.4–0.7)

    Balanced accuracy and creativity

    Conversational agents, customer support

    High (0.8–1.0)

    Quick Guide On Adding AI Characters to Your Unity Project
• Retrieve Session ID: Use PlayerPrefs.GetString(characterID, string.Empty) to retrieve the stored session ID.
  • Use Session ID: Pass the session ID to your Convai API calls to maintain session continuity.

  • public static async Task<string> InitializeSessionIDAsync(string characterName, ConvaiService.ConvaiServiceClient client, string characterID)
    {
        // Retrieve stored session ID if it exists
        string sessionID = PlayerPrefs.GetString(characterID, string.Empty);
    
        // If no session ID is stored, initialize a new one
        if (string.IsNullOrEmpty(sessionID))
        {
            sessionID = await ConvaiGRPCAPI.InitializeSessionIDAsync(characterName, client, characterID, sessionID);
    
            // Store the new session ID locally
            if (!string.IsNullOrEmpty(sessionID))
            {
                PlayerPrefs.SetString(characterID, sessionID);
                PlayerPrefs.Save();
            }
        }
    
        return sessionID;
    }
    private async void Start()
    {
        // Initialize session ID on start
        string characterID = "YourCharacterID"; // Replace with your actual character ID
        string sessionID = await InitializeSessionIDAsync("CharacterName", grpcClient, characterID);
    
        if (!string.IsNullOrEmpty(sessionID))
        {
            Debug.Log("Session ID initialized and stored: " + sessionID);
        }
        else
        {
            Debug.LogError("Failed to initialize session ID.");
        }
    }
    using System;
    using System.Threading.Tasks;
    using Convai.Scripts.Utils;
    using Google.Protobuf;
    using Grpc.Core;
    using Service;
    using UnityEngine;
    using static Service.GetResponseRequest.Types;
    
    public class SessionManager : MonoBehaviour
    {
        public ConvaiService.ConvaiServiceClient grpcClient;
    
        private void Start()
        {
            // Initialize session ID on start
            InitializeSession("CharacterName", grpcClient, "YourCharacterID");
        }
    
        private async void InitializeSession(string characterName, ConvaiService.ConvaiServiceClient client, string characterID)
        {
            string sessionID = await InitializeSessionIDAsync(characterName, client, characterID);
    
            if (!string.IsNullOrEmpty(sessionID))
            {
                Debug.Log("Session ID initialized and stored: " + sessionID);
            }
            else
            {
                Debug.LogError("Failed to initialize session ID.");
            }
        }
    
        public static async Task<string> InitializeSessionIDAsync(string characterName, ConvaiService.ConvaiServiceClient client, string characterID)
        {
            string sessionID = PlayerPrefs.GetString(characterID, string.Empty);
    
            if (string.IsNullOrEmpty(sessionID))
            {
                sessionID = await ConvaiGRPCAPI.InitializeSessionIDAsync(characterName, client, characterID, sessionID);
    
                if (!string.IsNullOrEmpty(sessionID))
                {
                    PlayerPrefs.SetString(characterID, sessionID);
                    PlayerPrefs.Save();
                }
            }
    
            return sessionID;
        }
    }

    Set the group discussion topic.

    Personal

    1

    Gamer / Indie / Professional

    5

    Partner / Enterprise

    100 (Can be Customized)

Viseme Effectors List: How the SkinnedMeshRenderer's blendshapes will be affected by values coming from the server.
  • Jaw | Tongue Bone Effector

1. How much of the bone's rotation will be affected by values coming from the server.

  • Jaw | Tongue Bone

1. Reference to the bone that controls the jaw or tongue, respectively.

  • Weight Blending Power

1. Percentage used to interpolate between two frames in LateUpdate.

  • Character Emotions

    1. Learn More about Character Emotions here Character Emotion

1. Select your Convai-powered character in the Hierarchy.

    2. Click on Add Component

    3. Search for Convai Lipsync

    4. Select Convai Lipsync component

  • Locate the Convai folder from the old SDK.

  • Delete the entire folder.

  • After removal, Unity may show compile errors until all references are migrated.

    hashtag
    Option A: Unity Asset Store
    1. Open Unity Asset Store.

    2. Search for Convai SDK.

    3. Download and import the latest package.

    hashtag
    Option B: Plugin Manager

    1. Open Plugin Manager.

    2. Install the latest Convai plugin.

    Open the Convai Account window in Unity

    In the Unity top menu, go to Convai → Account.

    hashtag
    Copy your API Key from Convai

    1. Open Convai in your browser and sign in.

    2. Locate your API Key in the dashboard/profile settings.

    3. Copy the API key.

    hashtag
    Paste and update the key in Unity

    • Paste the key into the API Key field.

    • Click Update API Key.

    • Expected result: Account details and usage information refresh successfully.

    hashtag
    Replace Convai Essentials with ConvaiDefaults
    1. Remove ConvaiDefaults from the scene.

    2. Add ConvaiDefaults From "Convai SDK For Unity/Prefabs/Setup"

    You can add it by either:

    • Drag the ConvaiDefaults prefab from the Prefabs/Setup folder into the scene.

    • Searching for ConvaiDefaults in the Project window and adding it manually.

    hashtag
    Replace ConvaiNPC with ConvaiCharacter

    1. Select each NPC character object.

    2. Remove missing/legacy Convai components (if any).

    3. Add the Convai Character component.

    Audio setup

    1. After adding ConvaiCharacter, use the setup button shown in the inspector.

    2. This will automatically add/configure an Audio Source component.

    Select the character object.
  • Add the Convai Lip Sync component.

  • Configure visemes/blendshapes according to your avatar setup.

• You can explore more about adding Lip Sync here.

    • Confirm and proceed in the popup.

    • Expected result: A single Convai Manager object exists in your scene.

    2

    hashtag
    Create or select your Player object

    • Select your existing player GameObject, or create an empty one.

    • Add Convai Player Component.

    • Set Player Name.

    • Expected result: The scene has exactly one configured Convai Player.

    3

    hashtag
    Add Convai Character components

    For each character GameObject you want to make conversational:

    • Add Convai Character component

    • Set Character ID

    4

    hashtag
    Get your Character ID from Convai

• In the Convai dashboard, open your character and copy its ID.

    5

    hashtag
    Validate your scene setup

    • Use one of the validation options:

      • Top menu: GameObject → Convai → Validate Scene Setup

      • Hierarchy right-click: Convai → Validate Scene Setup

    • Expected result: A success message like:

      • “Scene setup is correct!” and the number of Convai Characters found.

    6

    hashtag
    Run a conversation test

    • Press Play

    • Speak using microphone or use Chat UI if present

    • Expected result: Characters respond. Microphone conversation is hands-free (no push-to-talk required).

    Ensure each character has a valid Character ID.

  • Characters don’t respond

    • Confirm API key is set.

    • Check Console for network/auth errors.

• Convai Developer Forum
    hashtag
    Prerequisites

    Before getting started, ensure you have the following:

• A Ready Player Me account

    • A model link for your RPM character

    • A Unity project with the Convai SDK installed and working

    hashtag
    Step-by-Step Guide

    hashtag
    Step 1: Install Ready Player Me SDK in Unity

1. Open Unity and navigate to Window > Package Manager.

2. Click the (+) icon and select Install Package from Git URL.

3. Enter the following Git URL and click Install:

https://github.com/readyplayerme/rpm-unity-sdk-core.git#f6ea3c4b0a8891b7c4c1d7b269cee545185549fb

4. Wait for the installation to complete.

    hashtag
    Step 2: Configure the RPM Avatar

1. In the Project Panel, navigate to: Assets > Ready Player Me > Resources > Settings

2. Right-click inside the folder and go to Create > Ready Player Me > Avatar Configuration.

3. This will generate an Avatar Config asset.

4. Select the created asset and, under the Inspector Panel, locate the Morph Targets section.

5. Click Add, select the required morph targets (Oculus Visemes and ARKit), and save the asset.

6. Locate Assets > Ready Player Me > Resources > Settings > AvatarLoaderSettings and assign the Avatar Config asset to the Avatar Config field.

7. Save the asset.

    hashtag
    Step 3: Import the RPM Character

1. Navigate to Tools > Ready Player Me > Avatar Loader.

2. Paste or enter your RPM Model Link in the provided input field.

3. Click Load Avatar into Current Scene to import your character.

    hashtag
    Step 4: Integrate Convai Components

1. Select your imported RPM GameObject in the Hierarchy Panel.

2. Add the Convai NPC component to the GameObject.

3. Fill in the name and ID of the Convai NPC you wish to integrate.

4. Click Add Components inside the Convai NPC component.

5. Choose the components you want and click Apply Changes.

6. Attach a Capsule Collider to the GameObject and configure its size and center to align with the character's body proportions. Ensure that the collider accurately encapsulates the character for optimal physics interactions and collision detection.

7. Assign an Animation Controller to the Animator component of the GameObject. The Convai SDK offers two predefined animation controllers (Feminine and Masculine) that you can use. Alternatively, you can integrate a custom controller tailored to your requirements.

    circle-check

    Enhance your character with additional features:

    • Add LipSync: Follow this guide to integrate LipSync into your character.

    • Implement Narrative Design: Check out to add Narrative Design.

    • Set up Actions: Explore action-based interactions using .

    hashtag
    Conclusion

    You have successfully integrated a Ready Player Me character into your Convai-powered Unity project. You can now leverage Convai’s capabilities to bring intelligent, interactive NPCs to life. 🎉😎

    circle-info

For more details about Ready Player Me, visit Ready Player Me.

    Xcode (latest version recommended)

  • Apple Developer account

  • Project with Convai's Unity SDK integrated and running properly

  • MacBook for building and deploying to iOS/iPadOS

hashtag
Step 1: Prepare Your Unity Project

    1. Open your Convai-powered Unity project.

2. Ensure you have the latest version of the Convai Unity SDK imported and set up in your project.

    Unity project with Convai SDK imported

    hashtag
    Step 2: Configure Build Settings

    1. In Unity, go to File → Build Settings.

    2. Select iOS as the target platform.

    3. Click Switch Platform if it's not already selected.

    4. Check the Development Build option for testing purposes.

    Unity Build Settings window with iOS selected and Development Build checked

    circle-info

If you wish to add the required files manually, follow step 3. If you want this done automatically, jump to step 4.

    hashtag
    Step 3: Manually add Required Files

    hashtag
    Add link.xml

    1. Create a new file named link.xml in your project's Assets folder.

    2. Add the following content to the file:

    Unity project view showing the link.xml file in the Assets folder

    This file prevents potential FileNotFoundException errors related to the libgrpc_csharp_ext.x64.dylib file.

    hashtag
Add iOSBuild.cs Script

    1. Create a new C# script in Assets/Convai/Scripts named iOSBuild.cs.

    2. Add the following content to the script:

    hashtag
Step 4: Install Required gRPC DLLs for iOS

1. Go to Convai → Custom Package Installer

    2. Click on Install iOS Build Package

    3. Attach the script iOSBuild.cs to any GameObject in your scene.

    hashtag
    Step 5: Build the Xcode Project

    1. In Unity, go to File → Build Settings.

    2. Click Build and choose a location to save your Xcode project.

    3. Wait for Unity to generate the Xcode project.

    hashtag
    Step 6: Configure and Build in Xcode

    1. Open the generated Xcode project.

    2. In Xcode, select your project in the navigator.

    3. Select your target under the "TARGETS" section.

    4. Go to the "Signing & Capabilities" tab.

    5. Ensure that "Automatically manage signing" is checked.

    6. Select your Team from the dropdown (you need an Apple Developer account for this).

    7. If needed, change the Bundle Identifier to a unique string.

    Xcode window showing the Signing & Capabilities tab with Team and Bundle Identifier fields highlighted

    hashtag
    Step 7: Build and Run

    1. Connect your iOS device to your Mac.

    2. In Xcode, select your connected device as the build target.

    3. Click the "Play" button or press Cmd + R to build and run the app on your device.

    Xcode toolbar showing the connected device selected and the "Play" button highlighted

    hashtag
    Troubleshooting

    • If you encounter any build errors, ensure all the steps above have been followed correctly.

    • Check that your Apple Developer account has the necessary provisioning profiles and certificates.

    • If you face any GRPC-related issues, verify that the libgrpc_csharp_ext.a and libgrpc.a files are correctly placed in the Assets/Convai/Plugins/gRPC/Grpc.Core/runtime/ios folder.

    Verify the Problem:

  • Manually Allow Blocked DLLs:

    • Open System Preferences on your Mac.

    • Navigate to "Security & Privacy".

• Under the "Security" tab, you might see a message at the bottom about the DLL being blocked. Click "Allow Anyway" or "Open Anyway" and enter your password if prompted.

• Modify Gatekeeper settings: macOS's Gatekeeper can prevent software from unidentified developers from running. To allow the DLL:

    • Open the Terminal (found in Applications > Utilities).

    • Type sudo spctl --master-disable and press Enter.

• This command allows apps downloaded from anywhere to run.

    • Now, try running the Unity project again.

• When you're done, re-enable Gatekeeper with sudo spctl --master-enable to protect against malware.

  • Check File Permissions: Ensure the DLL has the correct file permissions.

    • In Finder, right-click (or control-click) on the DLL file and choose "Get Info".

    • Under “Sharing & Permissions”, ensure that your user account has "Read & Write" permissions.

  • Review Unity's Plugin Settings:

    • In the Unity editor, select the DLL in the Project view.

• In the Inspector window, make sure the appropriate platform (in this case, Mac OS X) and architecture (Apple Silicon, Intel-64) are selected for the DLL.

• Ensure that "Load on Startup" and other pertinent options are checked (they should be enabled by default).

hashtag
Mac Configuration in Player Settings during build

    • Update Mac Configuration:

      • In Unity, navigate to Edit > Project Settings > Player.

      • Scroll down and click on Other Settings

      • Scroll down again to find Mac Configuration section

      • Update the Mac Configuration section (follow the below Screenshot)


    Notification System

    Notification System - Implement notifications with Convai Unity plugin utilities.

    The Convai plugin comes with default notifications, totaling four. Here they are:

    hashtag
    Notifications

    hashtag
    Not Close Enough to the Character

    Appears when you press the talk button but there is no active NPC nearby.

    hashtag
    Talk Button Released Early

    Appears if you release the talk button in less than 0.5 seconds.

    hashtag
    Microphone Issue Detected

    Appears when the recorded audio input level is below the threshold.

    hashtag
    Connection Problem

    Appears when there is no internet connection upon launching the application.

    hashtag
    How to Add Your Own Notification?

    Adding your custom notification is straightforward.

Let's go through the steps to add a "CharacterStartedListening" notification as an example.

1. Open the script "Convai/Scripts/Notification System/Notification Type.cs". This script stores Notification Types as enums. Choose a name for your desired Notification Type and add it here.

2. Right-click on "Convai/Scripts/Notification System/Scriptable Objects" and select "Create > Convai > Notification System > Notification" to create a Notification Scriptable Object.

3. Name the created Notification Scriptable Object. Click on it, and fill in the fields in the Inspector as desired.

4. Add the created Notification Scriptable Object to "Convai/Scripts/Notification System/Scriptable Objects" under the "Convai Default Notification Group" (details of Notification Groups below).

5. Your Notification is now ready. The last step is to call this Notification from where you need it. For example, for the "CharacterStartedListening" Notification, find the location where your character starts listening and add the call there.

6. Replace the parameter with the NotificationType you created (for our example, NotificationType.CharacterStartedListening).

7. Ensure that the Convai Notification System is present in your scene (accessible from "Convai/Prefabs/Notification System").

    All steps are complete, and you're ready to test!🙂✅
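Putting the final steps together, requesting your new notification comes down to a single call (place it wherever your character starts listening):

```csharp
// Request the custom notification. NotificationType.CharacterStartedListening
// is the enum value you added to Notification Type.cs in step 1.
NotificationSystemHandler.Instance.NotificationRequest(NotificationType.CharacterStartedListening);
```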

    hashtag
    Notification Scriptable Object

    This Scriptable Object stores information about a Notification

    • Notification Type

    • Notification Icon

    • Notification Title

    To create a new Notification Scriptable Object, right-click anywhere in the Project Window and select "Create > Convai > Notification System > Notification"

    hashtag
    Notification Group Scriptable Object

    This Scriptable Object stores Notification Scriptable Objects as groups. When a Notification is requested, it searches for the Notification using the specified Notification Group in the Convai Notification System prefab's Notification System Handler script.

    You can create different Notification groups based on your needs. Note: If your referenced Notification Group does not have the Notification you want, that Notification won't be called.

    The Convai Default Notification Group has four Notifications, but you can add more or create a new group with additional notifications.

    hashtag

    UpdateJawBoneRotation(
    new Vector3(
            0.0f, 
            0.0f, 
            -90.0f - CalculateBoneEffect(FacialExpressionData.JawBoneEffector) * 30f
        )
    );
    UpdateTongueBoneRotation(
    new Vector3(
            0.0f,
            0.0f,
            CalculateBoneEffect(FacialExpressionData.TongueBoneEffector) * 80f - 5f
        )
    );
    <linker>
      <assembly fullname="UnityEngine">
        <type fullname="UnityEngine.Application" preserve="fields">
          <property name="platform"/>
        </type>
      </assembly>
    </linker>
    #if UNITY_EDITOR && UNITY_IOS
    using System.IO;
    using UnityEditor;
    using UnityEditor.Callbacks;
    using UnityEditor.iOS.Xcode;
    using UnityEngine;
    
    public class iOSBuild : MonoBehaviour
    {
        [PostProcessBuild]
        public static void OnPostProcessBuild(BuildTarget target, string path)
        {
            string projectPath = PBXProject.GetPBXProjectPath(path);
            PBXProject project = new PBXProject();
            project.ReadFromString(File.ReadAllText(projectPath));
    #if UNITY_2019_3_OR_NEWER
            string targetGuid = project.GetUnityFrameworkTargetGuid();
    #else
            string targetGuid = project.TargetGuidByName(PBXProject.GetUnityTargetName());
    #endif
            project.AddFrameworkToProject(targetGuid, "libz.tbd", false);
            project.SetBuildProperty(targetGuid, "ENABLE_BITCODE", "NO");
            File.WriteAllText(projectPath, project.WriteToString());
        }
    }
    #endif
    if(convaiNPC.TryGetComponent(out NarrativeDesignTrigger narrativeDesignTrigger))
    {
        //Optional message parameter if you want to send some message while invoking
        //the trigger 
        string message = "Player has collected enough resources";
        narrativeDesignTrigger.InvokeSelectedTrigger(message);
    }
    Notification Message
    NotificationSystemHandler.Instance.NotificationRequest(NotificationType.CharacterStartedListening);

    Transcript UI System

    Transcript UI System - Integrate transcript UI with Convai's Unity plugin.

    hashtag
    Overview

The Dynamic UI system is a feature within the Convai Unity SDK that provides developers with a robust system for in-game communication. It displays messages from characters and players and supports various UI components for chat, Q&A sessions, subtitles, and custom UI types. This document will guide you through the integration, usage, and creation of custom UI types for the Dynamic UI feature in your Unity project.

    hashtag
    Usage

    hashtag
    Accessing the Chat UI Handler

    To interact with the chat system, you need to reference the ConvaiChatUIHandler in your scripts. You can find the Transcript UI prefab in the Prefabs folder.

    Here's an example of how to find and assign the handler:
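A minimal sketch of obtaining the reference (the script name and the `_chatUIHandler` field are illustrative, and the handler's namespace may differ between SDK versions):

```csharp
using UnityEngine;

public class ChatUIExample : MonoBehaviour
{
    private ConvaiChatUIHandler _chatUIHandler;

    private void Awake()
    {
        // Locate the handler on the Transcript UI prefab instance in the scene.
        _chatUIHandler = FindObjectOfType<ConvaiChatUIHandler>();
        if (_chatUIHandler == null)
            Debug.LogError("ConvaiChatUIHandler not found. Add the Transcript UI prefab from the Prefabs folder.");
    }
}
```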

    hashtag
    Sending Messages

    Once you have a reference to the ConvaiChatUIHandler, you can send messages using the following methods:

    Sending Player Text

    To send text as the player:

    • input: The string containing the player's message.
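A one-line sketch, assuming `_chatUIHandler` is the `ConvaiChatUIHandler` reference described above and the method is named `SendPlayerText` (matching the abstract method names used by `ChatUIBase`):

```csharp
// "input" holds the player's message, e.g. text from an input field.
_chatUIHandler.SendPlayerText(input);
```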

    hashtag
    Sending Character Text

    To send text as a character:

    • characterName: The name of the character sending the message.

    • currentResponseAudio.AudioTranscript: The transcript of the audio response from the character, trimmed of any leading or trailing whitespace.
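A corresponding sketch for the character side, under the same assumptions about the handler reference and method naming:

```csharp
// Send the character's trimmed audio transcript to the chat UI.
_chatUIHandler.SendCharacterText(characterName, currentResponseAudio.AudioTranscript.Trim());
```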

    hashtag
    Adding Custom UI Types to the Dynamic Chatbox

While the Dynamic UI system ships with several built-in UI types, you may want to create a custom UI that better fits the style and needs of your game. The system is designed to be extensible, allowing developers to add their own custom UI types. This is achieved by inheriting from the ChatUIBase class and implementing the required methods. The ConvaiChatUIHandler manages the different UI types and provides a system to switch between them.

    hashtag
    Creating a Custom UI Class

    To create a custom UI type, follow these steps:

    hashtag
    Step 1: Define Your Custom Class

    Create a new C# script in your Unity project and define your class to inherit from ChatUIBase. For example:
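A minimal skeleton (the class name `CustomChatUI` is illustrative; note it will not compile until the abstract methods from Step 2 are implemented):

```csharp
using UnityEngine;

// A custom chat UI type that inherits from Convai's ChatUIBase.
public class CustomChatUI : ChatUIBase
{
    // References to your own UI elements, e.g. a scrolling transcript panel, go here.
}
```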

    hashtag
    Step 2: Implement Required Methods

    Implement the abstract methods from ChatUIBase. You must provide implementations for Initialize, SendCharacterText, and SendPlayerText:
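A sketch of the overrides. The parameter lists shown here are assumptions; check the `ChatUIBase` declarations in your SDK version for the exact signatures:

```csharp
using UnityEngine;

public class CustomChatUI : ChatUIBase
{
    // Hypothetical signature: called when the handler activates this UI.
    public override void Initialize(GameObject uiInstance)
    {
        // Cache references to your UI elements on the instantiated prefab.
    }

    // Hypothetical signature: display a message from the player.
    public override void SendPlayerText(string text)
    {
        // Append the player's message to your custom transcript view.
    }

    // Hypothetical signature: display a message from a character.
    public override void SendCharacterText(string characterName, string text)
    {
        // Append the character's message, prefixed with its name.
    }
}
```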

    hashtag
    Step 3: Add Custom Functionality

    Add any additional functionality or customization options that your custom UI may require.

    hashtag
    Step 4: Assign and Use Your Custom UI

To use your custom UI class within the ConvaiChatUIHandler, you need to add it to the GetUIAppearances dictionary. This involves creating a prefab for your custom UI and assigning it in the ConvaiChatUIHandler.

    Here's an example of how to do this:

1. Create a prefab for your custom UI and add your CustomChatUI component to it.

2. Assign the prefab to a public variable in the ConvaiChatUIHandler script.

3. Modify the InitializeUIStrategies method to include your custom UI type.

4. Ensure that your custom UI type is added to the UIType enum:

5. Now you can set your custom UI type as the active UI from the Settings Panel.

    By following these steps, you can integrate your custom UI type into the Dynamic Chatbox system and switch between different UI types at runtime.

    Creating a Profile

    Create and register a custom Lip Sync profile in Unity, understand profile fields, and configure supported transport formats for your project.

    hashtag
    Introduction

    A Lip Sync Profile defines the channel schema a character setup uses within the Lip Sync system. In most cases, the built-in profiles are enough. However, you may want to create a custom profile asset to better organize your project, use a project-specific identifier, or override how a supported transport format is represented in your Editor workflow.

    This page explains the Profile Inspector, the Profile Registry, and how to create a custom profile correctly.


    hashtag
    Before You Start

    Currently, Convai supports only these transport formats:

    Transport Format
    Supported Schema

    This is important because creating a new profile asset does not create a new transport format.

    A custom profile can help you:

    • Rename or reorganize a supported schema

    • Use a custom profile ID for your project

    • Point that custom profile to one of the supported formats

    A custom profile cannot be used to introduce an entirely new transport value outside arkit, mha, or cc4_extended.


    hashtag
    Understanding the Profile Inspector

    When you select a ConvaiLipSyncProfileAsset, the Inspector is divided into three main areas.

    hashtag
    Runtime Identity

    This section controls how the profile is identified internally.

    Profile ID A unique normalized string used at runtime to identify the profile.

    This ID is used for:

    • profile catalog lookup

    • map targeting

    • registry merging

    • component configuration

    Choose this carefully. Once other assets reference this ID, changing it can break those references.

    hashtag
    Editor Label

    Display Name This is the human-readable label shown in dropdowns and editor tools.

    It has no direct effect on runtime behavior, but it is important for usability. Use a clear name that your team will recognize immediately.

    hashtag
    Transport Format

    This section determines which supported transport format the profile resolves to.

    Override default token When disabled, the profile uses its own Profile ID as the transport token.

    When enabled, the profile can use a different supported transport token. This is useful when you want a custom internal profile ID, but still need the profile to resolve to one of the built-in supported formats.

    Transport Token The transport token must be one of the currently supported values:

    • arkit

    • mha

    • cc4_extended

    For example, a profile with ID my_metahuman_variant can still use the mha transport token.


    hashtag
    Create a Custom Profile

    1

    hashtag
    Create the profile asset

    In the Unity Project window, create a new profile asset:

    Give it a descriptive name, such as:

    2

    hashtag
    Understanding Profile Registries

    Profiles are discovered through Profile Registry assets.

    A registry is a ConvaiLipSyncProfileRegistryAsset that contains one or more profile references and a priority value used during runtime merging.

    hashtag
    Registry fields

    Field
    Description

    The built-in registry uses priority 0. Your own custom registry should use a higher value, such as 1, so it is merged after the built-in set.

    hashtag
    Register the Profile

    1

    hashtag
    Create a Profile Registry

    Create a registry asset in the Project window:

    Give it a name such as:

    2

    hashtag
    How Runtime Discovery Works

    When the Lip Sync profile catalog initializes, it:

    1. Loads the built-in registry

    2. Scans for additional registries under Resources/LipSync/ProfileRegistries/

    3. Sorts them by priority

    If two registries define the same Profile ID, the higher-priority definition replaces the lower-priority one and a warning is logged.


    hashtag
    Important Limitations

    Keep these points in mind when creating custom profiles:

    hashtag
    Profile IDs should be treated as permanent

    Once a profile is referenced by maps, registries, or components, changing the ID can silently break those references.

    hashtag
    Transport formats are fixed

    Only these transport formats are supported:

    • arkit

    • mha

    • cc4_extended

    Entering a completely custom value does not add support for a new format.

    hashtag
    Registry priority affects replacement behavior

    If two registries define the same profile ID, the higher-priority definition replaces the earlier one. There is no merge between duplicate IDs.


    hashtag
    Next Step

    After creating and registering a profile, the next step is to create or assign a map that targets it.

    Continue with to define how that profile drives your character's actual blendshapes.

    hashtag
    Conclusion

    A custom profile is primarily a way to organize and identify a supported Lip Sync schema inside your project. It gives you flexibility in naming and project structure, while still staying within the currently supported transport formats.

    If your character needs custom routing to mesh blendshapes, create a map next.

    circle-info

    Need help? For questions, please visit the Convai Developer Forum.

    Lip Sync Profiles and Mappings

    Learn how Convai Lip Sync uses profiles and maps to drive real-time facial blendshape animation, how built-in defaults work, and when to create custom assets.

    hashtag
    Introduction

    Convai Lip Sync drives facial blendshape animation in real time by matching incoming speech animation channels to the blendshapes on your character. To make that work reliably, the system needs two things:

    • A Profile, which defines the channel schema the character uses

    • A Map, which tells the SDK how those channels connect to actual blendshape names on the mesh

    Together, these two assets make the Lip Sync pipeline predictable, editable, and easy to adapt in the Unity Editor.

    hashtag
    Overview

    At a high level, the Lip Sync system answers two questions:

    1. Which facial rig schema is active? This is defined by the Profile.

    2. How should each incoming channel affect this specific character mesh? This is defined by the Map.

    Both are stored as Unity ScriptableObject assets, so they can be inspected, assigned, and customized directly in the Editor.

    hashtag
    What Is a Profile?

    A Lip Sync Profile defines the expected channel layout for a facial rig. It acts as the schema for incoming facial animation data.

    For example, if a profile expects a channel called jawOpen, the system interprets that channel according to the rules of that profile. This allows the SDK to know what data is being sent and how to categorize it before any mesh-specific mapping happens.

    A profile is not tied to a single character. It defines a reusable facial rig format that multiple characters can share.

    hashtag
    What Is a Map?

    A Lip Sync Map connects profile channels to the actual blendshape names on a character's SkinnedMeshRenderer.

    This is what makes Lip Sync work on real character assets. Even if the incoming channel schema is valid, the animation cannot be applied correctly unless the system knows which mesh blendshape each channel should drive.

    A map can do more than simple one-to-one routing. It can also:

    • Route one source channel to multiple target blendshapes

    • Scale or offset individual channels

    • Clamp overly strong values

    • Disable channels that should not be driven

    • Optionally allow unmapped names to pass through directly
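    As an illustration of these routing rules, the sketch below applies hypothetical map entries to incoming channel values (Python for brevity; the field names source, target, scale, offset, and clamp are illustrative, not the SDK's actual serialized fields):

    ```python
    def apply_map(channels, map_entries):
        """Route profile channels to blendshape weights via map entries."""
        weights = {}
        for entry in map_entries:
            value = channels.get(entry["source"], 0.0)
            # Scale and offset the incoming channel value.
            value = value * entry.get("scale", 1.0) + entry.get("offset", 0.0)
            # Clamp overly strong values into a safe range.
            low, high = entry.get("clamp", (0.0, 1.0))
            weights[entry["target"]] = min(max(value, low), high)
        return weights

    # One source channel ("jawOpen") driving two target blendshapes,
    # with the primary jaw target clamped to reduce exaggerated motion:
    entries = [
        {"source": "jawOpen", "target": "JawOpen", "clamp": (0.0, 0.6)},
        {"source": "jawOpen", "target": "MouthStretch", "scale": 0.3},
    ]
    weights = apply_map({"jawOpen": 0.9}, entries)  # JawOpen clamped to 0.6
    ```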

    hashtag
    How Profiles and Maps Work Together

    The flow is simple:

    1. A Lip Sync profile determines which channel schema is active

    2. A Lip Sync map reads channels from that schema

    3. The map writes the processed values to the target blendshapes on the character mesh

    This separation is important because it allows one profile to be reused across many different characters, while each character can still have its own map.

    For example, two characters may both use the arkit profile, but one may use the default map while another uses a custom map because its blendshape names differ.

    hashtag
    Supported Profile Formats

    Currently, Convai supports the following Lip Sync profile formats:

    • ARKit – arkit

    • MetaHuman – mha

    • CC4 Extended – cc4_extended

    These are the only supported transport formats at this time.

    This means:

    • You can create custom profile assets inside your project

    • You can rename or organize profiles for your workflow

    • You can override which supported transport format a profile uses

    • You cannot introduce a brand-new transport format by entering a custom value

    In other words, creating a custom profile does not add support for a new backend format. The transport format must still resolve to one of the supported values: arkit, mha, or cc4_extended.

    hashtag
    Built-in Profiles

    The SDK includes built-in profiles for the supported formats. These represent the standard schemas used by the Lip Sync system and are intended to be the authoritative built-in definitions.

    Each profile asset includes:

    • Profile ID – Internal runtime identifier

    • Display Name – Human-readable label shown in the Editor

    • Transport Format – Supported format token used by the active pipeline

    Built-in profile assets are located under: Resources/LipSync/Profiles/

    hashtag
    Profile Registries

    Profiles are grouped into a Profile Registry rather than loaded one by one.

    The built-in registry is located at: Resources/LipSync/ProfileRegistries/LipSyncBuiltInProfileRegistry

    At runtime, the SDK loads the built-in registry, scans for additional registries under the same Resources path, and merges them into a single catalog.

    Registries are merged by priority:

    • Lower priority values are processed first

    • Higher priority values can override existing profile IDs

    • Duplicate profile IDs produce a warning and the higher-priority definition wins

    This lets you extend or override profile definitions without editing built-in SDK assets directly.

    hashtag
    Built-in Default Maps

    The SDK also includes built-in default maps for supported profile types.

    These are located under: Resources/LipSync/DefaultMaps/

    The built-in set includes:

    • ConvaiLipSyncDefaultMap_ARKit – targets arkit – Identity-style default mapping for ARKit channels

    • ConvaiLipSyncDefaultMap_MetaHuman – targets mha – Identity-style default mapping for MetaHuman channels

    • ConvaiLipSyncDefaultMap_CC4Extended – targets cc4_extended – Identity-style default mapping for CC4 Extended channels

    • ConvaiLipSyncDefaultMap_ARKitToCC4Extended – targets arkit – Cross-rig translation from ARKit channels to CC4 Extended blendshape names

    These default maps are designed to cover common use cases out of the box.

    hashtag
    Why some built-in channels are clamped or disabled

    Some built-in mappings intentionally reduce or suppress certain channels to keep results stable and natural on common character setups.

    Examples include:

    • Jaw open clamping to reduce exaggerated mouth motion

    • Eye rotation channel disabling for rigs that do not use blendshape-driven eye motion

    • Cosmetic channel disabling on rigs where those channels are not appropriate for speech animation

    hashtag
    Default Map Registry

    The Default Map Registry defines which default map is used automatically for each profile.

    It is located at: Resources/LipSync/DefaultMaps/LipSyncDefaultMapRegistry

    This registry maps each supported profile ID to its default ConvaiLipSyncMapAsset.

    hashtag
    How Map Resolution Works

    When a Lip Sync component initializes, the SDK determines which map to use through a fallback chain:

    1. Explicit map on the component. If a map asset is assigned directly and its target profile matches the active profile, that map is used.

    2. Default map registry lookup. If no valid explicit map is assigned, the system checks the Default Map Registry for the active profile.

    3. Safe disabled fallback. If no valid map is found, the SDK creates a safe fallback that outputs zero values instead of animating the character.

    This behavior ensures that missing or mismatched setups fail safely without crashing the scene.
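    The fallback chain can be sketched as follows (Python for illustration; the dictionary shapes are assumptions, and the real SDK works with ScriptableObject assets rather than dictionaries):

    ```python
    def resolve_map(explicit_map, active_profile_id, default_map_registry):
        """Resolve which Lip Sync map to use via the three-step fallback chain."""
        # 1. Explicit map on the component, if its target profile matches.
        if explicit_map and explicit_map["target_profile"] == active_profile_id:
            return explicit_map
        # 2. Default map registry lookup for the active profile.
        default_map = default_map_registry.get(active_profile_id)
        if default_map is not None:
            return default_map
        # 3. Safe disabled fallback that outputs zero values instead of animating.
        return {"target_profile": active_profile_id, "disabled": True}

    registry = {"arkit": {"target_profile": "arkit", "name": "DefaultARKit"}}
    chosen = resolve_map(None, "arkit", registry)      # falls back to the registry
    fallback = resolve_map(None, "unknown", registry)  # safe disabled fallback
    ```

    A mismatched explicit map (targeting a different profile than the active one) falls through to step 2 rather than being used.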

    hashtag
    When to Create Custom Assets

    You typically do not need custom assets if your character already matches one of the built-in supported formats and its blendshape names follow the expected naming convention.

    You should create a custom map when:

    • Your character uses different blendshape names

    • You need custom clamping or scaling

    • You want one source channel to drive multiple targets

    • You want more control over which channels are enabled

    You may create a custom profile when:

    • You want a project-specific profile identity or label

    • You want to organize supported formats differently inside your project

    • You need a custom profile asset that still resolves to one of the currently supported transport formats

    For step-by-step instructions, continue with Creating a Profile and Creating a Custom Map.

    hashtag
    Conclusion

    Profiles define the channel schema. Maps define how that schema drives a specific mesh.

    Once you understand that separation, the Lip Sync workflow becomes straightforward: choose the supported profile format that matches your character setup, then use either a built-in map or a custom one to connect those channels to your character's blendshapes.

    circle-info

    Need help? For questions, please visit the Convai Developer Forum.

    Additional Feature Migration

    hashtag
    Additional Feature Migration

    hashtag
    LTM (Session Resume)

    No API migration is required. Continue enabling/disabling session resume as needed in your setup.

    hashtag
    Dynamic Info: DynamicInfoController -> ConvaiRoomManager

    Dynamic info APIs are now routed through ConvaiRoomManager.


    hashtag
    Narrative Design Migration (Legacy -> Current SDK)

    Narrative Design is still supported, but references now align with the new SDK architecture (ConvaiCharacter + modular narrative components). Legacy setup reference: Adding Narrative Design to your Character.

    hashtag
    Narrative quick mapping

    • ConvaiNPC (old character component) -> ConvaiCharacter

    • Narrative Design Manager (legacy setup) -> Convai Narrative Design Manager (ConvaiNarrativeDesignManager)

    • Narrative Design Trigger (legacy setup) -> Convai Narrative Design Trigger (ConvaiNarrativeDesignTrigger)

    • InvokeSelectedTrigger(message) -> SetTriggerMessage(message) + InvokeTrigger()

    • Direct trigger call remains available on the character via SendTrigger(triggerName, message)

    hashtag
    Narrative minimal migration steps

    1

    hashtag
    Replace legacy NPC component references

    Replace legacy NPC component references with ConvaiCharacter.

    2

    hashtag
    Script migration example (trigger invoke)

    hashtag
    Notes

    • Section/trigger lists are fetched per character ID, so always ensure the correct ConvaiCharacter is assigned before syncing/fetching.

    • InvokeTrigger() sends the currently configured trigger name + optional message.

    • For fully code-driven flows, you can call convaiCharacter.SendTrigger(triggerName, message) directly.


    hashtag
    Transcript UI Migration (Legacy Dynamic UI -> ChatTranscriptUI)

    The transcript UI architecture changed from a direct push model to a view-model based flow.

    hashtag
    What changed

    • Old model: UI classes pushed text directly using ConvaiChatUIHandler, ChatUIBase, and UIType.

    • New model: UI is a thin view implementing ITranscriptUI; routing/aggregation happens in controller and presentation strategy layers.

    hashtag
    Quick mapping

    • ConvaiChatUIHandler -> TranscriptUIController + presentation strategy

    • Custom class derived from ChatUIBase -> MonoBehaviour implementing ITranscriptUI

    • SendCharacterText(...) / SendPlayerText(...) -> DisplayMessage(TranscriptViewModel viewModel)

    • Finalize message -> CompleteMessage(string messageId)

    • Clear transcript/chat -> ClearAll()

    • UI activation per type -> Identifier + SetActive(bool active)

    hashtag
    Minimal migration steps

    1

    hashtag
    Create a new script

    Create a new script (for example, MyGameTranscriptUI.cs).

    2

    hashtag
    Important behavior notes

    • In-progress messages are typically keyed by speaker while streaming.

    • CompleteMessage(messageId) finalizes a bubble and removes it from the active in-progress map.

    • Text submission generally flows through IPlayerInputService.
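    The bookkeeping described above can be sketched as follows (Python for illustration; ChatTranscriptUI implements this in C#, and the class below is a simplified assumption of the behavior, not the SDK's actual code):

    ```python
    class TranscriptState:
        """Minimal model of streaming transcript bubbles."""

        def __init__(self):
            self.in_progress = {}  # speaker -> message ID of the streaming bubble
            self.completed = []

        def display_message(self, speaker, message_id, text):
            # While streaming, the active bubble is keyed by speaker, so
            # repeated chunks from the same speaker update a single bubble.
            self.in_progress[speaker] = message_id

        def complete_message(self, message_id):
            # Finalizing removes the bubble from the active in-progress map.
            for speaker, current_id in list(self.in_progress.items()):
                if current_id == message_id:
                    del self.in_progress[speaker]
                    self.completed.append(message_id)
    ```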

    hashtag
    Common pitfall

    If no transcript messages appear, verify:

    • The UI is active (SetActive(true)).

    • Identifier matches the transcript mode expected by your controller setup (for example, "Chat").


    hashtag
    Prebuilt UI Prefabs

    The new SDK includes prebuilt UI prefabs you can use directly or customize as needed:

    • Settings Panel Prefab: Packages/com.convai.convai-sdk-for-unity/Prefabs/SettingsPanel/SettingsPanel_Landscape.prefab

    • Transcript Chat Prefab: Packages/com.convai.convai-sdk-for-unity/Prefabs/TranscriptUI/TranscriptUI_Chat.prefab

    • Notification Prefab: Packages/com.convai.convai-sdk-for-unity/Prefabs/Notifications/Notification.prefab

    For teams migrating from the old SDK docs, this information was previously listed under Convai UI Prefabs.


    hashtag
    Migration Complete

    After completing the steps above:

    • Project uses the latest Convai SDK.

    • NPC interaction runs through ConvaiCharacter.

    • Scene defaults run through ConvaiDefaults.

    • Transcript UI follows the new ITranscriptUI pipeline.

    If you face issues after migration, check:

    • Missing script references.

    • API usage updates in your custom scripts.

    • Audio source setup on character objects.

    • Transcript UI activation and identifier matching.

    External API

    Learn how to integrate and configure the External API feature to enable your characters to access real-time information, create tasks, and interact with third-party platforms.

    hashtag
    Introduction

    The External API feature empowers your characters to interact intelligently with real-time data sources and third-party services. Whether it’s retrieving live weather updates, tracking sports scores, or creating tickets in platforms like Jira and Trello, this feature allows seamless API-based integration. With just a few configuration steps, your characters can fetch data, trigger workflows, and execute automated actions, making them significantly more capable.


    Adding Actions to your Character

    Follow these instructions to enable actions for your Convai-powered characters.

    hashtag
    Setting Up Action Configurations

    1. Select the Convai NPC character from the hierarchy.


    hashtag
    Set the Profile ID

    In the Runtime Identity section, enter a unique ID.

    Example: my_character

    Use lowercase letters, numbers, and underscores. Avoid spaces.

    3

    hashtag
    Set the Display Name

    In the Editor Label section, enter the display name that should appear in the Inspector.

    Example: My Character

    4

    hashtag
    Set the transport format

    Choose which supported Lip Sync schema this profile should use.

    Examples:

    • Use arkit for ARKit-compatible blendshape layouts

    • Use mha for MetaHuman rigs

    • Use cc4_extended for CC4 Extended rigs

    If your custom profile ID does not match one of those supported tokens, enable Override default token and enter the correct supported transport token manually.

    hashtag
    Set the registry priority

    Set Priority to a value higher than the built-in registry.

    Recommended starting value: 1

    3

    hashtag
    Add the profile to the registry

    Add your new ConvaiLipSyncProfileAsset to the Profiles list.

    4

    hashtag
    Place the registry in the correct Resources path

    For the SDK to discover it automatically, the registry must be placed under: Resources/LipSync/ProfileRegistries/

    Once the asset is saved there, Unity will include it on the next domain reload or Play Mode refresh.




    hashtag
    Add Convai Narrative Design Manager to character

    Add Convai Narrative Design Manager to the character object (or assign the character in the manager).

    3

    hashtag
    Sync with backend

    Click Sync with Backend in the manager inspector to fetch sections for that character.

    4

    hashtag
    Re-bind section events

    Re-bind section events (On Section Start, On Section End) in the manager.

    5

    hashtag
    Add Narrative Design Trigger to trigger objects

    Add Convai Narrative Design Trigger to trigger objects and assign the same ConvaiCharacter.

    6

    hashtag
    Fetch triggers and configure activation

    Click Fetch in the trigger inspector, select a trigger, and configure activation mode (Collision/Proximity/Manual/TimeBased).


    Result: custom UI should mainly render TranscriptViewModel.


    hashtag
    Use reference implementation

    Use SDK/Runtime/Presentation/Views/Transcript/Chat/ChatTranscriptUI.cs as reference.

    3

    hashtag
    Implement interfaces

    Implement MonoBehaviour + ITranscriptUI (and IInjectable if service injection is needed).

    4

    hashtag
    Keep required members

    Keep required members:

    • Identifier

    • IsActive

    • DisplayMessage(TranscriptViewModel viewModel)

    • CompleteMessage(string messageId)

    • ClearAll()

    • SetActive(bool active)

    • CompletePlayerTurn()

    5

    hashtag
    Inject services if needed

    If needed, inject services via InjectServices(IServiceContainer container):

    • IConvaiCharacterLocatorService

    • IPlayerInputService

    6

    hashtag
    Rewire prefab references

    Rewire prefab references (bubble prefab, container, input field, fade components) and assign the new component where transcript UIs are registered.

    Character colors are resolved through IConvaiCharacterLocatorService.


    private ConvaiChatUIHandler _convaiChatUIHandler;
    
    private void OnEnable()
    {
        // Find and assign the ConvaiChatUIHandler component in the scene
        _convaiChatUIHandler = ConvaiChatUIHandler.Instance;
        if (_convaiChatUIHandler != null) _convaiChatUIHandler.UpdateCharacterList();
    }
    _convaiChatUIHandler.SendPlayerText(input);
    _convaiChatUIHandler.SendCharacterText(characterName, currentResponseAudio.AudioTranscript.Trim());
    using Convai.Scripts.Utils;
    using UnityEngine;
    
    public class CustomChatUI : ChatUIBase
    {
        // Implement the required methods from ChatUIBase here.
    }
    public override void Initialize(GameObject uiPrefab)
    {
        // Instantiate and set up your custom UI prefab here.
    }
    
    public override void SendCharacterText(string charName, string text, Color characterTextColor)
    {
        // Handle sending character text to your custom UI here.
    }
    
    public override void SendPlayerText(string playerName, string text, Color playerTextColor)
    {
        // Handle sending player text to your custom UI here.
    }
    [Tooltip("Prefab for the customChatUI.")]
    public GameObject customChatUIPrefab;
    
    private void InitializeUIStrategies()
    {
        // Existing UI types
        InitializeUI(chatBoxPrefab, UIType.ChatBox);
        InitializeUI(questionAnswerPrefab, UIType.QuestionAnswer);
        InitializeUI(subtitlePrefab, UIType.Subtitle);
    
        // Custom UI type
        InitializeUI(customChatUIPrefab, UIType.Custom); // Make sure to define UIType.Custom in the UIType enum
    }
    
    private void InitializeUI (GameObject uiPrefab, UIType uiType)
    {
        // existing code...
    
        // Add your custom UI initialization here
        if (uiType == UIType.Custom)
        {
            CustomChatUI customUIComponent = uiPrefab.GetComponent<CustomChatUI>();
            if (customUIComponent == null)
            {
                Debug.LogError("CustomChatUI component not found on prefab.");
                return;
            }
    
            customUIComponent.Initialize(uiPrefab);
            GetUIAppearances[uiType] = customUIComponent;
        }
    }
    public enum UIType
    {
        ChatBox,
        QuestionAnswer,
        Subtitle,
        Custom // Your custom UI type
    }
    // Old
    public class PlayerHealth : MonoBehaviour
    {
        [SerializeField] private DynamicInfoController _dynamicInfoController;
        private int _health = 100;
    
        private void Start()
        {
            _dynamicInfoController.SetDynamicInfo("Player Health is " + _health);
            Debug.Log("Player Health is " + _health);
        }
    }
    
    // New
    public class PlayerHealth : MonoBehaviour
    {
        [SerializeField] private ConvaiRoomManager _convaiRoomManager;
        private int _health = 100;
    
        private void Start()
        {
            _convaiRoomManager.SendDynamicInfo("Player Health is " + _health);
            Debug.Log("Player Health is " + _health);
        }
    }
    // Old
    if (convaiNPC.TryGetComponent(out NarrativeDesignTrigger narrativeDesignTrigger))
    {
        string message = "Player has collected enough resources";
        narrativeDesignTrigger.InvokeSelectedTrigger(message);
    }
    
    // New
    if (convaiCharacter.TryGetComponent(out ConvaiNarrativeDesignTrigger narrativeDesignTrigger))
    {
        string message = "Player has collected enough resources";
        narrativeDesignTrigger.SetTriggerMessage(message);
        narrativeDesignTrigger.InvokeTrigger();
    }
    hashtag
    Configuration and Usage

    hashtag
    1. Accessing the External API Page

    Navigate to the External API section in your dashboard. Here you can view existing API methods, activate or deactivate them, and create new methods. To add a new API method, click Add API Method.


    hashtag
    2. Creating an API Method

    Method Fields Overview

    • Method Name – Select an existing template or enter a unique name for your method.

    • Method Description – Provide a concise explanation of the method’s functionality.

    • Input Description (JSON Format) – Define required input parameters and their descriptions.

    • Implementation Code – Write the Python implementation for your API logic.

    • Inputs – Enter test parameters for validating your method.

    • Output – Displays the result when you click Test API.


    hashtag
    Example 1 – Get Weather Data

    Method Name Get Weather

    Method Description Fetches current weather data for a given city

    Input Description

    Implementation Code

    Setup Notes

    1. Sign up at OpenWeatherMaparrow-up-right and get your API key.

    2. Replace <your-api-key> in the code with your key.

    Test Input

    Click Test API.

    A successful Output Example:

    hashtag
    Activate the method

    If the test passes, click Save Changes, return to the main API list, and enable the method by toggling Connect to green.

    hashtag
    Test with a character

    Once activated, test the method in a conversation with your character. As seen in the screenshot below, the character correctly returned the current weather for Roma and Wrangell.


    hashtag
    Example 2 – Create Jira Support Ticket

    Method Name Creating Support Tickets

    Method Description Creates a support ticket on Jira

    Input Description

    Implementation Code

    Where to Find Required Values

    • JIRA_DOMAIN – Found in your Jira account URL. Example: https://mycompany.atlassian.net → JIRA_DOMAIN = "mycompany.atlassian.net"

    • EMAIL – Your Atlassian login email.

    • API_TOKEN – Create from Atlassian API Tokensarrow-up-right.

    • JIRA_PROJECT_KEY – Found in your project URL or next to the project name.

    • ISSUE_TYPE – Must be valid in your Jira project (Story, Task, Bug).

    Test Input

    Click Test API.

    A successful Output Example:

    Activate the method If the test passes, click Save Changes, return to the main API list, and enable the method by toggling Connect to green.

    Test with a character Once activated, test the method in a conversation with your character. As seen in the screenshot below, the character successfully created a Jira ticket and returned the ticket key.


    hashtag
    Limitations and Supported Environment

    circle-exclamation

    Supported LLM Models: GPT-4o, GPT-4o-mini, Claude-3.5, Claude 4.0

    circle-exclamation

    Max Execution Time: 5 seconds

    circle-exclamation

    Python Version: 3.12

    circle-exclamation

    Libraries Available: Standard library + requests


    hashtag
    Conclusion

    By configuring the External API feature, you can transform your characters into powerful, data-driven assistants. From retrieving real-time weather information to creating Jira tickets directly from a conversation, the possibilities are vast. This integration capability enables highly interactive, automated, and intelligent workflows.

    2. Scroll down to the ConvaiNPC script attached to your character.

    3. Click the "Add Component" button.

    1. Use the checkbox to add the action script to the NPC Actions.

    2. Click "Apply Changes" to confirm.

    hashtag
    Pre-defined Actions

    Convai offers predefined actions for a quick start.

    1. Click the "+" button to add a new action.

    2. From the dropdown menu, select "Move To."

    1. Enter the action name as "Move To" (the name doesn't have to match the action choice name).

    2. Leave the Animation Name field empty for now.

    Repeat these steps to add more actions like "Pickup" and "Drop" etc.

    hashtag
    Adding an Object in the Scene

    1. Add any object into the scene—a sphere, a cube, a rock, etc.—that can be interacted with

    2. Resize and place the object in your scene.

    hashtag
    Adding the Convai Interactables Data Script

    • Create an empty GameObject and name it "Convai Interactables."

    • Attach the Convai Interactables Data script to this GameObject.

    • Add characters and objects to the script by clicking the "+" button and attaching the corresponding GameObjects.

    Convai Interactables Setup
    • Add the "There" object to the Objects list, so that we can use the Dynamic Move Target indicator.

    • Add the Dynamic Move Target Indicator and set up a NavMesh agent on your NPC.

    hashtag
    Setting Up NavMesh

    To ensure your NPCs can navigate the scene:

    1. Bake a NavMesh for your scene if you haven't already:

      • Go to Window > AI > Navigation.

      • In the Navigation window, under the Bake tab, adjust the settings as needed.

      • Click "Bake" to generate the NavMesh.

    2. Ensure that the NPC character has a NavMeshAgent component:

      • If not already attached, click "Add Component" and search for NavMeshAgent.

      • Adjust the Agent Radius, Speed, and other parameters according to your NPC's requirements.

    hashtag
    Adding a Dynamic Move Target Indicator

    To visually indicate where your NPC will move:

    • Create a new empty GameObject in the scene and name it accordingly or use the pre-made prefab named Dynamic Move Target Indicator.

    • Link this Move Target Indicator to your NPC's action script so it updates dynamically when you point the cursor to the ground and ask the NPC to move to "There".

    hashtag
    Test the Setup

    1. Click "Play" to start the scene.

    2. Ask the NPC, "Bring me the Box."

    3. If set up properly, the NPC should walk up to the box and bring it to you.

    circle-exclamation

    This feature is currently experimental and can misbehave. Feel free to try it out and leave us any feedback.

    hashtag
    Adding Custom Actions to Your Unity NPC in Convai

    hashtag
    Introduction

    Make your NPC perform custom actions like dancing.

    hashtag
    Action that Only Requires an Animation

    1. Locate the dance animation file within our plugin.

    2. Incorporate this animation into your NPC's actions.

    hashtag
    Setting Up the Animator Controller

    1. Open the Animator Controller from the Inspector window.

    2. Drag and drop the dance animation onto the controller, creating a new node named "Dancing."

    hashtag
    Adding custom Animation Action

    1. Go to the Action Handler Script attached to your Convai NPC.

    2. Add a new action named "Dancing."

    3. In the Animation Name field, enter "Dancing" (it must exactly match the Animator Controller node name).

    4. Leave the enum as "None."

    hashtag
    Testing the Custom Action

    1. Click "Play" to start the scene.

    2. Instruct the NPC, "Show me a dance move," and the NPC should start dancing.

    hashtag
    Creating Complex Custom Actions in Unity with Convai: Throwing a Rock

    hashtag
    Introduction

    Adding advanced custom actions, such as a throw action, to your NPC.

    hashtag
    Animation Requirement

    1. Grab a throw animation from Mixamoarrow-up-right or anywhere you like.

    2. Import it into Unity.

    hashtag
    Setting Up the Animator Controller

    1. Drag and drop the throw animation onto the controller, creating a new node named "Throwing." (Follow steps in Action that Only Requires an Animation)

    hashtag
    Action Handler Script Setup

    1. Add the "Throw" enum to the script.

    2. In the "Do Action" function, add a switch case for the throw action.

    3. Define the "Throw()" function.

    hashtag
    Adding the Throw Action

    1. Add a new action named "Throw" and select the "Throw" enum.

    2. Leave the animation name field empty.

    hashtag
    Adding the Object (Rock) to the Convai Interactables Data script

    1. Add any rock prefab into the scene.

    2. Add the rock to the Convai Interactable Data script.

    hashtag
    Adding a location to Convai Interactables Data script

    1. Add a stage/new location in the ground of the scene.

    2. Add that new location game object in the Convai Interactable Data.

    hashtag
    Testing the Complex Action

    1. Click "Play" to start the scene.

    2. Instruct the NPC, "Pick up the rock and throw it from the stage."

    3. If everything is set up properly, the NPC should pick up the rock and throw it from the stage.


    {
        "parameters": {
            "city": {
                "type": "string",
                "description": "Name of the city to get weather information for (e.g., 'London', 'New York', 'Tokyo')"
            }
        },
        "required": [
            "city"
        ]
    }
    import requests
    
    API_KEY = "<your-api-key>"
    
    def handle_event(data):
        city = data.get("city")
        url = f"https://api.openweathermap.org/data/2.5/weather?q={city}&appid={API_KEY}"
        response = requests.get(url)
        weather_data = response.json()
        return {"weather": weather_data["weather"][0]["description"]}
    {
      "city": "New York"
    }
    {
      "weather": "clear sky"
    }
    {
        "parameters": {
            "summary": {
                "type": "string",
                "description": "Short title for the Jira ticket"
            },
            "description": {
                "type": "string",
                "description": "Detailed description of the Jira issue"
            }
        },
        "required": [
            "summary",
            "description"
        ]
    }
    import requests
    from requests.auth import HTTPBasicAuth
    import json
    
    # Jira configuration
    JIRA_DOMAIN = "mycompany.atlassian.net"  # Replace with your Jira domain
    EMAIL = "[email protected]"               # Replace with your Atlassian account email
    API_TOKEN = "abc123xyz456..."            # Replace with your Jira API token
    JIRA_PROJECT_KEY = "EX"                  # Replace with your Jira project key
    ISSUE_TYPE = "Story"                     # Issue type: Story, Task, or Bug
    
    API_ENDPOINT = f"https://{JIRA_DOMAIN}/rest/api/3/issue"
    
    # Standard headers required by JIRA.
    headers = {"Accept": "application/json", "Content-Type": "application/json"}
    
    def create_jira_ticket(ticket_data):
        """
        Create a JIRA ticket using provided ticket_data dictionary.
    
        Expected ticket_data keys:
          - summary: (str) Brief summary of the issue.
          - description: (str) Detailed description of the issue.
    
        Returns:
          - JSON response if the ticket is created successfully.
          - Error message if there was an error.
        """
        # Convert plain text description to Atlassian Document Format
        description_adf = {
            "version": 1,
            "type": "doc",
            "content": [
                {
                    "type": "paragraph",
                    "content": [
                        {"type": "text", "text": ticket_data.get("description", "")}
                    ],
                }
            ],
        }
    
        # Construct the payload for the JIRA issue
        payload = {
            "fields": {
                "project": {"key": JIRA_PROJECT_KEY},
                "summary": ticket_data.get("summary"),
                "description": description_adf,
                "issuetype": {"name": ISSUE_TYPE},
            }
        }
    
        # Convert the payload to a JSON string
        payload_json = json.dumps(payload)
    
        # Send a POST request to the JIRA API endpoint
        response = requests.post(
            API_ENDPOINT,
            data=payload_json,
            headers=headers,
            auth=HTTPBasicAuth(EMAIL, API_TOKEN),
        )
    
        # Check for a successful creation (HTTP 201 Created)
        if response.status_code == 201:
            return response.json()
        else:
            return {"error": f"Failed to create ticket: {response.status_code}"}
    
    
    def handle_event(data):
        return create_jira_ticket(data)
    Sample request:

    {
      "summary": "This is to test ticket creation",
      "description": "Created using External API"
    }
    Sample response:

    {
      "id": "10004",
      "key": "EX-5",
      "self": "https://mycompany.atlassian.net/rest/api/3/issue/10004"
    }
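    The Atlassian Document Format wrapping is the part of this payload that most often goes wrong, and it can be verified without hitting Jira. A minimal sketch that rebuilds the same payload shape as create_jira_ticket; `build_payload` is an illustrative helper, and "EX" and "Story" are the placeholder values from the snippet:

```python
# Rebuilds the same payload create_jira_ticket sends, without any network
# call, so the Atlassian Document Format (ADF) wrapping can be inspected.
import json

def build_payload(ticket_data, project_key="EX", issue_type="Story"):
    # Plain text description wrapped in a single-paragraph ADF document.
    description_adf = {
        "version": 1,
        "type": "doc",
        "content": [
            {
                "type": "paragraph",
                "content": [
                    {"type": "text", "text": ticket_data.get("description", "")}
                ],
            }
        ],
    }
    return {
        "fields": {
            "project": {"key": project_key},
            "summary": ticket_data.get("summary"),
            "description": description_adf,
            "issuetype": {"name": issue_type},
        }
    }

payload = build_payload({
    "summary": "This is to test ticket creation",
    "description": "Created using External API",
})
print(json.dumps(payload, indent=2))
```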

    Creating a Custom Map

    Create a custom Lip Sync map, understand the Map Inspector, and connect supported profile channels to your character's blendshapes in Unity.

    hashtag
    Introduction

    A Lip Sync Map defines how incoming Lip Sync channels are routed to the blendshapes on a specific character mesh.

    You need a custom map when your character does not follow the built-in blendshape naming conventions, or when you want more control over how specific channels behave.

    This page walks through the Map Inspector and shows how to build a custom map from scratch.

    hashtag
    Before You Start

    Before creating a map, make sure you already know which supported profile format your character uses.

    Currently supported profile formats are:

    • arkit

    • mha

    • cc4_extended

    Your map must target the correct profile. A map only works correctly when its target profile matches the active Lip Sync profile used by the character setup.
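    The matching rule can be expressed as a tiny check. A hedged sketch; `can_apply_map` is an illustrative helper, not SDK API:

```python
# Tiny sketch of the profile-matching rule described above: a map only
# applies when its target profile matches the active Lip Sync profile.
SUPPORTED_PROFILES = {"arkit", "mha", "cc4_extended"}

def can_apply_map(map_target_profile, active_profile):
    """Return True only when the map's target matches the active profile."""
    if map_target_profile not in SUPPORTED_PROFILES:
        raise ValueError(f"Unsupported profile: {map_target_profile}")
    return map_target_profile == active_profile

print(can_apply_map("arkit", "arkit"))         # True
print(can_apply_map("arkit", "cc4_extended"))  # False
```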

    hashtag
    Understanding the Map Inspector

    When you select a ConvaiLipSyncMapAsset, the Inspector is divided into several sections.

    hashtag
    Header

    At the top of the Inspector, you will see a summary of the current mapping state.

    Counter
    Meaning

    The profile badge indicates which profile this map targets.

    hashtag
    Configuration Section

    This section defines the map identity and global behavior.

    hashtag
    Target Profile

    Select the profile that this map is built for.

    This must match the profile used by the Lip Sync component.

    hashtag
    Description

    An optional editor-only note for your own project organization.

    hashtag
    Global Modifiers

    These settings affect the output of the map as a whole.

    Setting
    Description

    A global multiplier around 0.8 is often a good starting point for natural-looking results on many rigs.

    hashtag
    Allow Unmapped

    When enabled, channels without explicit entries can be forwarded directly using the source channel name as the target blendshape name.

    This can be useful during setup or testing, especially when your character already follows most of the expected naming convention.

    hashtag
    Tools Section

    This section helps populate or import mappings more quickly.

    hashtag
    From Mesh: Auto Detect

    This is the fastest way to generate mappings for a real character.

    1. Add a preview mesh using a SkinnedMeshRenderer

    2. Choose a matching mode

    3. Run Auto-Detect From Mesh

    The SDK compares the mesh blendshape names to the profile source channels and tries to match them automatically.

    Matching modes

    Mode
    Behavior

    Recommended workflow:

    1. Start with Exact

    2. If coverage is low, try Contains

    3. If needed, try Fuzzy
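    The three modes amount to progressively looser string comparisons. A rough Python sketch of the idea; the stripped prefixes here are illustrative examples, not the SDK's actual internal list:

```python
# Rough sketch of the three matching modes described above.
# The stripped prefixes are hypothetical; the SDK's real list may differ.
RIG_PREFIXES = ("blendshape.", "head.", "face_")  # illustrative examples

def strip_prefixes(name):
    """Lowercase a name and strip one known rig prefix, if present."""
    lowered = name.lower()
    for prefix in RIG_PREFIXES:
        if lowered.startswith(prefix):
            return lowered[len(prefix):]
    return lowered

def matches(channel, blendshape, mode):
    a, b = channel.lower(), blendshape.lower()
    if mode == "exact":        # names must match exactly, ignoring case
        return a == b
    if mode == "contains":     # one name can contain the other
        return a in b or b in a
    if mode == "fuzzy":        # strip common rig prefixes, then compare
        return strip_prefixes(channel) == strip_prefixes(blendshape)
    raise ValueError(mode)

print(matches("jawOpen", "JawOpen", "exact"))          # True
print(matches("jawOpen", "head.jawOpen", "contains"))  # True
print(matches("jawOpen", "face_jawopen", "fuzzy"))     # True
```

    This is why the Exact-first workflow is recommended: each step down the list accepts more candidate pairs, and with that comes more risk of a wrong match.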

    hashtag
    From Mapping Text

    You can also import mapping data from JSON.

    Available options include:

    • importing a mapping file

    • pasting mapping text directly

    • copying the current mapping as JSON

    This is useful for team workflows, backup, and migration.

    hashtag
    Mapping Actions

    Action
    Result

    hashtag
    Bulk Operations

    Bulk tools help you manage large maps quickly.

    Action
    Result

    These operations are especially useful when debugging or isolating part of a face rig.

    hashtag
    Mappings Section

    This is the main routing table of the map.

    Each row is a mapping entry that connects one source channel to one or more target blendshapes.

    Column
    Description

    You can search the list, filter by enabled entries, and isolate unmapped items to finish setup faster.

    hashtag
    Mapping Entry Behavior

    Each mapping entry can include additional controls beyond its visible table fields.

    Field
    Description

    A single source channel can also drive multiple target blendshapes. This is useful when one expression needs to affect several shapes on the mesh.

    hashtag
    Output Processing Order

    The final output value is calculated in this order:

    If Ignore Global Modifiers is enabled, the last two steps are skipped for that entry.
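    The documented order (raw value, then per-entry multiplier, per-entry offset, clamp, global multiplier, global offset) can be sketched as a small function. Illustrative only; the component computes this internally:

```python
def map_output(raw, mult=1.0, offset=0.0, clamp_min=0.0, clamp_max=1.0,
               global_mult=1.0, global_offset=0.0, ignore_global=False):
    """Apply a mapping entry in the documented order:
    raw -> per-entry multiplier -> per-entry offset -> clamp
        -> global multiplier -> global offset."""
    value = raw * mult + offset
    value = max(clamp_min, min(clamp_max, value))
    if not ignore_global:
        # Ignore Global Modifiers skips these last two steps.
        value = value * global_mult + global_offset
    return value

# A per-entry multiplier pushes the value past 1.0, the clamp catches it,
# and the global multiplier scales the clamped result.
print(map_output(1.0, mult=1.2, global_mult=0.8))  # 0.8
```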

    hashtag
    Create a Custom Map

    1

    hashtag
    Create the map asset

    In the Project window, create a new map asset:

    Use a descriptive name such as:

    2

    hashtag
    Practical Tips

    hashtag
    Use coverage as a setup indicator

    Coverage is one of the fastest ways to judge how complete your mapping is.

    hashtag
    Start simple

    Begin with identity mapping or auto-detect, then refine only the entries that actually need adjustment.

    hashtag
    Disable what your rig does not support

    If your character has no relevant target for a channel, disabling that entry is often cleaner than leaving it partially configured.

    hashtag
    Tune jaw motion carefully

    Jaw-related channels often benefit from clamping so that speech stays expressive without becoming exaggerated.

    hashtag
    Conclusion

    A custom map gives you precise control over how supported Lip Sync channels drive a specific character mesh. Once the target profile is set correctly, the map becomes the layer that turns incoming facial data into stable, character-specific animation inside Unity.

    circle-info

    Need help? For questions, please visit the Convai Developer Forum.

    Troubleshooting Guide

    Troubleshoot common issues with Convai's Unity plugin. Get solutions for seamless AI integration.

    hashtag
    Common Issues (FAQ)

    hashtag
    Q. I cannot see the Convai menu.

    A. Check whether there are any errors in the console. Unity must be able to compile all scripts before it can display any custom editor menu options, so resolving all console errors will fix this issue.

    hashtag
    Q. There are a lot of errors on my console.

    A. Console errors stemming from the Convai Unity plugin have two primary causes. Use the links below to fix them quickly.

    hashtag
    Q. I am talking to the character, but I cannot see the user transcript and the character does not seem to be coherently responding to what I am saying.

    A. This may indicate issues with the microphone. Please ensure that the microphone is connected correctly and that the application has permission to access it.

    hashtag
    Q. The animations for my characters are looking very weird.

    A. The animation avatar in use might be incompatible with the character mesh. Assigning a compatible avatar resolves the issue.

    hashtag
    Q. There are two Settings Panel Buttons in Mobile Transcript UI.

    A. If you are using Unity 2021, unexpected prefab variant issues may arise because the Mobile Transcript UIs are variants of the main transcript UI prefab. Changes to the Prefab system mean this works correctly in Unity 2022, but in Unity 2021 you may encounter duplicated elements. Remove the redundant Settings Panel Button to address this problem.

    hashtag
    Q: The lipsync is very faint or not visible.

    A: The animations in use may be overriding facial blendshapes. Editing the animations to remove the facial keyframes should fix any issues related to lip sync.

    A: The lip sync script also requires that the avatar's jaw bone mapping be left free (set to None) so that the script can manipulate the jaw bone itself.

    hashtag
    Q: I'm facing security permission issues using the grpc_csharp_ext.bundle DLL inside the Unity Editor on MacOS

    A: macOS's strict security measures can block certain unsigned external DLLs. To address this, you can manually allow the DLL in "Security & Privacy" settings, modify Gatekeeper's settings through Terminal, ensure correct file permissions for the DLL, check its settings in Unity, and update the Mac Configuration in Unity's Player Settings.

    hashtag
    Q: I'm not able to talk to my character after building my Unity project for macOS (Intel64+Apple Silicon builds), especially on Intel Macs

    A: The issue is rooted in the grpc_csharp_ext.bundle used in Unity for networking. This DLL has separate versions optimized for Intel and Apple Silicon architectures. When trying to create a Universal build that serves both, compatibility problems arise, especially on Intel Macs. Presently, the best solution is to use Standalone build settings specific to each architecture.

    hashtag
    Error Index

    Follow this Table to navigate to our most common errors.

    Name
    Sample Error
    Reason for Error

    For any other issues, feel free to contact us on the Convai Developer Forum.

    • Enable Eyes Only - Enables only eye-related channels
    • Enable Mouth Only - Enables only mouth-related channels
    • Enable Brows Only - Enables only brow-related channels
    • Clamp Min / Max - Limits the final output range

    hashtag
    Select the target profile

    In the Configuration section, set the Target Profile to the profile your character uses.

    3

    hashtag
    Populate the entries

    You can choose one of two common workflows.

    Option A: Auto-detect from mesh

    1. Add your SkinnedMeshRenderer as the preview mesh

    2. Choose Exact mode first

    3. Run Auto-Detect From Mesh

    4. Review the header coverage result

    5. If needed, retry with Contains or Fuzzy

    Option B: Initialize defaults and edit manually

    1. Click Initialize Defaults

    2. Review the generated identity-style entries

    3. Replace target names wherever your mesh uses different blendshape names

    4

    hashtag
    Tune the motion

    Adjust the map until the character behaves naturally.

    Common adjustments include:

    • lowering the global multiplier if expressions feel too strong

    • adding per-entry clamping for channels like jaw open

    • disabling channels your rig should not use

    • using fan-out when one source should drive multiple targets

    5

    hashtag
    Assign the map

    Once the map is ready, assign it to the Lip Sync Map field on your character's Lip Sync component.

    When a valid custom map is assigned and its target profile matches, it takes precedence over the built-in default map.

    Header counters (Counter - Meaning):

    • Total - Total number of mapping entries
    • Enabled - Number of active entries
    • Mapped - Number of entries with at least one assigned target
    • Coverage - Percentage of enabled entries that are mapped

    Global modifiers (Setting - Description):

    • Multiplier - Scales all output values
    • Offset - Adds a constant value to all output values

    Matching modes (Mode - Behavior):

    • Exact - Names must match exactly, ignoring case
    • Contains - One name can contain the other
    • Fuzzy - Common rig prefixes are stripped before comparison

    Mapping actions (Action - Result):

    • Initialize Defaults - Creates identity-style entries for the selected profile
    • Clear All - Removes all entries
    • Sort A-Z - Sorts entries alphabetically
    • Copy Mapping JSON - Copies the current map as JSON

    Bulk operations (Action - Result):

    • Enable All - Enables every entry
    • Disable All - Disables every entry
    • Reset Multipliers - Resets all per-entry multipliers to 1.0
    • Reset Offsets - Resets all per-entry offsets to 0

    Mapping entry fields (Field - Description):

    • Source Blendshape - The incoming channel name
    • Target Name(s) - One or more mesh blendshape targets
    • Mult - Per-entry multiplier
    • Offs - Per-entry offset
    • Enabled - Turns the entry on or off
    • Use Override Value - Replaces the incoming value with a fixed value
    • Override Value - Constant value used when override is enabled
    • Ignore Global Modifiers - Skips the map-wide multiplier and offset

    Output processing order:

    rawValue -> per-entry multiplier -> per-entry offset -> clamp
             -> global multiplier -> global offset

    Map asset creation menu path:

    Create > Convai > LipSync > Map Asset

    Example asset name:

    LipSyncMap_MyCharacter

    hashtag
    Error Details

    Enabled Assembly Validation

    Sample error: Assembly 'Assets/Convai/Plugins/Grpc.Core.Api/lib/net45/Grpc.Core.Api.dll' will not be loaded due to errors: Grpc.Core.Api references strong named System.Memory Assembly references: 4.0.1.1 Found in project: 4.0.1.2.

    Reason: Unity, by default, checks for exact version numbers of included assemblies. This is not necessary for our plugin, since we use the latest libraries. Fix: disable Assembly Validation.

    Missing Newtonsoft Json

    Sample error: Assets\Convai\Plugins\GLTFUtility\Scripts\Spec\GLTFPrimitive.cs(8,4): error CS0246: The type or namespace name 'JsonPropertyAttribute' could not be found (are you missing a using directive or an assembly reference?)

    Reason: Our plugin needs Newtonsoft Json as a dependency. It is usually present as part of Unity, but occasionally it can be missing.

    Missing Animation Rigging

    Sample error: Assets\Convai\Scripts\Utils\HeadMovement.cs (2,30): error CS0234: The type or namespace name 'Rigging' does not exist in the namespace 'UnityEngine.Animations' (are you missing an assembly reference?)

    Reason: We use the Animation Rigging package for eye and neck tracking. If Unity does not add it automatically, add it manually from the Package Manager.

    Microphone Permission Issues

    Symptom: The microphone icon lights up, but there is no user transcript in the chat UI and the character does not seem to reply to what the user is saying.

    Reason: The plugin requires microphone access, which is sometimes not enabled by default.

    Default Animations Incompatibility

    Symptom: The default animations that ship with the plugin seem broken; the hands intersect with the body.

    Reason: The animation avatar is incompatible with the character mesh.

    Animations have Facial Blendshapes

    Symptom: The lip sync on characters is either not visible or very faint.

    Reason: Some types of animations control facial blendshapes. These animations prevent the lip sync scripts from properly editing the facial blendshapes.

    Jaw Bone in Avatar is not Free

    Symptom: The lip sync on characters is either not visible or very faint.

    Reason: The animation avatar for the character may be using the jaw bone. Setting the jaw mapping to None allows the script to manipulate the jaw bone freely.

    Mac Security Permission Issue

    Symptom: Security permission issues with the grpc_csharp_ext.bundle DLL in Unity on macOS.

    Reason: macOS's security protocols can prevent certain unsigned external DLLs, like grpc_csharp_ext.bundle, from functioning correctly in Unity.

    Microphone Permission Issue with Universal Builds on Intel Macs in Unity

    Symptom: No microphone access request pops up.

    Reason: Incompatibility between the Intel and Apple Silicon versions of grpc_csharp_ext.bundle when attempting a Universal build.

    Add Lip Sync to Your Character

    Learn how to add and configure the Convai Lip Sync component on your character, assign profiles and maps, configure playback and latency settings, and verify real-time facial animation in Unity.

    hashtag
    Introduction

    The Convai Lip Sync component connects real-time speech animation to your character's face. While your character is speaking, it receives incoming Lip Sync data, processes it through the active Lip Sync map, and drives the blendshapes on your character's meshes automatically.

    This page explains how to add the component, what each Inspector section does, and how to configure it correctly for a working Lip Sync setup in Unity.


    hashtag
    Before You Start

    Before adding the component, make sure your setup includes:

    • A character in the scene with at least one SkinnedMeshRenderer that contains facial blendshapes

    • A Convai Character component on the same GameObject


    hashtag
    Add the Component

    Select your character's root GameObject in the Hierarchy, then in the Inspector choose:

    Once added, the component appears with four main sections in the Inspector:

    • Core Setup

    • Playback & Behavior

    • Streaming & Latency

    • Live Status

    hashtag
    Core Setup

    This is the main setup section. It defines which Lip Sync profile the character uses, which map is applied, and which meshes will be animated.

    hashtag
    Profile

    The Profile dropdown selects the Lip Sync profile used by the character.

    This tells the system which channel schema to expect for the current setup.

    Available options are:

    Option
    Use when your character is...

    This setting must match the format your character is designed to work with. If the wrong profile is selected, incoming channels will not line up correctly and the face will animate incorrectly.

    For more detail on profile behavior and supported formats, see Lip Sync Profiles and Mappings.

    hashtag
    Mapping

    The Mapping field assigns the ConvaiLipSyncMapAsset used by the component.

    A map connects incoming Lip Sync channels to the actual blendshape names on your character meshes.

    Buttons next to the field:

    Button
    What it does

    If this field is left empty, the component uses the built-in default map for the selected profile. For many standard ARKit, MetaHuman, or CC4 Extended setups, this is enough to get started.

    If your character uses custom blendshape names, create and assign a custom map instead. For that workflow, see Creating a Custom Map.

    hashtag
    Target Meshes

    The Target Meshes list defines which SkinnedMeshRenderer components will receive blendshape animation.

    You can populate this list in three ways:

    • Click + to add a slot manually

    • Drag a SkinnedMeshRenderer into an existing slot

    • Click Auto-Find to search the current GameObject and all children automatically

    After the list is populated, the component shows a summary such as:

    This indicates how many meshes were found and how many total blendshape slots are available across them.

    If this count is 0, there is nothing for the Lip Sync system to animate.

    For most characters, Auto-Find is the fastest way to build this list. After that, remove any meshes that should not be driven, such as clothing or accessories with no facial blendshapes.

    hashtag
    Playback & Behavior

    This section controls how the facial animation feels during playback.

    hashtag
    Lip Smoothing

    Lip Smoothing controls how strongly incoming values are smoothed from frame to frame.

    Range: 0 to 0.9. Default: 0.5.

    Behavior:

    • 0: no smoothing, more direct but potentially jittery

    • 0.9: very smooth, but slower to react

    • 0.5: balanced default for most characters

    A higher value makes facial motion feel softer and more stable. A lower value makes the face react more quickly to incoming changes.
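    Frame-to-frame smoothing of this kind is commonly implemented as an exponential moving average. A hedged sketch of what the 0 to 0.9 factor implies; the SDK's exact filter is internal:

```python
def smooth_stream(values, smoothing=0.5):
    """Exponential-style smoothing sketch: 0 = direct passthrough,
    values near 0.9 = very smooth but slower to react."""
    out, prev = [], 0.0
    for v in values:
        # Blend the previous output with the incoming value.
        prev = prev * smoothing + v * (1.0 - smoothing)
        out.append(round(prev, 3))
    return out

# A step from 0 to 1 passes through instantly with no smoothing,
# but approaches 1 gradually at the default setting.
print(smooth_stream([1.0, 1.0, 1.0], smoothing=0.0))  # [1.0, 1.0, 1.0]
print(smooth_stream([1.0, 1.0, 1.0], smoothing=0.5))  # [0.5, 0.75, 0.875]
```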

    hashtag
    Fade Transition

    Fade Transition controls how long it takes for the face to return to neutral after speech ends.

    Range: 0.05 to 2 seconds. Default: 0.2 seconds.

    Behavior:

    • 0.05: nearly instant return to neutral

    • 0.2: natural default for most humanoid characters

    • 2.0: very slow fade

    This helps avoid abrupt snapping when speech finishes.
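    Assuming a linear ramp, the fade can be pictured as interpolating the last speaking value down to neutral over the configured duration. A sketch only; the component's actual curve is internal:

```python
def fade_to_neutral(last_value, fade_seconds, t):
    """Blendshape value t seconds after speech ends, ramping linearly to 0."""
    if fade_seconds <= 0 or t >= fade_seconds:
        return 0.0
    return last_value * (1.0 - t / fade_seconds)

# With the 0.2 s default, a jaw value of 0.8 reaches neutral in 0.2 s.
print(fade_to_neutral(0.8, 0.2, 0.0))   # 0.8
print(fade_to_neutral(0.8, 0.2, 0.1))   # 0.4 (halfway through the fade)
print(fade_to_neutral(0.8, 0.2, 0.25))  # 0.0 (past the fade window)
```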

    hashtag
    A/V Sync Offset

    A/V Sync Offset shifts Lip Sync playback earlier or later relative to the audio.

    Range: -0.5 to +0.5 seconds. Default: 0.

    Behavior:

    • Negative values: lips move slightly before audio

    • Positive values: lips move slightly after audio

    • 0: no timing offset

    In most setups, this should remain at 0 unless you consistently notice visual desync during playback.

    hashtag
    Streaming & Latency

    This section controls how incoming Lip Sync data is buffered and played back.

    For most users, the default setting is the right choice.

    hashtag
    Latency Mode

    Latency Mode applies a preset buffering strategy.

    Available modes:

    Mode
    Best for
    Trade-off

    Internal values used by each preset:

    Mode
    Max Buffer
    Min Headroom

    hashtag
    Max Buffered Seconds

    This defines how much animation data can accumulate before playback begins.

    A larger value improves stability on inconsistent connections, but increases visible delay.

    This field is editable only in Custom mode.

    hashtag
    Min Resume Headroom

    If playback runs out of buffered frames and pauses, this determines how much data must build up before playback resumes.

    A higher value makes resume behavior more conservative and stable.

    This field is editable only in Custom mode.

    For most projects, leave Latency Mode on Balanced.
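    The interaction between Max Buffered Seconds and Min Resume Headroom can be sketched as a simple state decision. The thresholds below use the Balanced preset values; the runtime logic itself is internal to the SDK:

```python
def playback_state(buffered, playing, min_headroom=0.12, max_buffer=3.0):
    """Tiny sketch of buffer-driven playback state.
    buffered: seconds of animation currently queued.
    playing:  whether playback was running last frame."""
    buffered = min(buffered, max_buffer)  # buffer never grows past the cap
    if playing and buffered <= 0.0:
        return "starving"                 # ran out of frames, pause
    if not playing and buffered < min_headroom:
        return "buffering"                # wait until enough headroom builds up
    return "playing"

print(playback_state(0.05, playing=False))  # buffering
print(playback_state(0.5, playing=False))   # playing
print(playback_state(0.0, playing=True))    # starving
```

    Raising the headroom threshold makes resuming more conservative, which is exactly the trade the Network Safe preset makes.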

    hashtag
    Live Status

    The Live Status section is read-only and updates during Play mode.

    It gives you a live view of what the Lip Sync component is doing internally, which makes it very useful for debugging.

    hashtag
    Status Indicator

    A colored status label in the Inspector shows the current playback state.

    State
    Color
    Meaning

    The profile badge also confirms which profile is active at runtime.

    hashtag
    Timing Counters

    The component also shows runtime counters such as:

    Counter
    Meaning

    If Headroom frequently drops near zero during testing, consider switching to Network Safe or reviewing network quality.

    hashtag
    Step-by-Step Setup

    Follow this flow to set up Lip Sync on a character from scratch.

    1

    hashtag
    Add the Convai Lip Sync component

    Select your character's root GameObject, then add:

    2

    hashtag
    Common Issues

    Symptom
    Likely cause
    Fix

    hashtag
    Related Pages

    For more detailed setup and customization, continue with Lip Sync Profiles and Mappings, Creating a Profile, and Creating a Custom Map.

    hashtag
    Conclusion

    The Convai Lip Sync component is the runtime layer that brings profiles, maps, and character meshes together into a working facial animation setup.

    Once the correct profile is selected, the target meshes are assigned, and the map is valid, Lip Sync playback becomes mostly automatic. From there, playback smoothing, fade timing, and latency settings help you refine how the final result feels in your project.

    circle-info

    Need help? For questions, please visit the Convai Developer Forum.

    hashtag
    Select the correct profile

    In Core Setup > Profile, choose the profile that matches your character:

    • ARKit

    • MetaHuman

    • CC4 Extended

    If you are unsure why this matters, review Lip Sync Profiles and Mappings.

    3

    hashtag
    Assign target meshes

    Under Target Meshes, click Auto-Find.

    Make sure the component reports a non-zero number of meshes and blendshapes. If some meshes should not receive Lip Sync animation, remove them manually.

    4

    hashtag
    Check or assign a map

    If your character already uses standard blendshape names for the selected profile, you can leave Mapping empty and use the built-in default map, or you can choose one of the provided maps.

    If your character uses different blendshape names, create a custom map and assign it here.

    For that process, see Creating a Custom Map.

    5

    hashtag
    Run the Validator

    Click Validator next to the Mapping field.

    This checks how well the active map matches the assigned meshes and helps identify unmapped or mismatched channels.

    A high coverage result, especially on mouth-related channels, is a strong indicator that the setup is correct.

    6

    hashtag
    Choose a latency mode

    Under Streaming & Latency, keep Latency Mode on Balanced unless you already know you need a lower-latency or more network-safe configuration.

    7

    hashtag
    Enter Play Mode and test

    Start Play Mode and trigger a speech event from your Convai character.

    Watch the Live Status section. In a working setup, the status typically moves through:

    Idle -> Buffering -> Playing

    At the same time, your character's face should animate in sync with the voice.

    If the status never leaves Idle, check that the Convai Character component is on the same GameObject and fully configured.

    hashtag
    Reference Tables

    Profile options (Option - Use when your character is...):

    • ARKit - A standard Unity character or any rig with ARKit-compatible blendshape names
    • MetaHuman - An Unreal Engine MetaHuman brought into Unity
    • CC4 Extended - A character built with Reallusion Character Creator 4

    Mapping field buttons (Button - What it does):

    • Create New - Creates a new empty ConvaiLipSyncMapAsset and assigns it immediately
    • Edit - Opens the assigned map asset in the Inspector
    • Validator - Checks the active map against the assigned meshes and reports mapping coverage issues

    Latency modes (Mode - Best for - Trade-off):

    • Ultra Low Latency - Very stable low-latency environments - Lower delay, higher risk of stutter
    • Balanced - Most production use cases - Best balance of stability and responsiveness
    • Network Safe - Mobile or unstable network conditions - Higher delay, more stable playback
    • Custom - Advanced manual tuning - Requires direct control of buffer settings

    Preset values (Mode - Max Buffer - Min Headroom):

    • Ultra Low Latency - 1.0 s - 0.05 s
    • Balanced - 3.0 s - 0.12 s
    • Network Safe - 6.0 s - 0.25 s
    • Custom - unchanged - unchanged

    Status states (State - Color - Meaning):

    • Idle - Green - No speech data is being received
    • Buffering - Yellow - Data is arriving and buffering before playback
    • Playing - Green - Lip Sync is actively being applied to the meshes
    • Starving - Red - Playback has run out of buffered data and is waiting for more
    • Fading Out - Orange - Speech ended and the face is returning to neutral

    Timing counters (Counter - Meaning):

    • Elapsed Time - Time since the current speech event started
    • Remaining - Seconds of buffered animation left
    • Buffer Size - Total current buffer size in seconds
    • Received Data - Total Lip Sync data received for the current event
    • Headroom - Safety gap between playback and the end of the buffer
    • Is Talking - Whether the character is currently speaking

    Common issues (Symptom - Likely cause - Fix):

    • Status stays Idle - Convai Character component missing or not connected - Make sure the Convai Character component exists on the same GameObject
    • Wrong facial shapes move - Incorrect profile selected - Select the profile that matches the character rig
    • Some blendshapes do not animate - Incomplete map coverage - Run Validator, fix unmapped entries, or use a custom map
    • Animation feels too strong - Map multiplier is too high - Lower the map multiplier or reduce specific entry values
    • Animation feels too weak - Map multiplier is too low - Increase the map multiplier
    • Lips move before the audio - Offset needs earlier correction - Use a small positive or negative adjustment and retest carefully
    • Lips move after the audio - Offset needs later correction - Use a small positive or negative adjustment and retest carefully
    • Headroom is frequently red - Network jitter or insufficient buffering - Switch Latency Mode to Network Safe
    • Component disables itself during Play - Validation or setup failure - Check the Console for errors related to profile, character setup, or required references

    Component menu path:

    Add Component > Convai > Lip Sync > Convai Lip Sync

    Example Target Meshes summary:

    ✓ 10 Meshes Found (294 Blendshapes)