This document explains how to create MetaHuman Characters with Convai Plugin.
This stage explains how to add simple custom actions with Convai characters.
This guide covers customization of the blueprints.
This documentation describes how to test the microphone in Unreal Engine.
The Convai Unreal Engine Plugin supports the following platforms:
Windows: 4.26, 4.27, 5.0, 5.1, 5.2, 5.3, 5.4, 5.5
MacOS: 5.0, 5.1, 5.2, 5.3, 5.4, 5.5
Android: 4.27, 5.0, 5.1, 5.2, 5.3, 5.4, 5.5
Linux: 5.0, 5.1, 5.2, 5.3, 5.4, 5.5
iOS: Coming Soon
Pixel Streaming: 5.0, 5.1, 5.2, 5.3, 5.4
A2F Lip-sync (Let's Talk): 5.0, 5.1, 5.2, 5.3, 5.4, 5.5
This document explains how to add MetaHuman to your project.
Install the Convai plugin for Unreal Engine. Follow step-by-step instructions for Visual Studio and Xcode setup.
Download Visual Studio from here.
Ensure you have the required C++ toolchains mentioned here. If you already have Visual Studio installed, you may open the installer and select 'Modify' to add the above-mentioned toolchains.
Download Xcode from the App Store.
For UE 5.3 and UE 5.0, follow this guide to enable microphone permissions.
There are two methods to install the plugin, depending on your requirements:
Directly from the Marketplace link: recommended for easy installation and for tracking plugin updates. (UE 5.1 - 5.3)
Building from the GitHub source: for source UE builds and for UE versions unsupported on the marketplace; updates are not guaranteed to be as seamless as with the marketplace approach. (UE 4.26 - 5.0)
From the top toolbar, go to Edit > Plugins.
Find the Convai plugin.
Click the checkbox to enable the plugin.
Restart Unreal Engine.
Go to Edit > Project Settings.
Choose Convai under the Plugins section on the left bar.
Paste the API key into the API Key field.
Set up the Convai Unreal Engine plugin and add conversational AI to your apps.
Get started with sample projects for some of our popular tutorials:
Transmit real-time environmental data to characters without requiring a response. Supported from Plugin Version 3.5.1.
Dynamic Environment Info is a powerful feature that allows users to pass additional environmental data to characters without direct interaction. This enables more immersive and creative gameplay scenarios by enhancing how characters perceive their surroundings.
For example:
The character can understand the time of day (e.g., "time of day is night").
The character can access inventory details (e.g., "You currently have a gun and a healing potion").
The feature supports structured data formats for richer information exchange.
Follow the steps below to integrate Dynamic Environment Info into your project.
Open the Character Blueprint in your project.
In the Begin Play event, locate the ConvaiChatbot component.
Set the Dynamic Environment Info variable with a string value of your choice.
Example 1 (Simple):
Example 2 (Structured Format):
Save and Play
Save your blueprint changes.
Hit Play to test the interaction.
Observe how the character dynamically responds based on the information passed.
By following these simple steps, you can unlock more engaging gameplay mechanics and enrich the interactions.
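The string assigned to Dynamic Environment Info is free-form. As a minimal sketch of how such a string could be assembled from key/value facts, here is a standard C++ helper (the function name and keys are our own illustration, not part of the plugin API):

```cpp
#include <cassert>
#include <map>
#include <string>

// Joins key/value facts into a newline-separated description the character
// can consume, e.g. "time of day is night". Keys are sorted by std::map.
std::string BuildEnvironmentInfo(const std::map<std::string, std::string>& Facts)
{
    std::string Info;
    for (const auto& [Key, Value] : Facts)
    {
        if (!Info.empty()) Info += "\n";
        Info += Key + " is " + Value;
    }
    return Info;
}
```

The resulting string would then be assigned to the Dynamic Environment Info variable in the blueprint, as described above.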
Add lip sync to MetaHuman characters in Unreal Engine with Convai. Enhance realism and engagement.
Before proceeding, ensure that you change the parent class of your MetaHuman to ConvaiBaseCharacter, as indicated in the provided guide.
Open your MetaHuman blueprint.
Navigate to the Components section and select the Add button.
Search for Convai Face Sync and select it.
Lip sync is now added to your MetaHuman. Compile, save, and give it a try.
Change AI Character Movement Speed - Unreal Engine Guide with Convai.
Open your AI character blueprint.
Select the Floating Pawn Movement component from the components list.
Set the Max Speed field in the Details panel to your required speed.
This gives the player the ability to converse with the chatbot.
This can be applied to first person player (FPP) or third person player (TPP).
Steps to change the parent class of a first-person player (FPP):
Create a new first-person project, or import the First Person content pack into your existing project.
To import: Content Browser > Add > Add feature or content pack to the project > First Person > Add to Project.
To make your game a default First Person game: Edit > Project Settings > Maps and Modes > Default Mode > Default GameMode > BP_FirstPersonGameMode.
Then go to All > Content > FirstPerson > Blueprints > BP_FirstPersonCharacter.
Click Class Settings, then in the Details section under Class Options change the parent class to ConvaiBasePlayer.
Compile and save, and you will be good to go.
For a third-person player, follow the same steps, looking for Third Person instead.
This enables AI conversational features for MetaHumans.
Open the Content Browser, then go to Content > MetaHumans > MetaHuman blueprint.
Go to Class Settings, then under Details panel > Class Options > Parent Class, set it to ConvaiBaseCharacter.
To add animation to the body, go to the Body component and, under Details > Animation > Anim Class, change it to Convai_MetaHuman_BodyAnim.
Similarly, for the face, go to the Face component and, under Details > Animation > Anim Class, change it to Convai_MetaHuman_FaceAnim.
Compile and you will be good to go.
Setting Up
Setup your project with Convai plugin.
Creating MetaHuman Characters
MetaHuman characters with lip sync
Creating Reallusion Characters
Reallusion characters with lip sync
Blueprint Reference
Detailed documentation for every blueprint function or component.
Changelog
Track changes to the Unreal SDK.
Github Repository
Access the source code for the SDK.
Tutorial Playlist
Implement basic conversation either written or with voice.
This document explains how to create Reallusion Characters with Convai Plugin.
This document explains how to create ReadyPlayerMe Characters with Convai Plugin.
This mini-guide provides instructions for setting up Convai with pixel streaming in Unreal Engine.
To setup the Pixel Streaming server, we recommend taking a look over this excellent guide.
Ensure you have the latest Convai 3.1.0 plugin or later.
Enable Unreal Engine's Pixel Streaming and Pixel Streaming Player plugins from the Plugins window.
In the player blueprint which has the Convai Player component, add the PixelStreamingAudio component to the list of components.
Click on PixelStreamingAudio component, and in the details panel find Base Submix and choose AudioInput sound submix.
On Begin Play in the event graph, add the following blueprint function to initialize Pixel Streaming with the Player Component.
Pixel streaming mic input should now be working; however, the system microphone will no longer work. To switch between the two, set Use Pixel Streaming Mic Input (found in the ConvaiPlayer component) to true to enable the pixel streaming microphone, or false to enable the system microphone.
Adds lip animation to MetaHuman
Download the plugin from this link.
Head to your project folder and find the 'Plugins' folder; if it does not exist, create a new folder named 'Plugins'.
Unzip the downloaded file, copy the folder named 'ConvaiOVRLipSync' into the 'Plugins' folder, and restart Unreal Engine.
Open the MetaHuman Blueprint and add a component named 'ConvaiOVRLipSync'.
Make sure that the 'Face' component is using the 'Convai_MetaHuman_FaceAnim' animation class.
Link to the YouTube tutorial.
Create a new blueprint and select ConvaiRPM_Character as the parent class, which you can find under All Classes.
Drag the blueprint into the scene and the default ReadyPlayerMe character should appear.
Create a new character on the Convai Playground and copy the Character ID. You can also edit the avatar by clicking the Edit Avatar icon at the top right of the avatar preview window.
Back in Unreal, click the character in the scene, find the Char ID field under the Details panel, and paste the copied character ID into it.
Hit Play, wait a few seconds, and the character should load into the game.
The first load takes longer; after that, the character is cached and loads faster.
Adjust Interaction Radius - Unreal Engine Guide for Convai integration.
Go to your player blueprint.
Click the gear icon under the My Blueprint tab and select Show Inherited Variables.
Search for the variable MaxChatDistance.
Set it to a number of your choice. Note: set it to 0 for infinite distance.
This document explains how a Reallusion Character can be imported and used with the Convai Plugin.
Create or open a project in Unreal Engine and download the Reallusion auto setup for Unreal Engine from this link.
Download and install it, then open the folder corresponding to your Unreal Engine version.
Copy the Content and Plugins folders and paste them into your Unreal Engine project folder.
Restart your project and create a new folder (say, 'Kevin') in your Content Browser.
Copy the .fbx file (named after the character) from the export into 'Kevin' in the Content Browser; the FBX Import Options menu will pop up.
Check the following options and click Import All:
Use T0 As Ref Pose.
Import Morph Targets.
Create a new folder named Animations within the Kevin folder and import all the animation files (ending in '_motion') from the Reallusion export.
The FBX Import Options will pop up again; uncheck Import Mesh and select your imported skeleton.
Under Animation > Advanced, check Use Default Sample Rate.
Click Import All.
Now install the Convai plugin from the Epic marketplace and restart the engine.
Go to Edit > Plugins.
Search for 'Convai', enable it by clicking the checkbox, and restart.
Create a new Blueprint Class within the Kevin folder and, under All Classes, select ConvaiBaseCharacter.
Open the blueprint. Components > Add > Skeletal Mesh.
Under the Details tab, go to Mesh > Skeletal Mesh and select the imported mesh named Kevin from Reallusion.
Finally, add the Character ID by selecting the character (from the Convai website) you just added, and you can enjoy talking to your AI buddy.
To change the parent class of the player, refer to the corresponding section.
This document explains how to add the Convai ReadyPlayerMe plugin to your project.
Go to this Drive Link.
Download the version corresponding to your Unreal Engine.
In your project directory, create a folder named Plugins if it does not already exist.
Extract the contents of the downloaded zip into the Plugins folder; the final folder hierarchy should look like this:
This document explains how to create a Reallusion character with Character Creator 4.
Steps to create a Reallusion character using Character Creator 4:
Use the Character Creator tool by Reallusion.
Create a character, or use a default character present there, and add animations of your choice.
(Here we have used an existing character named 'CC4 Kevin' and added the idle and walking animations. Link to the YouTube tutorial.)
Export it in FBX format: File > Export > FBX > Clothed Character.
Keep the following settings in Export FBX:
Target Tool Preset: Unreal
FBX Options: Mesh and Motion
Max Texture Size: 4096 (choose the maximum available)
Frame Rate: 30
Check the custom section. Click the Load Perform button.
Uncheck First Frame in bind pose.
Check Export Mesh and Motion individually.
Check Save One Motion per File.
Checking the Delete Hidden Faces option may avoid rendering issues.
Click Export.
Use Convai's Narrative Design Triggers in Unreal Engine to enhance your game stories.
Before proceeding with this section, it is advisable to familiarize yourself with the Narrative Design system, as elucidated in this tutorial.
Develop the logical flow for your specific use case. In this instance, we have created a simple museum tour guide scenario.
Once the logic is decided, we can move to Unreal Engine. (This guide uses the same setup described in the tutorial above.)
Our goal is to invoke the trigger Start Tour in the Narrative Design graph, using the Invoke Narrative Design Trigger function.
The Trigger Name in the function should be the same as the Trigger name on the graph.
The above example showcased only one Trigger; using more than one Trigger is also possible, depending on your execution logic.
Narrative Design - Enhance your Unreal Engine projects with Convai.
Narrative Design can be extended using Unreal Engine Blueprints, allowing you to invoke triggers created in the Narrative Graph, as well as access dynamic variables such as the time of day or relevant information like inventory contents in the game.
This document explains how to bind objects to the Reallusion character and perform action with that object.
Go to Window > Place Actors, search for NavMeshBoundsVolume, and drag it into the scene.
Click the character in the scene (in this case, the Reallusion character) and head to the Details panel.
Details > Default > Objects > click Add Element.
Now we need to select a reference for the object from the scene. Click Pick Actor from Scene, then select any object from the scene. We can also give the object a name and description; this will allow the player to interact with the objects.
Save, then hit Play to test that the character can perform actions related to the object you just added.
This document describes how actions can be used and added to your Convai character; we will go through actions in multiple stages. This written guide complements the following tutorial video:
This document explains how to detect words and perform certain operations based on it.
Open your Convai Character blueprint, click Class Settings, and then select the ConvaiChatbot component.
Under the Details section, scroll down to the Events section and add the On Transcription Received event.
Once we have the transcription of player input, we can perform a substring search on it.
The Print String at the end is just an example. You can add your logic after the substring match.
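The substring match itself is straightforward. As a sketch in standard C++ of the check the blueprint performs on the transcription (the helper name is ours, not a plugin API):

```cpp
#include <algorithm>
#include <cassert>
#include <cctype>
#include <string>

// Case-insensitive substring check on the transcription text.
bool TranscriptionContains(std::string Transcription, std::string Keyword)
{
    auto ToLower = [](std::string S) {
        std::transform(S.begin(), S.end(), S.begin(),
                       [](unsigned char C) { return std::tolower(C); });
        return S;
    };
    return ToLower(std::move(Transcription)).find(ToLower(std::move(Keyword)))
           != std::string::npos;
}
```

A case-insensitive comparison is usually preferable here, since speech-to-text output may capitalize words unpredictably.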
This action enables a character to move to certain objects or characters present in its environment.
By default, Convai Characters possess knowledge about the presence of other Convai characters within their environment.
Let's assume there are two Convai characters present in the virtual environment. We can ask one of the characters to move to the other character as shown here.
Steps:
Click on your Convai character present in the scene.
In the Outliner, go to the Convai Info section, then click the + icon near Objects.
Click on the Pick Actor from Scene tool.
Select any object from the scene and give it a Name and Description of your choice.
Save the changes and ask your Convai character to move to the object you named.
The Convai character then moves to the object you added to its object list.
This document explains how to make characters respond to events happening in their surroundings with a small example.
Our goal in this example is to have the character welcome the player whenever the player enters a certain area. This can be done using the Invoke Speech node, which prompts the AI character to talk, together with a simple collision box.
Open your AI character blueprint and select the Viewport tab.
Note: the character blueprint can be a MetaHuman, ReadyPlayerMe, Reallusion, or even a custom one you have created; just ensure that it has the Convai Chatbot component.
From the Components list, add a Box Collision.
Switch back to the Event Graph tab.
Select the Box Collision you just added and scroll down in the Details panel. Under Events, add the On Component Begin Overlap event to your event graph.
Set up the following blueprint schematic, which uses the Invoke Speech node from the Convai Chatbot component.
Enter a Trigger Message that expresses what happened (e.g., "Player Approached"), and you can add a simple instruction (e.g., "Greet the player").
Setting the In Generate Actions and In Voice Response booleans to true lets the Convai character perform actions and generate audio responses, respectively.
Hit Compile and Save, then run the program.
Approaching the area triggers the event, and the Convai character greets the player as specified in the Trigger Message.
The above example is just a simple use case. However, the use of Invoke Speech opens new doors to limitless use cases.
This stage explains how to use the default actions which come implemented out of the box with Convai.
These actions include:
Moves To: The character can move to another character or object.
Follows: The character can follow you or other objects/characters.
Waits For: The character can wait for some time before doing another action, for example: wait for 10 seconds, then throw a grenade.
Actions can have parameters. For example, Picks Up is an action that expects an object to pick up. In this guide we will see how to parse these parameters, as well as other types of parameters such as text.
Referencing objects or characters for your AI character is important for two reasons: first, the character will know that the object/character exists; second, you will be able to get a reference to these objects or characters when actions that relate to them are triggered. Actions such as Move To, Pick Up, and Follow will always be triggered with objects or characters.
Referencing objects requires adding the object to the Environment object inside the ConvaiChatbot component. There are two ways to do so:
The first method is to select the character in the scene and then, under the Details panel, add the references to the Objects array under Convai Info. This method requires that you inherit from the ConvaiBaseCharacter blueprint.
The second method is to loop, at the Begin Play event of the character blueprint, over all the objects that you want the character to know about and add them to the Environment object. In the following blueprint, we tagged the objects we want to add to the environment to make it easier to fetch them at Begin Play, then used the second and third tags for names and descriptions, respectively.
Adding, editing or removing objects at runtime is possible.
Use the Add Object function for objects and Add Character for characters.
Adding the player and other characters in the environment is handled automatically if you're using ConvaiBaseCharacter as the parent class.
Once you have managed to add your objects and characters to the environment, let's go over a quick scenario to see how we handle the response:
Create an action named Looks At that forces the AI character to look toward a certain object or character.
Create an event with the same name in the character blueprint and add an input parameter of type Convai Result Action; this will contain the parameters required for the action, which in our case is the object/character to look at.
Now let's finish the implementation as follows: break the Action Parameter structure and then set The Related Object Or Character as the Main Character. Note that this is a quick trick to get the AI character to look at the referenced object or character.
Break the Related Object Or Character structure to get more details about the object or character, such as its reference, name, and description.
If you have not added a reference for the object, you will get an invalid reference, but you will still get the name and description.
To be continued..
This action enables a Convai character to follow characters, objects, or players present in its environment.
Approach a Convai character and ask it to follow other Convai characters or a Convai player.
Then you start moving around and you will find that the Convai Character starts following you.
Ask the character to follow the added character/object present in the Environment.
Even if we change the position of the object (cube here) the Convai Character would still follow the cube.
Add a reference to the object or character as shown in this guide.
This document explains how to add simple actions to your Convai characters.
Steps to add simple actions to your Convai character:
Select your Convai character and navigate to the Details panel. Within this interface, locate the Convai Info section.
Select the Add Element icon (+) and input the desired action you wish to execute, for example, Print.
Open the character blueprint to which you just added the action.
Add a new event with the same name (Print in this case), define the logic for the event, and then call the Handle Actions Completion function with Is Successful set to true.
Hit compile and ask your Convai Character to perform the action.
Add Descriptions to Actions - Custom Actions Guide for Unreal Engine.
This document details the process for adding descriptions to actions within Convai characters, improving AI understanding and action execution. Action descriptions provide contextual information, aiding in more accurate and contextually appropriate behaviors.
Action descriptions offer contextual hints to the AI, helping determine when and how to trigger actions based on gameplay and conversation scenarios, enhancing the gaming experience through more accurate AI responses.
Identify the Action: Determine the action needing a description, such as "Sit Down" or "Crouch".
Format for Adding Descriptions: Use the format Action Name <Description> to add descriptions. The description should be clear and concise, providing exact indications for the AI.
Examples:
Sit Down:
Crouch:
Attack Enemy:
Send Email:
Navigate to Convai Info Section: Select your Convai character in the editor and access the Convai Info section.
Adding Action with Description: Click the Add Element (+) icon to input the action along with its description as previously formatted.
Clear and Specific Descriptions: Ensure descriptions are straightforward, precisely indicating action triggering conditions.
Use English.
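As a sketch of the Action Name <Description> format in standard C++ (the helper name and the sample description are our own illustration, not plugin APIs or official descriptions):

```cpp
#include <cassert>
#include <string>

// Builds an action entry in the "Action Name <Description>" form
// described above, ready to paste into the Convai Info section.
std::string FormatActionEntry(const std::string& Name,
                              const std::string& Description)
{
    return Name + " <" + Description + ">";
}
```

For example, FormatActionEntry("Sit Down", "Sits on the nearest chair") yields an entry in the documented format; the description text itself is hypothetical.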
In this page we will show a simple example of how to record and transcribe player voice.
In your player blueprint, make sure you have the Convai Player Component already included; otherwise, you can add it manually or by following this guide.
Apply the following blueprint setup in the player blueprint.
Hit Play to test: press and hold Space to talk, and release to print a transcription of your input voice.
This guide explains how to enable Convai conversation and actions in multiplayer
Please see the instructions below or check out our latest tutorial video on YouTube.
Navigate to your player blueprint.
Under the My Blueprint tab, click the gear icon and select Show Inherited Variables.
Find and enable the two boolean variables: Enable Multiplayer and Enable Voice Chat.
This guide explains how to enable long term memory in Convai Character
Select your character in the playground on the Convai website.
Go to the Memories Tab, and then to Memory Settings. Enable Long Term Memory for the character there.
Go to Project Settings -> Plugins -> Convai, where you have set the API key.
There, click the Manage Speaker ID button under Long Term Memory. This will open an editor utility to create, delete, and list speaker IDs.
If you have already created a Speaker ID, you can click List All Speaker IDs. Then go to Project Settings -> Plugins -> Convai and, under Long Term Memory, you will find the Speaker IDs array; copy the desired Speaker ID.
Now go to the Player Blueprint and select the ConvaiPlayer component. In the Details panel, you will find a Speaker ID field under the Convai category. Paste the desired Speaker ID there.
This guide explains how to add a 3D chat widget to Convai Character
Go to the following link and download the Convai Convenience Pack.
Unzip the file and place it in the ProjectDirectory/Content folder.
Navigate to your player blueprint.
Go to the Viewport tab.
Under the Components tab, click the Add button, search for BP Convai 3DWidget Component, and add it.
Select the newly added component and adjust the transform according to your needs.
Blueprints Reference - Comprehensive guide for Convai Unreal Engine integration.
This document explains how to alter the character response audio rate.
Open your character blueprint which contains the Convai Chatbot component.
Click Class Settings and then the Convai Chatbot component.
In the Details panel on the right-hand side, under the Sound section, modify the Pitch Multiplier.
A value greater than 1 will speed up the audio, and vice versa.
The Convai Environment class is used to define what actions are available for the character and what are the objects and other characters in the scene.
Convai Environment is used as input to the StartTalking() or SendText() functions in the Convai Player component, and it allows the character to generate actions.
A Convai Environment object must have a Main Character set to be considered valid.
CreateConvaiEnvironment() (returns UConvaiEnvironment*): Creates a Convai Environment object.
SetMainCharacter(FConvaiObjectEntry InMainCharacter) (returns void): Assigns the main character initiating the conversation, typically the player character, unless the dialogue involves non-player characters talking to each other.
AddAction(FString Action) (returns void): Adds an action to the Environment object.
AddActions(TArray ActionsToAdd) (returns void): Adds an array of actions to the Environment object.
RemoveAction(FString Action) (returns void): Removes an action from the Environment object.
RemoveActions(TArray ActionsToRemove) (returns void): Removes an array of actions from the Environment object.
ClearAllActions() (returns void): Removes all actions from the Environment object.
AddObject(FConvaiObjectEntry Object) (returns void): Adds an object to the Environment object.
AddObjects(TArray ObjectsToAdd) (returns void): Adds an array of objects to the Environment object.
RemoveObject(FString ObjectName) (returns void): Removes an object from the Environment object.
RemoveObjects(TArray ObjectNamesToRemove) (returns void): Removes an array of objects from the Environment object.
ClearObjects() (returns void): Removes all objects from the Environment object.
AddCharacter(FConvaiObjectEntry Character) (returns void): Adds a character to the Environment object.
AddCharacters(TArray CharactersToAdd) (returns void): Adds an array of characters to the Environment object.
RemoveCharacter(FString CharacterName) (returns void): Removes a character from the Environment object.
RemoveCharacters(TArray CharacterNamesToRemove) (returns void): Removes an array of characters from the Environment object.
ClearCharacters() (returns void): Removes all characters from the Environment object.
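To illustrate the add/remove/clear semantics of the action list only, here is a standalone C++ model. EnvironmentModel is our own stand-in for demonstration, not the plugin's UConvaiEnvironment class; objects and characters follow the same pattern with name/reference entries.

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <vector>

// Minimal stand-in mirroring the documented action semantics:
// add one or many, remove by name, clear all.
class EnvironmentModel
{
public:
    void AddAction(const std::string& Action) { Actions.push_back(Action); }
    void AddActions(const std::vector<std::string>& ToAdd)
    {
        Actions.insert(Actions.end(), ToAdd.begin(), ToAdd.end());
    }
    void RemoveAction(const std::string& Action)
    {
        Actions.erase(std::remove(Actions.begin(), Actions.end(), Action),
                      Actions.end());
    }
    void ClearAllActions() { Actions.clear(); }
    const std::vector<std::string>& GetActions() const { return Actions; }

private:
    std::vector<std::string> Actions;
};
```

In the actual plugin, these calls are made on the Environment object created by CreateConvaiEnvironment(), which must also have a Main Character set to be valid.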
This guide shows how to dynamically pass variables to Narrative Design sections and triggers.
We will create a simple scenario where the character welcomes the player and asks them about their evening or morning based on the player's time of day.
In the playground, enable Narrative Design on your character and change the starting section name to Welcome.
Add the following to the Objective field of the Welcome section:
The time of day currently is {TimeOfDay}. Welcome the player and ask him how his {TimeOfDay} is going.
Notice that any string between curly brackets becomes a variable. What we did here is add the time of day as a variable; from Unreal, we can pass either the word "Morning" or "Evening" and the character will respond accordingly.
Back in Unreal, open the character's blueprint.
Set the Narrative Template Keys variable with a map containing the same variable name, TimeOfDay; for demonstration purposes, we will hard-code the value to "Morning".
Start the play mode and try it out.
Feel free to try other scenarios and settings that better align with your use case.
You can use the narrative design keys feature in both sections and triggers.
Make sure the variable names are between curly brackets and have no spaces in between.
You can dynamically set, change or clear the narrative keys in Unreal blueprints.
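Conceptually, the template keys behave like placeholder substitution. As a sketch in standard C++ (the helper name and implementation are our own illustration of the behavior, not the plugin's code):

```cpp
#include <cassert>
#include <map>
#include <string>

// Replaces every "{Key}" occurrence in a section/trigger template
// with the mapped value, mimicking narrative template keys.
std::string ApplyTemplateKeys(std::string Text,
                              const std::map<std::string, std::string>& Keys)
{
    for (const auto& [Key, Value] : Keys)
    {
        const std::string Placeholder = "{" + Key + "}";
        std::size_t Pos = 0;
        while ((Pos = Text.find(Placeholder, Pos)) != std::string::npos)
        {
            Text.replace(Pos, Placeholder.size(), Value);
            Pos += Value.size();
        }
    }
    return Text;
}
```

With the map {"TimeOfDay": "Morning"}, the Welcome objective above would resolve with every {TimeOfDay} replaced by "Morning".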
Actor component for the player
Convai Player is an Actor component responsible for capturing microphone audio and streaming it to a Convai character.
For voice chat to work in multiplayer, this component must be added to the player's possessed object and not the PlayerController.
Is Recording() (returns Boolean): Returns True if microphone audio is being recorded, False otherwise.
Is Talking() (returns Boolean): Returns True if microphone audio is being streamed, False otherwise.
Start Recording() (returns Void): Starts recording audio from the microphone; use the Finish Recording function afterwards.
Finish Recording() (returns USoundWave*): Stops recording from the microphone and outputs the recorded audio.
Send Text(UConvaiChatbotComponent* ConvaiChatbotComponent, FString Text, UConvaiEnvironment* Environment, bool GenerateActions, bool VoiceResponse, bool RunOnServer, bool UseServerAPI_Key) (returns Void): Sends text to the character.
Start Talking(UConvaiChatbotComponent* ConvaiChatbotComponent, UConvaiEnvironment* Environment, bool GenerateActions, bool VoiceResponse, bool RunOnServer, bool StreamPlayerMic, bool UseServerAPI_Key) (returns Void): Starts streaming microphone audio to the character. Use Finish Talking afterwards to let the character know that you are done talking.
Finish Talking() (returns Void): Stops streaming microphone audio to the character.
Get Available Capture Device Names() (returns TArray<FString>): Returns all the available capture devices.
Get Microphone Volume Multiplier() (returns Void): Gets the microphone volume multiplier.
Set Capture Device Name() (returns Boolean): Sets the capture device name.
Set Microphone Volume Multiplier() (returns Void): Sets the microphone volume multiplier.
Get Active Capture Device() (returns Void): Gets info about the active capture device.
ConvaiChatbotComponent: The character to talk to.
Environment: Holds all relevant objects and characters in the scene (including the player), as well as all the actions the character can perform. Use the CreateConvaiEnvironment() function to create it, then use functions like AddAction(), AddCharacter(), and SetMainCharacter() to fill it up.
Text (FString): Text to be sent to the character.
GenerateActions (bool): Whether or not to generate actions (the Environment has to be given and valid).
VoiceResponse (bool): If true, generates a voice response; otherwise, only a text response is generated.
RunOnServer (bool): If true, runs this function on the server; this can be used in multiplayer sessions to allow other players to hear the character's voice response.
StreamPlayerMic (bool): If true, streams the player's voice to other players in the multiplayer session, with the same effect as voice chat.
This page gives an overview of how to set the Main Character the AI is talking to and the Object in Attention.
Navigate to your AI character blueprint.
Drag the Convai Chatbot
component into the event graph.
Get the Environment
Object and make sure it is valid.
Use the function Set Main Character
to define the current speaker to the AI character. Additionally, if you're using Convai's animation blueprints, setting the Main Character causes the AI character to look at the Main Character reference.
Use the function Set Attention Object
to set which object is currently being talked about; this function also automatically adds the input object to the list of existing objects in the environment.
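The steps above can be summarized in pseudocode. The function names come from this page; the object entries (PlayerEntry, DoorEntry) are hypothetical examples:

```
// Pseudocode sketch of setting the Main Character and Attention Object
Environment = ConvaiChatbotComponent.GetEnvironment()
if IsValid(Environment):
    Environment.SetMainCharacter(PlayerEntry)   // AI talks to / looks at this reference
    Environment.SetAttentionObject(DoorEntry)   // also auto-added to the environment's objects
```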
The character blueprint can be a MetaHuman, ReadyPlayerMe, Reallusion, or even a custom one you have created; just ensure that it has the Convai Chatbot
component.
* ConvaiChatbotComponent
* Environment
This section focuses on troubleshooting Convai Unreal Engine plugin problems.
Convai Result Action - Blueprint Reference for Unreal Engine integration.
Speech-to-Text - Blueprint Reference for Convai Unreal Engine integration.
Description: Transcribes the provided audio.
Inputs:
Sound Wave: The recorded output from the microphone. Please take a look at the tutorial/sample project for an example of how to use the Convai voice capture component with the API.
Outputs:
Response: Contains the text output of the audio if the API call was successful; otherwise, it might contain information on why the call failed.
Nothing is returned; check the logs for details on why it failed.
Description: Transcribes the provided audio file.
Inputs:
Filename: The path to the recorded audio file on your local disk; the file should be in .wav format.
Outputs:
Response: Contains the text output of the audio file if the API call was successful; otherwise, it might contain information on why the call failed.
Nothing is returned; check the logs for details on why it failed.
Utility Functions - Blueprint Reference for Convai Unreal Engine integration.
Description: Create a new character and get the character ID for it.
Inputs:
Char Name: Name of the character.
Voice: Voice name [MALE/FEMALE].
Backstory: Backstory for the new character.
Outputs:
Char ID: Character ID for the new character.
Nothing is returned; check the logs for details on why it failed.
Description: Get a list of character IDs belonging to the user.
Outputs:
Char IDs: List of character IDs belonging to the user.
Nothing is returned; check the logs for details on why it failed.
Description: Fetch all details of a character including backstory, voice, etc.
Inputs:
Char ID: Character ID for which to fetch all the details.
Outputs:
Character Name: Name of your character.
Voice Type: Voice name.
Backstory: Character backstory.
Has Ready Player Me Link: True if the avatar is configured on the website.
Ready Player Me Link: The avatar link to be used to download.
Nothing is returned; check the logs for details on why it failed.
Description: Update a particular character.
Inputs:
Char ID: Character ID to be updated.
New Voice: Voice name or [MALE/FEMALE].
New Backstory: Updated backstory.
New Char Name: Name of the character.
For the list of supported voices please refer to the table in Text To Speech API.
To update a subset of properties, such as Voice and Name only, leave the other fields empty; the update will only affect the fields with values specified.
Outputs:
Nothing is returned.
⛔ [On Failure]
Nothing is returned; check the logs for details on why it failed.
Conflicts with Convai arise when packaging a project containing the MetaHuman plugin in the editor.
The primary reason for most Unreal Engine packaging issues with the Convai Unreal Engine SDK.
Numerous packaging issues stem from the absence of the C++ toolchain required by Unreal Engine, so it is recommended to check the required toolset before packaging your project.
Refer to this guide for configuring Microsoft Visual Studio or adjusting the toolsets as per the recommended guidelines by Unreal Engine.
The ConvaiObjectEntry structure holds information about a character or an object in the scene, which is then included in the Environment object.
The Convai Object Entry structure stores information about an object or character in the scene. It takes the name of the object or character and can optionally include a reference to it, a description, and a position vector. This information is then added to the Environment object.
Ref
FString
A reference to a character or object. (Optional)
Optional Position Vector
FVector
A related position vector. (Optional)
Name
FString
The Name of the character or object.
Description
FString
The bio/description for the character/object. (Optional)
Actor component for the AI character
Convai Chatbot is an actor component responsible for processing the voice audio coming from the Convai Player component and getting a response. It plays the audio response and provides a variety of useful events for transcription, actions, text responses, and more.
Character Id
FString
The character ID you would like to assign to the component.
Interrupt Voice Fade Out Duration
Float
Time in seconds to gradually fade out the voice response when interrupted, until it stops.
Language Code
FString
Read the value of Language Code.
Ready Player Me Link
FString
Read the value of Ready Player Me Link.
Session Id
FString
Tracks the memory of a previous conversation. A value of -1 means no previous conversation. This property changes as you talk to the character; you can save the session ID for a conversation and set it back later to resume that conversation.
Voice Type
FString
Read the value of the Variable VoiceType.
Environment
UConvaiEnvironment
Contains all relevant objects and characters in the scene, including the player, as well as all the actions the character can perform.
Avatar Image Link
FString
Read the value of Avatar Image Link.
Character Name
FString
Read the value of variable CharacterName.
Backstory
FString
Read the value of variable Backstory.
Is in Conversation()
Boolean
Returns True if the character is being talked to, is talking, or is processing the response.
Is Talking()
Boolean
Returns True if the character is currently talking.
Is Thinking()
Boolean
Returns True if the character is still processing and has not received the full response yet.
Is Listening()
Boolean
Returns True if the character is currently listening to the player.
Supports Lip Sync()
Boolean
Returns True if a LipSync component is available and attached to the character.
Get Viseme Names()
Array of Strings
Returns the list of viseme names.
Get Visemes()
Array of Float
Returns the last predicted viseme scores.
Interrupt Speech()
Void
Interrupts the current speech with a provided fade out duration.
Reset Conversation()
Void
Resets the conversation with the character and removes previous memory. This is the same as setting the session ID property to -1.
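For example, saving and resuming a conversation with the Session Id property can be sketched in pseudocode (property and function names come from this reference; the flow is illustrative):

```
// Pseudocode sketch: saving and resuming a conversation
SavedSession = ConvaiChatbotComponent.SessionId   // save at any point during a conversation
// ... later, e.g. after reloading the level ...
ConvaiChatbotComponent.SessionId = SavedSession   // resume with the previous memory
// To forget everything instead:
ConvaiChatbotComponent.ResetConversation()        // equivalent to setting SessionId to -1
```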
On Actions Received
Called when new actions are received from the character.
On Text Received
Called when new text is received from the character.
On Transcription Received
Called when new transcription is available.
On Started Talking
Called when the character starts talking.
On Finished Talking
Called when the character stops talking.
On Failure
Called when there is an error.
Bot Text
The received text.
Audio Duration
Duration of the spoken received text; equal to zero if only text was received and no audio.
Is Final
True if this is the last chunk of text/transcription to be received.
Transcription
The transcription received.
Is Transcription Ready
True if the received transcription is final and will not change anymore; false if the transcription is still in an intermediate state and will change.
Description: Creates audio for the corresponding text.
Inputs:
Transcript: The transcript for the audio.
Voice: The voice name.
For the list of supported voices please refer to the table in Text To Speech API.
Outputs:
Wave: Audio response.
Nothing is returned; check the logs for details on why it failed.
When running the Convai plugin on a machine that is missing certain common root certificates, you will receive errors when trying to fetch characters' data, and you will usually see this error in the chat widget: Load Failed for Character ID: <Character ID>.
To fix the issue, disable SSL verification in Unreal Engine: go to the Edit menu, select Project Settings, search for Verify Peer, and disable it.
Another solution is to install a browser such as Google Chrome or Firefox, which usually installs the required certificates automatically.
This problem arises when ConvaiOVRLipSync is not added to the project.
Convert your Blueprint project to a C++ project. Steps to convert:
Navigate to the Tools menu, and choose 'New C++ class...' from the provided options.
Select any parent class from the menu that appears. For simplicity, we opt for 'None' in this case. Proceed by clicking on 'Next'.
Provide any name and proceed by selecting 'Create Class'.
Once done, close the Unreal Engine editor and then start it again by building the project in Microsoft Visual Studio.
Copy the ConvaiOVRLipSync plugin to the Plugins directory within the engine.
Refer to this to add ConvaiOVRLipSync.
This document explains how to test your microphone using the settings widget that comes with the Convai plugin.
Follow the steps outlined below to conduct a microphone test:
The following steps work when you have set the parent class of the player blueprint to ConvaiBasePlayer.
Press the F10
key on your keyboard after you play your project.
A window will open up as shown here.
From Input
select your microphone and click on Record
. This will start recording your audio.
Record your audio for a few seconds and click on Stop
.
If your microphone is working properly, you should hear your audio loud and clear.
If the audio sounds low, then adjust the Audio Gain
accordingly.
The Convai plugin requires microphone access in Unreal Engine (UE) 5.0 and 5.3 on macOS.
Temporarily Disable SIP - Follow the instructions provided in the guide. Note: make sure to run the command csrutil disable
in the terminal.
Clone the tccutil Repository - Open the Terminal, and enter the following command to clone the repository:
git clone https://github.com/DocSystem/tccutil
Navigate to the tccutil Directory by entering cd tccutil
into the terminal.
Ensure Python 3 is installed on your system. Run the tccutil command in the Terminal to allow microphone access for Unreal Engine:
sudo python3 tccutil.py -e -id com.epicgames.UnrealEditor --microphone --enable
Re-enable SIP - Once the modifications are complete, re-enable System Integrity Protection, this time by running the command csrutil enable
.
After packaging your game, if you notice a crash or the microphone not working, proceed with the following steps:
Locate the Info.plist
file in the packaged game directory. This is typically found by right-clicking the package, selecting 'Show Package Contents', and editing /Contents/Mac/Info.plist
.
Add the following entries to request microphone access:
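A minimal example of the required entry is shown below. NSMicrophoneUsageDescription is the standard macOS Info.plist key for microphone access; the description string is an example you should adapt to your game:

```
<key>NSMicrophoneUsageDescription</key>
<string>This game uses the microphone for voice conversations with AI characters.</string>
```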
Save the changes to the Info.plist
file.
This document explains how to set the microphone through blueprints.
Open the player blueprint or the blueprint that has the Convai Player component in it.
Drag the ConvaiPlayer
component in your Event Graph
and search for Set Capture Device by Name
or Set Capture Device by Index
. These functions set the microphone by its name or by its index.
This document explains how to list available microphone devices.
This document explains how to change the audio gain through blueprints in Unreal Engine.
The Audio Gain feature is useful in scenarios where the incoming audio from the microphone is notably low in volume.
Open the Player blueprint or the blueprint which has the ConvaiPlayer
component in it.
From the ConvaiPlayer component search for Set Microphone Volume Multiplier
function.
Enter a float value in the In Volume Multiplier
property of the function to make the audio loud and clear.
The following set of guides will help you make a simple project with Convai.
A minimal from-scratch example to get you familiar with the plugin's main components.
Create a new first person project.
Enable the Convai plugin and add the API key as mentioned here.
Create a new Actor blueprint that we will be using as the AI character.
Open the created blueprint then search for and add the Convai Chatbot
component in the components list.
Note: if you do not find the component, ensure that you have properly installed and enabled the plugin by following the Installation guide.
Select the created component and on the details panel, find the Character ID field and paste your character ID which you can get by creating a new character or using an existing one on the Convai Playground.
Add a box component so that you can see the blueprint when placed in the scene.
Place the blueprint in the scene.
Open the player blueprint, which is by default at First Person/Blueprints/BP_FirstPersonCharacter
for the first person template.
Search and add the Convai Player
component in the components list.
Add the following blueprint schematic to allow the player to talk to the AI character via the V key:
Add a keyboard key event to be used as a push to talk button (i.e. the `V` key in this example).
Use Convai Get Looked At Character
to get the chatbot component of the character that is currently viewed by the player.
Set Radius
to a reasonable distance, or zero if you want the player to be able to talk to the character over an infinite distance.
Set Plane View
to true to only consider the plane axes (X & Y) and ignore the height axis (Z). This avoids having to look directly at the character's pivot; looking in the direction of the character is sufficient.
Use the Start Talking
node from the Convai Player
component to initiate the talking session with the character. Ensure you have enabled Voice Response
to get the character to respond vocally.
On the Released
event, use the Finish Talking
node on the Convai Player
component to let the AI character know that we have finished talking and are now waiting for a response.
Hit play, approach the AI character, and hold V to talk through the microphone; the character should respond after you release the V key.
If the character does not respond then make sure your microphone is set properly as the default microphone in the OS settings.
Use the following Send Text
node if you want to text chat with the character instead of using voice.
Note: here we use a hard-coded string as input to the character; you will need to create the required UI to get the text input from the user and send it to the AI character.
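The complete push-to-talk flow above can be sketched in pseudocode. Node names and parameters come from this guide and the Blueprint reference; the radius value and pin defaults are illustrative assumptions:

```
// Pseudocode sketch of the push-to-talk wiring
OnKeyPressed("V"):
    Chatbot = ConvaiGetLookedAtCharacter(Radius = 300, PlaneView = true)
    ConvaiPlayer.StartTalking(Chatbot, Environment = None,
                              GenerateActions = false,
                              VoiceResponse   = true,   // respond with audio
                              RunOnServer     = false,
                              StreamPlayerMic = false,
                              UseServerAPI_Key = false)

OnKeyReleased("V"):
    ConvaiPlayer.FinishTalking()   // signals the character to respond
```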
Here are some Guides to get you started with integrating Convai in Unreal Engine.
Easily install the Convai Plugin in Unreal Engine with our step-by-step guide. Set up your API key to add Convai to your Unreal Engine project.
Download Visual Studio from here.
Ensure you have the required C++ toolchains mentioned here. If you already have Visual Studio installed, you may open the installer and select 'Modify' to add the above-mentioned toolchains.
Download Xcode from the App Store.
For UE 5.3 and UE 5.0, follow this guide to enable microphone permissions.
There are two methods to install the plugin, depending on your requirements:
Directly from the Marketplace Link: Recommended for easy installation and tracking updates to the plugin. (UE 5.1 - 5.3)
Building from the GitHub Source: For source UE builds and for UE versions unsupported on the marketplace, but not as guaranteed as the marketplace approach. (UE 4.26 - 5.0)
From the top toolbar, go to Edit > Plugins.
Find the Convai plugin.
Click the checkbox to enable the plugin.
Restart Unreal Engine.
Go to Edit > Project Settings.
Choose Convai under the Plugins section on the left bar.
Paste the API key into the API Key field.
Use a premade player blueprint that already contains a Chat and Settings widget.
The goal of this guide is to show how to easily add a premade chat and settings widget to the UI by re-parenting the player blueprint. We will continue from the progress made in the Simple Talking Cube guide.
It is recommended to remove the Convai Player component from the player blueprint that was created in the previous guide before proceeding with this guide.
(Optional) If you do not have a player blueprint already, you can import First Person or Third Person content into your existing project.
From the Content Browser, click the Add button, then Add feature or content pack to the project
then choose either First Person
or Third Person
then click Add to Project
.
Click the Window menu and make sure World Settings
is enabled.
On the World Settings
tab, Find GameMode Override
and set it to BP_FirstPersonGameMode
for first person or BP_thirdPersonGameMode
if you imported third person content.
In the content browser, navigate to your player blueprint, which is by default at FirstPerson/Blueprint/BP_FirstPersonCharacter
for First Person or ThirdPerson/Blueprint/BP_ThirdPersonCharacter
for Third Person.
Open the blueprint and click Class Settings
then in the Details
section Under Class Options
change the parent class to ConvaiBasePlayer
.
Hit save, compile, and hit play to test: use the T key to talk and Enter to text chat.
Hit F10 to open the settings menu where you have various options, like testing your microphone and changing the chat widget layout.
For advanced use cases, you may not want to change the parent class of the player blueprint; in that case, we encourage you to use the Convai Base Player
Blueprint itself as reference to see how to add the Chat and Settings widget or even create your own widgets from scratch.
We expect developers who are looking to implement everything from scratch or do customization to be advanced enough to navigate the blueprints on their own.