Adding Characters to Scene - PlayCanvas Plugin Guide for Convai integration.
Add character animations in PlayCanvas with Convai. Enhance your web projects with interactive AI.
Once you have created or uploaded all the desired animations in the PlayCanvas environment, construct an Animation State Graph to manage and control the animation states and transitions.
Reference: https://developer.playcanvas.com/tutorials/anim-blending/
Attach the state graph and animation files to the character: create an Anim component and drag and drop the files onto the placeholders.
Adding External Script - PlayCanvas Plugin Guide for Convai integration.
Here we will add the Convai Web SDK to our project, a JavaScript library that enables integration of conversational AI capabilities.
Create a blank project after logging in.
Open the settings section in the SETTINGS panel.
Increase the Array Size to 1 to add one external script.
Add the Convai web-sdk-cdn link in the URL section.
PlayCanvas template for Convai integration.
ConvAI is a powerful tool that enables developers to incorporate natural language processing (NLP) and conversational AI capabilities into their PlayCanvas projects. By following this guide, you'll learn how to seamlessly integrate ConvAI into your PlayCanvas project, allowing you to create engaging and interactive experiences for your users.
To help you get started, we've created a reference PlayCanvas project that demonstrates the integration of ConvAI. This project serves as a foundation for you to build upon and understand the necessary steps to incorporate ConvAI into your own projects.
Project Link:
Convai Integration - PlayCanvas Plugin Guide for seamless integration.
After adding the Convai web-sdk-cdn to the URL section, the ConvaiClient class will be available to the browser directly.
Add all the scripts below to your Character entity.
Replace the empty "" with your API key and Character ID.
The ConvaiNpc script is responsible for handling the interaction between the user and a virtual character powered by the Convai AI.
The script initializes the Convai client by providing an API key and character ID. It sets up the necessary callbacks to handle various events, such as errors, user queries, and audio responses from the Convai service.
The initializeConvaiClient function is the entry point for setting up the Convai client. It creates a new instance of ConvaiClient and configures it with the provided API key, character ID, and other settings, such as enabling audio and facial data.
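As a rough sketch of that setup step, the fragment below builds the configuration object and only constructs the client when the SDK script has actually loaded. The option names follow the description above (API key, character ID, audio, facial data) but are assumptions; check them against the SDK version you load from the CDN.

```javascript
// Configuration for the Convai client; field names here are illustrative
// and should be checked against your Convai Web SDK version.
var convaiConfig = {
    apiKey: '',             // your Convai API key
    characterId: '',        // your Convai character ID
    enableAudio: true,      // stream audio responses back
    enableFacialData: true  // request viseme data for lip-sync
};

// Only construct the client when the global ConvaiClient from the
// web-sdk-cdn script is actually present.
var convaiClient = (typeof ConvaiClient !== 'undefined')
    ? new ConvaiClient(convaiConfig)
    : null;
```

Guarding on `typeof ConvaiClient` avoids a hard crash if the external script URL fails to load.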
The script handles user input through two methods: text input via a form, and voice input using the "T" key. For voice input, the handleKeyDown and handleKeyUp functions detect when the "T" key is pressed and released, respectively. While the "T" key is held, the script records audio and sends it to the Convai service for processing.
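The press/release behaviour can be sketched as a small push-to-talk state machine. The `onStart`/`onStop` callbacks below stand in for the Convai client's start- and stop-recording calls; the names are illustrative, not the SDK's.

```javascript
// Minimal push-to-talk state machine for the "T" key.
// onStart/onStop are placeholders for the real recording calls.
function createPushToTalk(onStart, onStop) {
    var recording = false;
    return {
        handleKeyDown: function (key) {
            // keydown fires repeatedly while held; only start once
            if (key === 'T' && !recording) {
                recording = true;
                onStart();
            }
        },
        handleKeyUp: function (key) {
            if (key === 'T' && recording) {
                recording = false;
                onStop();
            }
        },
        isRecording: function () { return recording; }
    };
}

// Usage: repeated keydown events while the key is held do not restart recording.
var starts = 0, stops = 0;
var ptt = createPushToTalk(function () { starts++; }, function () { stops++; });
ptt.handleKeyDown('T');
ptt.handleKeyDown('T'); // auto-repeat: ignored
ptt.handleKeyUp('T');
```

Tracking the `recording` flag explicitly is what prevents the browser's key auto-repeat from sending duplicate start-recording requests.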
The ConvaiNpc.prototype.initialize function is called once per entity and sets up the Convai client. It also registers callbacks for handling audio playback events, updating the isTalking and conversationActive flags accordingly.
The ConvaiNpc.prototype.handleAnimation function updates the character's animation based on the isTalking state, allowing for synchronized lip movements and facial expressions.
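The core of that update can be sketched as a tiny driver that maps the isTalking flag to a state name and only triggers playback when the state actually changes. The 'Talk'/'Idle' names are assumptions; match them to the states in your Animation State Graph.

```javascript
// playFn stands in for the real call into the Anim component.
function createAnimationDriver(playFn) {
    var current = null;
    return function update(isTalking) {
        var next = isTalking ? 'Talk' : 'Idle';
        if (next !== current) { // avoid restarting the clip every frame
            current = next;
            playFn(next);
        }
        return current;
    };
}

// Usage: calling update every frame only issues a play call on transitions.
var plays = [];
var update = createAnimationDriver(function (name) { plays.push(name); });
update(false);
update(false); // no change: no new play call
update(true);
```

The change-detection guard matters because handleAnimation runs every frame, and restarting a clip each frame would freeze it on its first pose.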
The PlayerAnimationHandler script is responsible for controlling the animations of a player character based on certain conditions, such as velocity or other factors.
The script defines three attributes:
- blendTime: controls the blend time between animations, which determines how smoothly transitions occur. The default value is 0.2.
- velMin: the minimum velocity required to trigger a specific animation. The default value is 10.
- velMax: the maximum velocity required to trigger a specific animation. The default value is 50.
These attributes can be adjusted in the editor or through code to fine-tune the animation behavior for the player character.
The initialize function is called when the script is initialized. In this implementation, it plays the 'Idle' animation with the specified blend time (this.blendTime). This animation plays when the player character is not moving or when the velocity is outside the range defined by velMin and velMax.
The script is designed to be extended further to handle different animation states based on the player character's velocity or other conditions. For example, you could add logic to check the player's velocity and play different animations (e.g., 'Walk', 'Run') based on the range defined by velMin and velMax.
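Such an extension could be sketched as a pure selection function over the two thresholds; the state names ('Idle', 'Walk', 'Run') are illustrative and should match your own state graph.

```javascript
// Map a speed value onto an animation state using the velMin/velMax
// attributes described above.
function chooseAnimationForSpeed(speed, velMin, velMax) {
    if (speed < velMin) return 'Idle';   // below the range: standing still
    if (speed <= velMax) return 'Walk';  // inside the range
    return 'Run';                        // above the range
}
```

In the real script this function's result would be passed to the Anim component together with this.blendTime so transitions stay smooth.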
By utilizing this script, you can easily manage and transition between different animations for the player character, providing a more immersive and realistic experience in your game or application.
The Lipsync script is responsible for animating the character's mouth and facial expressions based on the received viseme data. Visemes are the key mouth shapes and facial positions used to represent speech sounds. The script applies morph target animations to the character's head and teeth components to achieve realistic lip-syncing effects.
The script works by accessing the visemeData array, which contains the viseme weights for each frame of the animation. It then applies these weights to the corresponding morph targets on the head and teeth components. The runVisemeData function handles this process by looping through the viseme weights and setting the morph target weights accordingly.
The script keeps track of the current viseme frame using the currentVisemeFrame variable and a timer variable. This ensures that the viseme animations are synchronized with the audio playback. When the viseme data has finished playing, the zeroMorphs function is called to reset all morph target weights to zero, effectively resetting the character's facial expression.
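The timer-driven frame advance can be sketched in isolation. Here visemeData is an array of frames, each an array of morph-target weights; a timer selects the current frame at a fixed frame rate, and all weights drop to zero once playback ends (the zeroMorphs behaviour). The frame rate is a parameter, not a value taken from the Convai SDK.

```javascript
// Step through viseme frames by accumulated time; return the weights
// for the current frame, or all zeros once the data is exhausted.
function createVisemePlayer(visemeData, frameRate) {
    var timer = 0;
    var frameTime = 1 / frameRate;
    return function step(dt) {
        timer += dt;
        var frame = Math.floor(timer / frameTime);
        if (frame >= visemeData.length) {
            // Playback finished: zero every morph weight.
            return visemeData[0].map(function () { return 0; });
        }
        return visemeData[frame];
    };
}
```

In the real script, step would run from the update loop with the frame's dt, and the returned weights would be written into the head and teeth morph targets.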
The HeadTracking script is responsible for controlling the rotation of a character's head and eyes based on the position of the camera (representing the user's viewpoint). The script achieves this by calculating the angle between the forward vector of the head component and the forward vector of the camera. If this angle is within a specified threshold (45 degrees in this case), the head and eyes are rotated to look towards the camera's position.
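The 45-degree gate comes down to a dot-product angle check, sketched below with plain arrays standing in for pc.Vec3:

```javascript
// Angle in degrees between two 3D vectors, via the normalized dot product.
function angleBetweenDeg(a, b) {
    var dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    var la = Math.hypot(a[0], a[1], a[2]);
    var lb = Math.hypot(b[0], b[1], b[2]);
    var cos = Math.min(1, Math.max(-1, dot / (la * lb))); // clamp float error
    return Math.acos(cos) * 180 / Math.PI;
}

// Track the camera only while it sits inside the angular threshold.
function shouldTrack(headForward, toCamera, thresholdDeg) {
    return angleBetweenDeg(headForward, toCamera) <= thresholdDeg;
}
```

Clamping the cosine before calling Math.acos avoids NaN results when floating-point error pushes the ratio slightly past ±1.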
Add all the above scripts to your PlayCanvas project and attach ConvaiNpc, Lipsync, and HeadTracking to your character (model) entity.
Chat Overlay - PlayCanvas Plugin Guide for Convai integration.
Let's add a chat window to enhance user interaction and immersion. Create a New entity called Convai Chat.
The ConvaiChat script is responsible for managing the chat interface and displaying the conversation between the user and an AI-powered virtual character. It handles rendering user messages and AI responses, maintaining a chat history, and ensuring smooth scrolling behavior within the chat container.
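The history bookkeeping behind that can be sketched as follows; DOM rendering and the actual scrolling are left to the real script, and the cap on stored messages is an assumption for illustration.

```javascript
// Append user/NPC messages, cap the history length, and signal that the
// chat container should scroll to the bottom after each new message.
function createChatHistory(maxMessages) {
    var messages = [];
    return {
        add: function (sender, text) {
            messages.push({ sender: sender, text: text });
            if (messages.length > maxMessages) {
                messages.shift(); // drop the oldest entry
            }
            return true; // caller should scroll the container down
        },
        all: function () { return messages.slice(); }
    };
}
```

Returning a copy from all() keeps the rendering code from mutating the stored history by accident.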
Add the following files as attachments to the ConvaiChat script after parsing the script.
First Person View - PlayCanvas Plugin Guide for Convai integration.
Scale up the plane and add physics to it: add both a Collision and a Rigidbody component.
Import Ammo.js (which enables physics). Scale up the Collision component according to the plane size.
Create a New Entity
In the PlayCanvas Editor, right-click in the Hierarchy panel and select "Create New Entity".
Add Physics Component
With the new entity selected, click the "Add Component" button in the top-right corner of the Editor.
Search for "Physics" and add the "Physics" component to the entity.
Add Collision Capsule Component
With the entity still selected, click "Add Component" again.
Search for "Collision", add the "Collision" component to the entity, and set its Type to "Capsule".
Adjust Entity Y-Position
Adjust the "Y" value of the Translation to position the entity above the plane.
Add Rigidbody Component
With the entity still selected, click "Add Component" again.
Search for "Rigidbody" and add the "Rigidbody" component to the entity.
Set the "Type" of the rigidbody to "Dynamic".
Adjust Angular Factors
In the Rigidbody component, locate the "Angular Factor" section.
Set the "X", "Y", and "Z" values of the Angular Factor to 0.
Create a FirstPersonView.js script and add the code below. You can also find examples of implementing camera controls in the PlayCanvas tutorials. Attach this script to the Player capsule.
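Since the FirstPersonView code itself is not reproduced here, below is a hedged sketch of the core mouse-look bookkeeping such a script typically needs: accumulate yaw freely and clamp pitch so the camera cannot flip over. The sensitivity value and function name are illustrative; the real script would also apply these angles to the camera entity and drive the rigidbody for movement.

```javascript
// Accumulate mouse deltas into yaw/pitch angles, clamping pitch to ±90°.
function createLookState(sensitivity) {
    var yaw = 0, pitch = 0;
    return function onMouseMove(dx, dy) {
        yaw -= dx * sensitivity;
        pitch -= dy * sensitivity;
        pitch = Math.max(-90, Math.min(90, pitch)); // stop over-rotation
        return { yaw: yaw, pitch: pitch };
    };
}
```

In a PlayCanvas script these angles would be fed to setLocalEulerAngles on the camera each frame.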