Custom UI & Advanced Usage
Build your own UI while using Convai’s audio pipelines and message system.
Required for Custom UIs: AudioRenderer
If you don’t use the prebuilt widget, mount `AudioRenderer` yourself so the character’s audio plays back.
```jsx
import { useConvaiClient, AudioRenderer, AudioContext } from '@convai/web-sdk';

function CustomApp() {
  const convaiClient = useConvaiClient({
    apiKey: 'your-api-key',
    characterId: 'your-character-id'
  });

  return (
    <AudioContext.Provider value={convaiClient.room}>
      <AudioRenderer /> {/* Required for audio playback */}
      <YourCustomUI />
    </AudioContext.Provider>
  );
}
```

Custom Message List
```jsx
function Messages({ convaiClient }) {
  return convaiClient.chatMessages.map(msg => (
    <div key={msg.id} className={msg.type}>
      {msg.content}
    </div>
  ));
}
```

Custom Controls
```jsx
function Controls({ convaiClient }) {
  const { audioControls, videoControls, screenShareControls } = convaiClient;

  return (
    <>
      <button onClick={audioControls.toggleAudio}>
        {audioControls.isAudioMuted ? 'Unmute' : 'Mute'}
      </button>
      <button onClick={videoControls.toggleVideo}>
        {videoControls.isVideoEnabled ? 'Stop Camera' : 'Start Camera'}
      </button>
      <button onClick={screenShareControls.toggleScreenShare}>
        {screenShareControls.isScreenShareActive ? 'Stop Sharing' : 'Share Screen'}
      </button>
    </>
  );
}
```

Message Types
The SDK produces the following message types:

- `user` — User messages
- `user-transcription` — Live ASR
- `bot-llm-text` — Character text
- `bot-emotion` — Emotion signals
- `action` — Action triggers
- `behavior-tree` — Behavior responses
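If you render messages yourself, you can branch on these types. The sketch below reuses the `msg.id`, `msg.type`, and `msg.content` fields from the Custom Message List example above; how the non-text types (emotions, actions, behavior trees) map onto your UI is an assumption left to your application.

```jsx
function MessageItem({ msg }) {
  switch (msg.type) {
    case 'user':
    case 'user-transcription':
      return <div className="user">{msg.content}</div>;
    case 'bot-llm-text':
      return <div className="bot">{msg.content}</div>;
    case 'bot-emotion':
    case 'action':
    case 'behavior-tree':
      // Non-text signals: drive animations or app state instead of rendering text.
      return null;
    default:
      return null;
  }
}
```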
Best Practices
- Use a single `useConvaiClient()` per application.
- Always check `state.isConnected` before sending messages.
- Call `resetSession()` when switching scenes or flows.
- Use `<AudioRenderer />` for custom UIs.
- Provide `endUserId` in production for memory and analytics.
- Wrap connect/disconnect in `try/catch` (see the sketch below).
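The hook below is a minimal sketch that pulls these practices together. `state.isConnected`, `resetSession()`, and `endUserId` come from the list above; the `connect`, `disconnect`, and `sendTextMessage` method names are illustrative assumptions, so substitute whatever your SDK version actually exposes.

```jsx
import { useConvaiClient } from '@convai/web-sdk';

function useGuardedConvai() {
  // endUserId enables memory and analytics in production; where it is passed
  // is an assumption here — adjust to your SDK version.
  const convaiClient = useConvaiClient({
    apiKey: 'your-api-key',
    characterId: 'your-character-id',
    endUserId: 'stable-user-id'
  });

  // Wrap connect/disconnect in try/catch (method names are illustrative).
  const start = async () => {
    try {
      await convaiClient.connect();
    } catch (err) {
      console.error('Failed to connect', err);
    }
  };

  const stop = async () => {
    try {
      await convaiClient.disconnect();
    } catch (err) {
      console.error('Failed to disconnect', err);
    }
  };

  // Always check state.isConnected before sending messages.
  const sendSafely = async (text) => {
    if (!convaiClient.state.isConnected) return;
    await convaiClient.sendTextMessage(text); // illustrative send method
  };

  // Call resetSession() when switching scenes or flows.
  const switchScene = () => convaiClient.resetSession();

  return { convaiClient, start, stop, sendSafely, switchScene };
}
```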