Chimera LipSync Bridge

Bridge audio to MetaHuman lip sync
The ChimeraLipSyncBridgeComponent connects audio streams to Epic's RuntimeMetaHumanLipSync system, enabling real-time lip synchronization for MetaHuman characters.

Properties

Property             | Type         | Description
---------------------|--------------|------------------------------------------------------------
TargetMetaHumanActor | AActor*      | Reference to the MetaHuman actor (auto-discovered if empty)
FaceMeshSearchTag    | FString      | Substring used to find the face mesh component (default: "Face")
TargetFaceMeshName   | FName        | Exact-name override for the face mesh
bAutoWireToAnimBP    | bool         | Automatically inject the LipSyncGenerator (default: enabled)
InitialMood          | ELipSyncMood | Starting mood (Neutral/Happy/Sad/Angry)

Events

OnLipSyncStarted

Fired when lip sync processing begins.

OnLipSyncStopped

Fired when lip sync processing stops.

How It Works

  1. Audio Reception: Publisher receives audio from the server
  2. Audio Routing: Audio is passed to the LipSync Bridge
  3. Phoneme Analysis: RuntimeMetaHumanLipSync analyzes the audio
  4. Animation: Face mesh blendshapes are driven in real-time

Auto-Wiring

When bAutoWireToAnimBP is enabled (default), the component automatically:

  1. Finds the MetaHuman's Face mesh component
  2. Creates a RuntimeMetaHumanLipSync generator
  3. Injects it into the Animation Blueprint
  4. Starts processing when audio arrives

Setup Guide

Prerequisites

  • MetaHuman character in your level
  • RuntimeMetaHumanLipSync plugin enabled
  • Chimera Publisher component receiving audio

Quick Setup

  1. Add ChimeraLipSyncBridgeComponent to your actor
  2. Set TargetMetaHumanActor to your MetaHuman (or leave empty for auto-discovery)
  3. Keep bAutoWireToAnimBP enabled
  4. The component handles everything else automatically

Manual Setup

// In your Actor's BeginPlay
UChimeraLipSyncBridgeComponent* LipSyncBridge =
    FindComponentByClass<UChimeraLipSyncBridgeComponent>();
UChimeraPublisherComponent* Publisher =
    FindComponentByClass<UChimeraPublisherComponent>();

// Set the target MetaHuman before audio starts arriving
if (LipSyncBridge)
{
    LipSyncBridge->TargetMetaHumanActor = MyMetaHumanActor;
}

// Audio routing is automatic via the OnAudioDataReceived binding

Blueprint Setup

  1. Add Chimera LipSync Bridge to your actor
  2. In Details panel, set Target MetaHuman Actor
  3. Keep Auto Wire to AnimBP enabled
  4. The component automatically binds to the Publisher's OnAudioDataReceived

Troubleshooting

Lip sync not moving

  1. Verify MetaHuman has a valid Face mesh component
  2. Check that audio is being received (OnAudioDataReceived firing)
  3. Ensure RuntimeMetaHumanLipSync plugin is enabled
  4. Check Output Log for "LipSync" messages

Face mesh not found

  1. Set FaceMeshSearchTag to match your setup (default: "Face")
  2. Or use TargetFaceMeshName for exact match
  3. Check that MetaHuman Blueprint has the expected structure

Audio delay

The lip sync path itself adds minimal latency by design. If you notice delay:

  1. Check overall audio pipeline latency
  2. Verify sample rates match (48000 Hz recommended)
