The most common problems with lip sync in live scenarios are the lips moving when they should be still, the lips staying still when they should be moving, and poor synchronization between the audio and the mouth movement.

**Lips moving when they should be still.** The lip sync algorithm works best when it has a healthy signal without unwanted audio to analyze. This problem usually happens in a noisy environment, such as on a stage; it is less common in streaming scenarios, but can happen whenever unwanted audio enters the microphone. Some remedies:

- Isolate your voice talent with a sound booth.
- Have the voice talent use a microphone or headset with excellent off-axis rejection. The microphone "driving" the lip sync doesn't need to be the same microphone capturing the broadcast audio.
- Apply an audio gate and a high-pass filter to the microphone signal before it reaches Character Animator. Do not send this same processed audio to the final broadcast.
- Reduce the level of audio coming into the machine running Character Animator. This will reduce leakage of other voices into the microphone, but requires that you have set good input levels.
- Make the neutral mouth(s) triggerable so someone can mute the puppet's mouth when it shouldn't be talking.

**Lips not moving when they should be, or showing the incorrect viseme.** Things to try:

- Be ready to adjust the audio input gain on the fly if needed (e.g., the voice talent moves back from the microphone, or gets excited and starts yelling).
- Test to see if turning off the Camera & Microphone panel's Auto-enhance Audio Input option helps lip-sync accuracy.

**Poor synchronization between the audio and the mouth moving.** This problem is less noticeable and distracting, but still worth considering.

- The audio usually needs to be delayed in the software switcher (OBS / Wirecast), typically by a frame or two (around 60 milliseconds).
- Tune the delay by clapping into the microphone and visually lining up the clap sound with the mouth changing, as seen by a person watching at the other end (e.g., over Facebook Live).
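The gate-plus-high-pass-filter step would normally live in your audio interface, mixer, or a plugin chain rather than in code, but as a rough sketch of what those two stages do to the signal, here is a minimal Python example using NumPy/SciPy. The sample rate, cutoff frequency, gate threshold, and block size are illustrative assumptions, not Character Animator settings:

```python
import numpy as np
from scipy.signal import butter, lfilter

SAMPLE_RATE = 48_000    # Hz; common rate for streaming audio (assumption)
HPF_CUTOFF = 120.0      # Hz; removes rumble/handling noise below the voice
GATE_THRESHOLD = 0.02   # linear amplitude; quieter blocks get muted (assumption)

def high_pass(samples: np.ndarray, cutoff: float = HPF_CUTOFF,
              rate: int = SAMPLE_RATE) -> np.ndarray:
    """Second-order Butterworth high-pass filter."""
    b, a = butter(2, cutoff / (rate / 2), btype="highpass")
    return lfilter(b, a, samples)

def gate(samples: np.ndarray, threshold: float = GATE_THRESHOLD,
         block: int = 480) -> np.ndarray:
    """Crude block-based noise gate: silence any block whose RMS is below threshold."""
    out = samples.copy()
    for start in range(0, len(out), block):
        chunk = out[start:start + block]
        if np.sqrt(np.mean(chunk ** 2)) < threshold:
            out[start:start + block] = 0.0
    return out

# Synthetic example: a quiet noise floor with one louder "voiced" burst.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, SAMPLE_RATE, endpoint=False)
signal = 0.005 * rng.standard_normal(SAMPLE_RATE)        # room noise
signal[16_000:32_000] += 0.2 * np.sin(2 * np.pi * 220 * t[16_000:32_000])

cleaned = gate(high_pass(signal))
# The noise-only regions are gated to silence, so the lip sync algorithm
# sees mouth-worthy audio only where the "voice" actually is.
```

A real gate would add attack/release smoothing to avoid choppy transitions, but the principle is the same: the high-pass filter strips low-frequency noise the visemes shouldn't react to, and the gate silences everything below the speech level.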