Saturday, April 27, 2013

Lip Sync with Blender and Papagayo

 

Papagayo is an open source, freely redistributable lip sync tool by Lost Marble. Lip syncing is the most boring thing you can do by hand when animating... so I have created a Python script that takes .dat files exported from Papagayo and imports them as position keyframes on a control object, which in turn drives (using drivers ;) the morph targets of a face animation rig.
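
In rough outline, the import boils down to reading frame/phoneme pairs from the .dat file and keying the control object's location. This is just a minimal sketch, not the shipped script; the phoneme-to-offset table below is made up for illustration:

    import bpy

    # Hypothetical phoneme-to-offset table: the real script maps the full
    # Preston Blair mouth-shape set to positions on the XY control.
    PHONEME_OFFSETS = {
        "rest": (0.0, 0.0),
        "MBP": (0.0, -1.0),
        "AI": (1.0, 0.0),
        "O": (0.0, 1.0),
    }

    def import_dat(filepath, ctrl):
        with open(filepath) as f:
            lines = f.read().splitlines()
        if not lines or not lines[0].startswith("MohoSwitch"):
            raise ValueError("not a Papagayo/Moho export")
        for line in lines[1:]:
            if not line.strip():
                continue
            frame, phoneme = line.split()
            x, z = PHONEME_OFFSETS.get(phoneme, (0.0, 0.0))
            ctrl.location.x = x
            ctrl.location.z = z
            ctrl.keyframe_insert(data_path="location", frame=int(frame))

    # e.g. with the mouth control selected as the active object:
    import_dat("/path/to/speech.dat", bpy.context.active_object)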

The linked zip file has the script, a slightly modified version of Papagayo (with my face rig images instead of the stock distro versions), and a blend file with my full (simple and low poly) facial rig. See the included readme.txt for a step-by-step guide to using this script.


The Papagayo version included here is modified only by the addition of a new set of default 'face shape' images to match my Blender face rig.

The included blend file, Face Rig - rev004.blend, has the facial animation rig I use on all my characters, applied here to a slightly modified Suzanne monkey head. The face rig is obviously low poly and simple (I like to avoid the 'uncanny valley' effect); however, this rig and script could be adapted to a more complex face rig if you were so inclined.

  • Step 1: Install the Papagayo application.
  • Step 2: Learn how to use Papagayo to sync up text to a sound .wav file, then export a .dat file from Papagayo.
  • Step 3: Copy the Papagayo_Import.py script to Blender's add-ons directory.
  • Step 4: Activate the script in the Add-ons tab of the User Preferences window. You'll see a little control box appear in the 3D View tool shelf (press "T" to bring it up).
  • Step 5: Load the Face Rig .blend file and select the "plus" control called "face.mouth.ctrl".
  • Step 6: Pick a .dat file (saved from Papagayo above; a sample of the format is shown after this list) and click "Process Input File".
  • Step 7: A bunch of keyframes should appear in the timeline. Scrub back and forth to see the keyframes in action. You can load your .wav file into Blender at this point to hear the audio synced to the mouth (but this isn't necessary for the script to function).
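
For reference, the .dat file Papagayo exports is plain text in the Moho switch format: a header line followed by frame/phoneme pairs, something like:

    MohoSwitch1
    1 rest
    7 MBP
    10 AI
    16 O
    24 rest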

6 comments:

Anonymous said...

I am having problems with your importer. I can get it to shake the character about, but not move his mouth in a lip sync fashion. Any suggestions?

David T. Krupicz said...

You need to have the mouth control object (the "plus" shaped crosshairs object in the mouth XY control) selected before you hit the 'apply' button in the script.

Anonymous said...

Thanks for the quick reply; I don't think I made my problem clear, though. I wanted the importer to apply the shape keys I had made to the rig, on the keyframe timeline, for animation. What it appears to do is apply an X and Z location offset to whatever is selected, not the actual "AI O E U" etc. shape keys. I can get your rig to animate the crosshairs OK, and they do jump to the correct phoneme letters. It's as though the script is saying "if the input file asks for the MBP phoneme, apply an offset to the X-Z location of this magnitude" (which is where the letters MBP will be found on the screen).

David T. Krupicz said...

Ah, you have to link the XY control to the morph targets using drivers. For my mouth shape (kept very simple...) I have 2 morph targets (aka shape keys in Blender...), which I can push with positive or negative influence; the position of the XY control drives these shape keys. I mapped the position of the XY control to the set of shapes that Papagayo exports, as best I could.

If you have more than two shape keys for your model then it might not work too well... This is why I keep my face rig as simple as possible...
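
As a rough sketch of what that linking looks like in script form (the object and shape key names here are placeholders; see the blend file for the actual setup), a driver can read the control's local X position and feed it to a shape key:

    import bpy

    head = bpy.data.objects["Suzanne"]          # the face mesh (placeholder name)
    ctrl = bpy.data.objects["face.mouth.ctrl"]  # the XY mouth control

    # Drive a "wide" shape key (placeholder name) from the control's local X.
    fcu = head.data.shape_keys.key_blocks["wide"].driver_add("value")
    drv = fcu.driver
    drv.type = 'SCRIPTED'
    var = drv.variables.new()
    var.name = "x"
    var.type = 'TRANSFORMS'
    var.targets[0].id = ctrl
    var.targets[0].transform_type = 'LOC_X'
    var.targets[0].transform_space = 'LOCAL_SPACE'
    drv.expression = "max(x, 0.0)"  # positive X only; negative X can drive the opposite shape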

Anonymous said...

That's a bit of a problem, then: I have 9 shapes corresponding to the standard mouth shapes. How would you manage with only one or two? "Link the XY control etc." is Japanese to me, I'm afraid. Perhaps a step-by-step idiot sheet included in your readme.txt might be a good plan; I managed to follow all those instructions easily, and was hoping the result would be an animated monkey head speaking the loaded Papagayo file. As you might have gathered, my skill level is minimal at the moment. Best regards, Norm

David T. Krupicz said...

9 shape keys, eh? If you wanted to use my method then you'd have to use all 3 position axes (X, Y, Z) as well as two of the rotation axes for the control object. 'Linking' the position of this to the shape keys is done with drivers: you set up a driver which takes the distance between a stationary object and the control object, and uses that to tell the shape key how much influence (0-100 pct.) it should have. See my blend file to see how I set up the drivers; it's a bit of intermediate-level Blender work, I'll admit, but not too difficult to master once you play around with it a bit. Save yourself some headaches and use whole blender units (0..1), which map directly to the shape keys' 0..1 range, instead of the complicated fractions I used...
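
Roughly, that distance-based driver looks like this in script form (object and shape key names are placeholders):

    import bpy

    head = bpy.data.objects["Suzanne"]          # the face mesh (placeholder name)
    ctrl = bpy.data.objects["face.mouth.ctrl"]  # the moving control
    anchor = bpy.data.objects["mouth.anchor"]   # a stationary reference object (placeholder)

    fcu = head.data.shape_keys.key_blocks["open"].driver_add("value")
    drv = fcu.driver
    drv.type = 'SCRIPTED'
    var = drv.variables.new()
    var.name = "dist"
    var.type = 'LOC_DIFF'  # measures the distance between the two targets
    var.targets[0].id = ctrl
    var.targets[1].id = anchor
    drv.expression = "min(dist, 1.0)"  # one blender unit of travel = full (100 pct.) influence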