Creating Lip Sync for 3D Characters

For a project I’m working on, we need to create a bunch of lifelike 3D people who speak to each other. That means animating lip sync to a voiceover track. Since this is such a labor-intensive job, I did a little hunting around for applications that could make it easier.

First, I tried a quick test in Poser. I wasn’t super-impressed with the results because the facial morph would drift over time; check out this QuickTime sample. It could be that I just need to learn how to tweak the software (or the resulting keyframes), but the initial result didn’t seem close enough to warrant further exploration.
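For context, the basic job in any of these packages is to map phoneme timings from the voiceover track onto viseme (mouth-shape) morph keyframes. Here’s a minimal sketch in Python of that idea; it is not Poser’s actual API, and the phoneme table, viseme names, and timings are all made up for illustration. It also shows one reason a face can drift: if each new mouth shape is keyed in without keying the previous one back to zero, the morph weights pile up.

```python
# Hypothetical sketch: turning phoneme timings into viseme morph keyframes.
# Nothing here uses Poser's real API; names and data are invented for illustration.

from dataclasses import dataclass

# Simplified, made-up mapping of phonemes to viseme morph targets.
PHONEME_TO_VISEME = {
    "AA": "open",
    "M": "closed",
    "F": "f_v",
    "OW": "round",
}

@dataclass
class Keyframe:
    time: float    # seconds into the voiceover track
    morph: str     # viseme morph target name
    weight: float  # 0.0 = neutral, 1.0 = full mouth shape


def phonemes_to_keyframes(phonemes):
    """Convert (phoneme, start_time) pairs into morph keyframes.

    At each phoneme boundary the previous viseme is keyed back to 0.0 so
    weights never stack up -- accumulated weights are one common way a
    face slowly drifts away from its neutral pose.
    """
    keyframes = []
    previous_morph = None
    for phoneme, start in phonemes:
        morph = PHONEME_TO_VISEME.get(phoneme)
        if morph is None:
            continue  # skip phonemes we have no mouth shape for
        if previous_morph and previous_morph != morph:
            keyframes.append(Keyframe(start, previous_morph, 0.0))  # release old shape
        keyframes.append(Keyframe(start, morph, 1.0))               # hit new shape
        previous_morph = morph
    return keyframes


if __name__ == "__main__":
    # A short, made-up phoneme track ("mow"): timings in seconds.
    track = [("M", 0.00), ("OW", 0.12), ("M", 0.30)]
    for kf in phonemes_to_keyframes(track):
        print(f"{kf.time:0.2f}s  {kf.morph:>6}  weight={kf.weight}")
```

Real tools interpolate between these keys and blend multiple shapes at once, but the cleanup step (zeroing out the shape you just left) is the part I suspect my Poser test was missing.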

The Softimage site has several examples of 3D characters talking and showing emotion for their FaceRobot software. You can see the videos at http://www.softimage.com/products/facerobot/videos.aspx.

Perhaps the best of all the videos listed there is “Kitty Hunting: Side by Side”, mainly because the character demonstrates several different emotions and you can compare it directly to the real actor it was based on.

Very impressive, but even more impressive is what Image Metrics is doing with a regular video camera. Their proprietary software allows them to shoot ordinary video of an actor’s face (without any tracking markers) and map those facial movements onto a 3D character. They claim the result is the most realistic facial animation you can find.

For a video explaining what they do, check out this link: http://abcnews.go.com/Video/playerIndex?id=2644947

Their website is at http://www.image-metrics.com

I have no idea what it costs, but I’m looking into it.