Stereo 3D Lip Tracking

Authors: Gareth Loy, Roland Göcke, Sebastien Rougeaux, and Alexander Zelinsky

Presented by Gareth Loy at the 6th International Conference on Control, Automation, Robotics and Vision (ICARCV 2000), Singapore, 5-8 December 2000

Abstract

A system is presented that tracks a person's unadorned lips in 3D and outputs the 3D locations of the mouth corners together with ten points describing the outer lip contour. This output is suitable for audio-visual speech processing, 3D animation, or expression recognition. A stereo head tracker follows the subject's head, allowing robust performance while the subject's head moves and turns with respect to the cameras. The head pose is used in conjunction with novel adaptable templates to generate a robust estimate of the locations of the deforming mouth corners. A 3D geometric model generates search paths for key points on the outer lip contour, which are then located using adaptable templates and geometric constraints. The system is demonstrated robustly tracking the head pose and 3D mouth shape of a person speaking while moving his head.
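The abstract describes recovering 3D mouth-corner locations from matched points in a calibrated stereo pair. The sketch below is only a minimal illustration of that reconstruction step using plain linear (DLT) triangulation; the camera matrices, point coordinates, and all numeric values are hypothetical and are not taken from the paper.

```python
import numpy as np

def triangulate_point(P_left, P_right, x_left, x_right):
    """Linear (DLT) triangulation of one 3D point from a calibrated
    stereo pair. P_left, P_right are 3x4 camera projection matrices;
    x_left, x_right are the matched 2D image coordinates (u, v) of the
    same feature, e.g. a mouth corner found by template matching."""
    A = np.vstack([
        x_left[0]  * P_left[2]  - P_left[0],
        x_left[1]  * P_left[2]  - P_left[1],
        x_right[0] * P_right[2] - P_right[0],
        x_right[1] * P_right[2] - P_right[1],
    ])
    # The homogeneous 3D point is the least-squares null vector of A.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenise

# Synthetic calibration (hypothetical): left camera at the origin,
# right camera translated 60 mm along the x-axis.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
P_left  = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P_right = K @ np.hstack([np.eye(3), np.array([[-60.0], [0.0], [0.0]])])

# A mouth-corner-like 3D point, projected into both views and recovered.
X_true = np.array([30.0, -10.0, 500.0])
x_l = P_left  @ np.append(X_true, 1.0); x_l = x_l[:2] / x_l[2]
x_r = P_right @ np.append(X_true, 1.0); x_r = x_r[:2] / x_r[2]
print(triangulate_point(P_left, P_right, x_l, x_r))  # approx. [30, -10, 500]
```

In the same spirit, the ten outer lip contour points could be triangulated once their 2D positions along the model-generated search paths have been located in both views.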

Download (347k, PDF)


