ISCA Archive Eurospeech 1997

A hybrid image processing approach to liptracking independent of head orientation

Lionel Revéret, Frédérique Garcia, Christian Benoît, Eric Vatikiotis-Bateson

This paper examines the influence of head orientation on liptracking. There are two main conclusions: first, lip gesture analysis and head movement correction should be processed independently; second, the measurement of articulatory parameters may be corrupted by head movement if it is performed directly at the pixel level. We therefore propose an innovative liptracking technique that relies on a "3D active contour" model of the lips controlled by articulatory parameters. The 3D model is projected onto the image of a speaking face through a camera model, thus allowing spatial re-orientation of the head. Liptracking is then performed by automatic adjustment of the control parameters, independently of head orientation. The final objective of our study is to apply a pixel-based method to detect head orientation. Nevertheless, we consider that head motion and lip gestures are detected by different processes, whether cognitive (by humans) or computational (by machines). For this reason, we decided to first develop and evaluate orientation-free liptracking using a non video-based head motion detection technique, which is presented here.
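To make the separation of head pose and articulation concrete, the following is a minimal sketch (not the authors' implementation) of the idea described in the abstract: a parametric 3D lip contour is projected through a camera model that carries the head pose, and only the articulatory control parameters are adjusted to fit 2D lip evidence. The two-parameter lip model, the pinhole camera values, and the least-squares cost are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize


def lip_contour_3d(params, n_points=40):
    """Toy articulatory model: an ellipse in 3D controlled by lip width and
    lip aperture (the paper's model uses a richer set of parameters)."""
    width, aperture = params
    t = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    x = width * np.cos(t)
    y = aperture * np.sin(t)
    z = np.zeros_like(t)
    return np.stack([x, y, z], axis=1)           # (n_points, 3)


def project(points_3d, rotation, translation, focal=800.0, center=(320.0, 240.0)):
    """Pinhole camera model: apply head pose (rotation, translation), then
    perspective projection. Head orientation is handled here, separately
    from the articulatory parameters."""
    cam = points_3d @ rotation.T + translation   # world -> camera coordinates
    u = focal * cam[:, 0] / cam[:, 2] + center[0]
    v = focal * cam[:, 1] / cam[:, 2] + center[1]
    return np.stack([u, v], axis=1)              # (n_points, 2) pixel coordinates


def fit_cost(params, rotation, translation, observed_2d):
    """Distance between the projected model contour and observed lip points
    (a stand-in for the image-based cost an active contour would use)."""
    projected = project(lip_contour_3d(params), rotation, translation)
    return np.mean(np.sum((projected - observed_2d) ** 2, axis=1))


if __name__ == "__main__":
    # Hypothetical head pose, e.g. obtained from a non video-based sensor.
    angle = np.deg2rad(15.0)
    rotation = np.array([[np.cos(angle), 0.0, np.sin(angle)],
                         [0.0, 1.0, 0.0],
                         [-np.sin(angle), 0.0, np.cos(angle)]])
    translation = np.array([0.0, 0.0, 500.0])    # mm in front of the camera

    # Synthetic "observed" lip points generated with known parameters.
    true_params = np.array([30.0, 12.0])
    observed = project(lip_contour_3d(true_params), rotation, translation)

    # Liptracking step: only the articulatory parameters are adjusted; the head
    # pose stays fixed, so the estimate is independent of head orientation.
    result = minimize(fit_cost, x0=np.array([20.0, 5.0]),
                      args=(rotation, translation, observed), method="Nelder-Mead")
    print("recovered width/aperture:", result.x)
```

In this sketch the head pose enters only through the camera model, so re-running the optimization under a different rotation leaves the recovered articulatory parameters unchanged, which is the behavior the abstract argues for.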