Automatic tracking of the human face has been used for a variety of applications, such as identification and gesture recognition. We have developed a system which not only tracks the subject's face in real time, but also estimates where the subject is looking.
As people interact face-to-face on a daily basis they are constantly aware of where the other person's eyes are looking; indeed, eye contact is a crucial part of effective communication. A computer that can 'read' our eyes and tell what we are looking at is a definite step towards more sophisticated human-machine interaction.
A vision system capable of tracking the human face and estimating the person's gaze point in real time offers many possibilities for enhancing human-machine interaction. A key feature of such a system is its entirely non-intrusive nature, enabling people to be observed in their 'natural' state: it requires neither special lighting directed at the subject nor any devices worn by the subject.
Applications include:
The above image shows a video still of the subject's face with the tracking features indicated by white rectangles and the pupil positions by white circles. The pose is graphically represented by animating a computer-generated 3D model, as shown in the image below.
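The text does not describe the system's actual gaze algorithm, but the general idea of combining an estimated head pose with pupil positions can be sketched as follows. This is a hypothetical illustration only: the rotation convention, the assumed screen plane at z = 0, and the function names are all assumptions, not the authors' method.

```python
# Hypothetical sketch: turn an estimated head pose (yaw/pitch) and position
# into a gaze point on a screen plane. Not the actual system's algorithm.
import numpy as np

def rotation_yaw_pitch(yaw, pitch):
    """Rotation matrix: yaw about the y-axis, then pitch about the x-axis (radians)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Ry = np.array([[cy, 0.0, sy],
                   [0.0, 1.0, 0.0],
                   [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, cp, -sp],
                   [0.0, sp, cp]])
    return Ry @ Rx

def gaze_point_on_screen(head_pos, yaw, pitch, screen_z=0.0):
    """Intersect the gaze ray with the plane z = screen_z.

    The subject is assumed to look along -z when yaw = pitch = 0;
    in practice yaw/pitch would be refined by the tracked pupil offsets.
    """
    direction = rotation_yaw_pitch(yaw, pitch) @ np.array([0.0, 0.0, -1.0])
    t = (screen_z - head_pos[2]) / direction[2]
    return head_pos + t * direction

# Example: head 0.6 m in front of the screen, looking straight ahead --
# the gaze point lands at the screen origin.
p = gaze_point_on_screen(np.array([0.0, 0.0, 0.6]), 0.0, 0.0)
```

A real implementation would recover the head pose from the tracked facial features (e.g. by fitting a 3D face model to their image positions) rather than taking yaw and pitch as given.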
At present the system must be set up and calibrated for each individual whose face is to be tracked. We would like to extend the system so it can track anybody who passes in front of the camera. To achieve this, the following extensions need to be made to the system:
The initial implementation of this system was completed by Dr. Yoshio Matsumoto in 1998. Since then Dr. Rhys Newman and Dr. Sebastien Rougeaux have further developed and refined the system. Gareth Loy is currently working on an automatic initialisation process for the system.