|Title:||Active affective facial analysis for human-robot interaction|
|Authors:||Ge, S.S.; Samani, H.A.; Ong, Y.H.J.; Hang, C.C.|
|Keywords:||Facial expression recognition; Facial feature analysis|
|Source:||Ge, S.S., Samani, H.A., Ong, Y.H.J., Hang, C.C. (2008). Active affective facial analysis for human-robot interaction. Proceedings of the 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN: 83-88. ScholarBank@NUS Repository. https://doi.org/10.1109/ROMAN.2008.4600647|
|Abstract:||In this paper, we present an active vision system for human-robot interaction that includes robust face detection, tracking, recognition, and facial expression analysis. The system searches for human faces in view, zooms in on the face of interest based on the face recognition database, tracks it, and finally analyzes the emotion parameters on the face. After detection using Haar-cascade classifiers, the variable parameters of the camera are changed adaptively to track the subject's face by employing the Camshift algorithm, and to extract the facial features used for face recognition and facial expression analysis. An Embedded Hidden Markov Model is used for face recognition, and a nonlinear facial mass-spring model is employed to describe the tension of the facial muscles. The motion signatures are then classified using Multi-layer Perceptrons for facial expression analysis. This system can be used as a comprehensive and robust vision package for a robot to interact with human beings. © 2008 IEEE.|
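The abstract states that facial muscle tension is described with a nonlinear mass-spring model; the paper's exact formulation is not reproduced in this record. As a hedged illustration only, the sketch below shows one common form of such a model: a damped spring on a facial feature point whose restoring force includes a cubic (nonlinear) stiffness term. All names and parameter values here are assumptions for illustration, not the authors' actual equations.

```python
# Illustrative sketch (assumption): a single facial feature point modeled as a
# nonlinear damped mass-spring, integrated with semi-implicit Euler steps.
# Parameters k, k3, c, m, dt are arbitrary example values, not from the paper.

def spring_step(x, v, rest, k=10.0, k3=5.0, c=2.0, m=1.0, dt=0.01):
    """Advance one time step of a nonlinear damped spring.

    x, v  : displacement and velocity of the feature point
    rest  : rest position (neutral expression)
    k, k3 : linear and cubic stiffness (the cubic term is the nonlinearity)
    c, m  : damping coefficient and mass; dt: time step
    """
    stretch = x - rest
    # Nonlinear restoring force plus viscous damping
    force = -k * stretch - k3 * stretch**3 - c * v
    v = v + (force / m) * dt   # update velocity first (semi-implicit Euler)
    x = x + v * dt             # then position, using the new velocity
    return x, v

def settle(x0, rest, steps=5000):
    """Relax a displaced feature point back toward its rest position."""
    x, v = x0, 0.0
    for _ in range(steps):
        x, v = spring_step(x, v, rest)
    return x

# A feature point displaced by an expression (e.g. a raised mouth corner)
# relaxes back toward the neutral position:
final = settle(x0=1.0, rest=0.0)
```

In a model of this kind, the instantaneous spring tension (the `force` term) during tracked feature motion would serve as the muscle-tension signal whose motion signatures are fed to the Multi-layer Perceptron classifier.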
|Source Title:||Proceedings of the 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN|
|Appears in Collections:||Staff Publications|