Please use this identifier to cite or link to this item:
https://doi.org/10.1109/ROMAN.2008.4600647
DC Field | Value
---|---
dc.title | Active affective facial analysis for human-robot interaction
dc.contributor.author | Ge, S.S.
dc.contributor.author | Samani, H.A.
dc.contributor.author | Ong, Y.H.J.
dc.contributor.author | Hang, C.C.
dc.date.accessioned | 2014-04-24T08:33:13Z
dc.date.available | 2014-04-24T08:33:13Z
dc.date.issued | 2008
dc.identifier.citation | Ge, S.S., Samani, H.A., Ong, Y.H.J., Hang, C.C. (2008). Active affective facial analysis for human-robot interaction. Proceedings of the 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN: 83-88. ScholarBank@NUS Repository. https://doi.org/10.1109/ROMAN.2008.4600647
dc.identifier.isbn | 9781424422135
dc.identifier.uri | http://scholarbank.nus.edu.sg/handle/10635/51101
dc.description.abstract | In this paper, we present an active vision system for human-robot interaction that includes robust face detection, tracking, recognition and facial expression analysis. The system searches for human faces in view, zooms in on the face of interest based on the face recognition database, tracks it and finally analyzes the emotion parameters of the face. After detection using Haar-cascade classifiers, the variable parameters of the camera are adapted to track the face of the subject with the Camshift algorithm and to extract the facial features used for face recognition and facial expression analysis. An Embedded Hidden Markov Model is used for face recognition, and a nonlinear facial mass-spring model is employed to describe the tension of the facial muscles. The motion signatures are then classified using Multi-layer Perceptrons for facial expression analysis. This system can be used as a comprehensive and robust vision package for a robot to interact with human beings. © 2008 IEEE.
dc.description.uri | http://libproxy1.nus.edu.sg/login?url=http://dx.doi.org/10.1109/ROMAN.2008.4600647
dc.source | Scopus
dc.subject | Emotion recognition
dc.subject | Facial expression recognition
dc.subject | Facial feature analysis
dc.subject | Human-robot interaction
dc.subject | Social robot
dc.type | Conference Paper
dc.contributor.department | DIVISION OF ENGINEERING AND TECH MGT
dc.contributor.department | ELECTRICAL & COMPUTER ENGINEERING
dc.description.doi | 10.1109/ROMAN.2008.4600647
dc.description.sourcetitle | Proceedings of the 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN
dc.description.page | 83-88
dc.identifier.isiut | NOT_IN_WOS
Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.
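For readers who want to experiment with the pipeline outlined in the abstract, the sketch below covers only its first two stages, Haar-cascade face detection followed by Camshift tracking, using OpenCV in Python. It is an illustrative approximation rather than the authors' implementation: the cascade file located via cv2.data.haarcascades, the webcam index, and the HSV mask thresholds are assumptions, and the active camera control, Embedded Hidden Markov Model recognition, mass-spring model and Multi-layer Perceptron classifier from the paper are not reproduced here.

```python
# Minimal sketch: Haar-cascade detection to find a face, then Camshift to track it.
# Assumes the opencv-python package (which ships the Haar cascade files) and a webcam.
import cv2
import numpy as np

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)  # plain webcam here; the paper uses an active camera
term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
track_window, roi_hist = None, None

while True:
    ok, frame = cap.read()
    if not ok:
        break

    if track_window is None:
        # Stage 1: Haar-cascade face detection to initialise the tracker.
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces):
            x, y, w, h = (int(v) for v in faces[0])
            track_window = (x, y, w, h)
            # Build a hue histogram of the face region for back-projection.
            hsv_roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
            mask = cv2.inRange(hsv_roi, np.array((0., 60., 32.)),
                               np.array((180., 255., 255.)))
            roi_hist = cv2.calcHist([hsv_roi], [0], mask, [180], [0, 180])
            cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)
    else:
        # Stage 2: Camshift tracking on the hue back-projection of each frame.
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
        rot_rect, track_window = cv2.CamShift(back_proj, track_window, term_crit)
        pts = np.int32(cv2.boxPoints(rot_rect))
        cv2.polylines(frame, [pts], True, (0, 255, 0), 2)

    cv2.imshow("face tracking", frame)
    if cv2.waitKey(30) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

In the paper this tracked face region would then be passed on to face recognition and facial expression analysis; those stages are omitted from the sketch above.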