Please use this identifier to cite or link to this item: https://doi.org/10.1109/ROMAN.2008.4600647
DC Field: Value
dc.title: Active affective facial analysis for human-robot interaction
dc.contributor.author: Ge, S.S.
dc.contributor.author: Samani, H.A.
dc.contributor.author: Ong, Y.H.J.
dc.contributor.author: Hang, C.C.
dc.date.accessioned: 2014-04-24T08:33:13Z
dc.date.available: 2014-04-24T08:33:13Z
dc.date.issued: 2008
dc.identifier.citation: Ge, S.S., Samani, H.A., Ong, Y.H.J., Hang, C.C. (2008). Active affective facial analysis for human-robot interaction. Proceedings of the 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN: 83-88. ScholarBank@NUS Repository. https://doi.org/10.1109/ROMAN.2008.4600647
dc.identifier.isbn: 9781424422135
dc.identifier.uri: http://scholarbank.nus.edu.sg/handle/10635/51101
dc.description.abstract: In this paper, we present an active vision system for human-robot interaction that includes robust face detection, tracking, recognition, and facial expression analysis. The system searches for human faces in view, zooms in on the face of interest based on the face recognition database, tracks it, and finally analyzes the emotion parameters of the face. After detection using Haar-cascade classifiers, the variable parameters of the camera are adjusted adaptively to track the subject's face with the Camshift algorithm and to extract the facial features used for face recognition and facial expression analysis. An Embedded Hidden Markov Model is used for face recognition, and a nonlinear facial mass-spring model is employed to describe the tension of the facial muscles. The resulting motion signatures are then classified with Multi-layer Perceptrons for facial expression analysis. This system can serve as a comprehensive and robust vision package for a robot to interact with human beings. © 2008 IEEE.
dc.description.uri: http://libproxy1.nus.edu.sg/login?url=http://dx.doi.org/10.1109/ROMAN.2008.4600647
dc.source: Scopus
dc.subject: Emotion recognition
dc.subject: Facial expression recognition
dc.subject: Facial feature analysis
dc.subject: Human-robot interaction
dc.subject: Social robot
dc.type: Conference Paper
dc.contributor.department: DIVISION OF ENGINEERING AND TECH MGT
dc.contributor.department: ELECTRICAL & COMPUTER ENGINEERING
dc.description.doi: 10.1109/ROMAN.2008.4600647
dc.description.sourcetitle: Proceedings of the 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN
dc.description.page: 83-88
dc.identifier.isiut: NOT_IN_WOS
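
The detection-and-tracking stage described in the abstract (Haar-cascade face detection followed by Camshift tracking of the detected face region) can be illustrated with a minimal OpenCV sketch. This is not the authors' implementation: it assumes the haarcascade_frontalface_default.xml model bundled with the opencv-python package and the default camera, and it omits the active camera control, Embedded HMM recognition, mass-spring modeling, and MLP classification stages.

# Minimal sketch: Haar-cascade detection, then CamShift tracking of the face.
# Illustrative only; NOT the authors' code from the paper.
import cv2
import numpy as np

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)          # default camera (assumption)
track_window = None                # (x, y, w, h) of the face being tracked
roi_hist = None                    # hue histogram used by CamShift
term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

    if track_window is None:
        # Detection phase: Haar cascade on the grayscale frame.
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) > 0:
            x, y, w, h = faces[0]
            track_window = (x, y, w, h)
            roi = hsv[y:y + h, x:x + w]
            roi_hist = cv2.calcHist([roi], [0], None, [180], [0, 180])
            cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)
    else:
        # Tracking phase: back-project the face histogram and run CamShift.
        back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
        rot_rect, track_window = cv2.CamShift(back_proj, track_window, term_crit)
        pts = cv2.boxPoints(rot_rect).astype(np.int32)
        cv2.polylines(frame, [pts], True, (0, 255, 0), 2)

    cv2.imshow("face tracking", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()

In this sketch, detection runs only until a face is found; thereafter the hue histogram of the detected region drives CamShift, which is the same hand-off order the abstract describes.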
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.
