Please use this identifier to cite or link to this item:
https://doi.org/10.1016/S0167-8655(02)00079-X
Title: A hybrid approach of NN and HMM for facial emotion classification
Authors: Hu, T.; De Silva, L.C.; Sengupta, K.
Keywords: Facial emotion classification; Gabor wavelets; Hidden Markov Model; Neural network
Issue Date: Sep-2002
Citation: Hu, T., De Silva, L.C., Sengupta, K. (2002-09). A hybrid approach of NN and HMM for facial emotion classification. Pattern Recognition Letters 23 (11): 1303-1310. ScholarBank@NUS Repository. https://doi.org/10.1016/S0167-8655(02)00079-X
Abstract: Neural networks (NNs) are often combined with hidden Markov models (HMMs) in speech recognition to achieve superior performance. In this paper, the same hybrid approach is applied to facial emotion classification. Gabor wavelets are used to extract features from difference images obtained by subtracting the first frame, which shows a frontal face, from the current frame. The NN, a multilayer perceptron (MLP), classifies each feature vector into one of the states of an HMM modelling an emotion sequence: neutral, intermediate, and peak. In addition to the standard 1-0 targets for the NN, a heuristic strategy of assigning variable targets 1-x-0 is also applied. After training, the NN outputs are interpreted as posterior probabilities of the HMM states, and the Viterbi algorithm is applied directly to these values to estimate the best state path. Experiments show that the HMM gives better results with variable NN targets than with 1-0 targets; the best results are obtained for x = 0.8 in 1-x-0. © 2002 Elsevier Science B.V. All rights reserved.
Source Title: Pattern Recognition Letters
URI: http://scholarbank.nus.edu.sg/handle/10635/54259
ISSN: 0167-8655
DOI: 10.1016/S0167-8655(02)00079-X
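The decoding step described in the abstract, treating the per-frame MLP outputs as posterior probabilities of the three HMM states and running the Viterbi algorithm over them, can be illustrated with a minimal sketch. The left-to-right transition matrix, the initial distribution, and the example frame posteriors below are illustrative assumptions, not values taken from the paper.

# A minimal sketch (not the authors' code) of decoding an emotion sequence by
# applying the Viterbi algorithm to per-frame MLP outputs, which are treated as
# posterior probabilities of the three HMM states (neutral, intermediate, peak).
# The transition matrix, initial distribution, and example posteriors are
# illustrative assumptions only.
import numpy as np

STATES = ["neutral", "intermediate", "peak"]

# Assumed left-to-right topology: each state either stays put or advances.
TRANS = np.array([
    [0.6, 0.4, 0.0],   # neutral      -> neutral / intermediate
    [0.0, 0.6, 0.4],   # intermediate -> intermediate / peak
    [0.0, 0.0, 1.0],   # peak stays peak
])
INIT = np.array([1.0, 0.0, 0.0])   # sequences are assumed to start at neutral


def viterbi(posteriors, trans=TRANS, init=INIT, eps=1e-12):
    """Return the most likely state path for a (T, 3) array of per-frame
    MLP outputs, used directly as state scores as the abstract describes."""
    T, S = posteriors.shape
    log_delta = np.log(init + eps) + np.log(posteriors[0] + eps)
    backptr = np.zeros((T, S), dtype=int)

    for t in range(1, T):
        # score of reaching each state j from every possible predecessor i
        scores = log_delta[:, None] + np.log(trans + eps)
        backptr[t] = scores.argmax(axis=0)
        log_delta = scores.max(axis=0) + np.log(posteriors[t] + eps)

    # trace the best path back from the final frame
    path = [int(log_delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(backptr[t, path[-1]]))
    return [STATES[s] for s in reversed(path)]


if __name__ == "__main__":
    # Fabricated per-frame MLP outputs for a short neutral -> peak sequence.
    frame_posteriors = np.array([
        [0.9, 0.1, 0.0],
        [0.6, 0.3, 0.1],
        [0.2, 0.6, 0.2],
        [0.1, 0.4, 0.5],
        [0.0, 0.2, 0.8],
    ])
    print(viterbi(frame_posteriors))

Constraining the transitions to the left-to-right neutral to intermediate to peak structure is what allows the HMM to smooth noisy per-frame NN decisions into a consistent emotion trajectory.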
Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.