Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/146378
Title: Principal component analysis for facial animation
Authors: Goudeaux K.
Chen T. 
Wang S.-W.
Liu J.-D.
Issue Date: 2001
Citation: Goudeaux K., Chen T., Wang S.-W., Liu J.-D. (2001). Principal component analysis for facial animation. ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings 3 : 1501-1504. ScholarBank@NUS Repository.
Abstract: This paper presents a technique for animating a three-dimensional face model through the application of Principal Component Analysis (PCA). Using PCA has several advantages over traditional approaches to facial animation because it reduces the number of parameters needed to describe a face and confines the facial motion to a valid space to prevent unnatural contortions. First, real data is optically captured in real time from a human subject using infrared cameras and reflective trackers. This data is analyzed to find a mean face and a set of eigenvectors and eigenvalues that are used to perturb the mean face within the range described by the captured data. The result is a set of vectors that can be linearly combined and interpolated to represent different facial expressions and animations. We also show that it is possible to map the eigenvectors of one face onto another face or to change the eigenvectors to describe new motion.
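The pipeline the abstract describes (capture marker data, compute a mean face and eigenvectors, then perturb the mean within the captured range) can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the marker data here is random stand-in data, and the dimensions and component count are hypothetical.

```python
# PCA-based face synthesis sketch: mean face + eigenvector perturbation.
# Stand-in data replaces the infrared motion capture used in the paper.
import numpy as np

rng = np.random.default_rng(0)
n_frames, n_markers = 200, 30                   # hypothetical capture dimensions
X = rng.normal(size=(n_frames, 3 * n_markers))  # each row: flattened 3-D marker positions

# Mean face: the average marker configuration over all captured frames.
mean_face = X.mean(axis=0)

# PCA via SVD of the mean-centered data.
Xc = X - mean_face
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
eigenvalues = (S ** 2) / (n_frames - 1)  # variance along each principal direction
eigenvectors = Vt                        # rows are unit-length eigenvectors

# Keep only the leading components: far fewer parameters than raw marker positions.
k = 5
basis = eigenvectors[:k]
scales = np.sqrt(eigenvalues[:k])

# Synthesize an expression by perturbing the mean face along the basis.
# Bounding the weights by the captured standard deviations keeps the
# motion inside the "valid space" the abstract refers to.
weights = rng.uniform(-2.0, 2.0, size=k) * scales
new_face = mean_face + weights @ basis
```

Interpolating between two weight vectors then yields smooth transitions between expressions, since every intermediate face is itself a linear combination of the eigenvectors.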
Source Title: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
URI: http://scholarbank.nus.edu.sg/handle/10635/146378
ISSN: 1520-6149
Appears in Collections:Staff Publications

Files in This Item:
There are no files associated with this item.