Please use this identifier to cite or link to this item: https://doi.org/10.1109/TIP.2006.875225
Title: Face recognition using recursive Fisher linear discriminant
Authors: Xiang, C. 
Fan, X.-A.
Lee, T.H. 
Keywords: Face recognition
Feature extraction
Fisher linear discriminant (FLD)
Principal component analysis (PCA)
Recursive Fisher linear discriminant (RFLD)
Issue Date: Aug-2006
Citation: Xiang, C., Fan, X.-A., Lee, T.H. (2006-08). Face recognition using recursive fisher linear discriminant. IEEE Transactions on Image Processing 15 (8) : 2097-2105. ScholarBank@NUS Repository. https://doi.org/10.1109/TIP.2006.875225
Abstract: Fisher linear discriminant (FLD) has recently emerged as a more efficient approach than traditional principal component analysis for extracting features in many pattern classification problems. However, the constraint on the total number of features available from FLD has seriously limited its application to a large class of problems. To overcome this disadvantage, a recursive procedure for calculating the discriminant features is proposed in this paper. The new algorithm retains the fundamental idea behind FLD of seeking the projection that best separates the data belonging to different classes, but, in contrast to FLD, the number of features that may be derived is independent of the number of classes to be recognized. Extensive experiments comparing the new algorithm with traditional approaches have been carried out on the face recognition problem using the Yale database, and the improvement in performance achieved by the new feature extraction scheme is significant. © 2006 IEEE.
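The recursive idea the abstract describes can be illustrated with a short sketch: compute one Fisher discriminant direction, deflate the data along it, and repeat, so the number of extracted features is no longer capped at (number of classes − 1). This is a minimal illustration of the general recursive-deflation scheme, not the paper's exact algorithm; the function names, the small ridge term added to the within-class scatter, and the deflation step are assumptions for the sketch.

```python
import numpy as np

def fld_direction(X, y, ridge=1e-6):
    """Leading Fisher direction: argmax of w^T Sb w / w^T Sw w.

    A small ridge keeps Sw invertible (an assumption of this sketch;
    the paper may instead reduce dimension, e.g., via PCA, first).
    """
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    # Generalized eigenproblem via Sw^{-1} Sb; take the top eigenvector.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + ridge * np.eye(d), Sb))
    w = np.real(evecs[:, np.argmax(np.real(evals))])
    return w / np.linalg.norm(w)

def recursive_fld(X, y, n_features):
    """Hypothetical recursive scheme: after each direction is found,
    remove the data's component along it and solve again, so more
    than (n_classes - 1) discriminant features can be extracted."""
    Xr = X.copy()
    W = []
    for _ in range(n_features):
        w = fld_direction(Xr, y)
        W.append(w)
        Xr = Xr - np.outer(Xr @ w, w)  # deflate: project out direction w
    return np.column_stack(W)
```

With two classes, plain FLD yields at most one feature; the deflation loop above can keep producing directions, which is the property the abstract emphasizes.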
Source Title: IEEE Transactions on Image Processing
URI: http://scholarbank.nus.edu.sg/handle/10635/50921
ISSN: 1057-7149
DOI: 10.1109/TIP.2006.875225
Appears in Collections:Staff Publications



