Please use this identifier to cite or link to this item: https://doi.org/10.1109/TNNLS.2013.2249088
Title: Sparse representation classifier steered discriminative projection with applications to face recognition
Authors: Yang, J.
Chu, D. 
Zhang, L.
Xu, Y.
Yang, J.
Keywords: Dimensionality reduction
discriminant analysis
face recognition
feature extraction
sparse representation
Issue Date: 2013
Citation: Yang, J., Chu, D., Zhang, L., Xu, Y., Yang, J. (2013). Sparse representation classifier steered discriminative projection with applications to face recognition. IEEE Transactions on Neural Networks and Learning Systems 24 (7) : 1023-1035. ScholarBank@NUS Repository. https://doi.org/10.1109/TNNLS.2013.2249088
Abstract: The sparse representation-based classifier (SRC) has been developed and shows great potential for real-world face recognition. This paper presents a dimensionality reduction method that fits SRC well. Because SRC adopts a class reconstruction residual-based decision rule, we use this rule as a criterion to steer the design of a feature extraction method. The method is thus called SRC steered discriminative projection (SRC-DP). SRC-DP maximizes the ratio of between-class reconstruction residual to within-class reconstruction residual in the projected space, thereby enabling SRC to achieve better performance. SRC-DP provides a low-dimensional representation of human faces, making the SRC-based face recognition system more efficient. Experiments are conducted on the AR, extended Yale B, and PIE face image databases, and the results demonstrate that the proposed method is more effective than other SRC-based feature extraction methods. © 2012 IEEE.
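The class reconstruction residual decision rule that steers SRC-DP can be sketched as follows. This is a minimal illustration, not the authors' implementation: the l1 sparse-coding step of SRC is approximated here with ridge regression for brevity, and the toy data, function name, and regularization parameter are assumptions for the example.

```python
import numpy as np

def src_classify(X_train, y_train, x, lam=0.01):
    """Classify x by minimum class reconstruction residual (SRC-style rule).

    NOTE: the l1 sparse-coding step of the actual SRC is approximated
    here by ridge regression; this sketch only illustrates the
    residual-based decision rule described in the abstract.
    """
    A = X_train  # columns are training samples (the dictionary)
    # Coding coefficients over the whole dictionary (ridge stand-in for l1)
    alpha = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ x)
    residuals = {}
    for c in np.unique(y_train):
        mask = (y_train == c)
        # Reconstruct x using only class c's atoms and their coefficients
        x_hat = A[:, mask] @ alpha[mask]
        residuals[c] = np.linalg.norm(x - x_hat)
    # Decision rule: assign to the class with the smallest residual
    return min(residuals, key=residuals.get), residuals

# Toy data: two well-separated classes in 5-D (hypothetical example)
rng = np.random.default_rng(0)
c0 = rng.normal(loc=0.0, scale=0.1, size=(5, 4))
c1 = rng.normal(loc=3.0, scale=0.1, size=(5, 4))
X = np.hstack([c0, c1])
y = np.array([0] * 4 + [1] * 4)

label, res = src_classify(X, y, c1[:, 0] + rng.normal(scale=0.05, size=5))
print(label)  # query near class 1 is assigned label 1
```

SRC-DP's criterion then seeks a projection under which, in the projected space, residuals computed this way are small within the correct class and large across other classes.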
Source Title: IEEE Transactions on Neural Networks and Learning Systems
URI: http://scholarbank.nus.edu.sg/handle/10635/104178
ISSN: 2162-237X
DOI: 10.1109/TNNLS.2013.2249088
Appears in Collections: Staff Publications




Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.