|Title:||Convergent 2-D subspace learning with null space analysis|
|Keywords:||Multiview face recognition; Null space LDA|
|Source:||Xu, D., Yan, S., Lin, S., Huang, T.S. (2008-12). Convergent 2-D subspace learning with null space analysis. IEEE Transactions on Circuits and Systems for Video Technology 18 (12) : 1753-1759. ScholarBank@NUS Repository. https://doi.org/10.1109/TCSVT.2008.2005799|
|Abstract:||Recent research has demonstrated the success of the supervised dimensionality reduction algorithms 2DLDA and 2DMFA, which are based on the image-as-matrix representation, in small-sample-size cases. To solve the convergence problem in 2DLDA and 2DMFA, we propose in this work two new schemes, called Null Space based 2DLDA (NS2DLDA) and Null Space based 2DMFA (NS2DMFA), and apply them to the challenging multi-view face recognition task. First, we convert each 2-D face image (matrix) into a vector and compute the first projection matrix P1 from the null space of the intra-class scatter matrix, such that samples from the same class are projected to the same point. The data are then projected and reconstructed with P1. Finally, we reshape each reconstructed sample back into a matrix and compute the second projection direction P2, in the form of a Kronecker product of two matrices, by maximizing the inter-class scatter. A proof of algorithmic convergence is provided. Experiments on two benchmark multi-view face databases, the CMU PIE and FERET databases, demonstrate that NS2DLDA outperforms Fisherface, Null Space LDA (NSLDA), and 2DLDA. NS2DMFA is likewise shown to be more accurate than MFA and 2DMFA for face recognition. © 2008 IEEE.|
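The first step the abstract describes, computing P1 from the null space of the intra-class scatter matrix so that same-class samples collapse to a single point, can be sketched in NumPy. This is an illustrative reconstruction of that one step under our own variable names (X, labels, P1), not the paper's implementation or notation:

```python
import numpy as np

def null_space_projection(X, labels):
    """Find a basis P1 of the null space of the intra-class scatter
    matrix Sw; projecting onto P1 maps samples of the same class to
    the same point (illustrative sketch, not the paper's code)."""
    X = np.asarray(X, dtype=float)
    d = X.shape[1]
    # Intra-class scatter: sum of outer products of class-centered samples.
    Sw = np.zeros((d, d))
    for c in np.unique(labels):
        Xc = X[labels == c]
        Xc = Xc - Xc.mean(axis=0)
        Sw += Xc.T @ Xc
    # Sw is symmetric PSD, so its null space is spanned by the
    # eigenvectors with (numerically) zero eigenvalues.
    eigvals, eigvecs = np.linalg.eigh(Sw)
    tol = 1e-10 * max(eigvals.max(), 1.0)
    P1 = eigvecs[:, eigvals < tol]  # columns span the null space of Sw
    return P1

# Toy example: 4 samples in R^3, two classes.
X = np.array([[1., 0., 0.],
              [1., 1., 0.],
              [0., 0., 1.],
              [0., 1., 1.]])
y = np.array([0, 0, 1, 1])
P1 = null_space_projection(X, y)
Z = X @ P1
# Within each class, the projected points coincide.
assert np.allclose(Z[0], Z[1]) and np.allclose(Z[2], Z[3])
```

In the small-sample-size regime that motivates the paper, the null space of the intra-class scatter is non-trivial (here it is two-dimensional), which is exactly what makes this projection well defined.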
|Source Title:||IEEE Transactions on Circuits and Systems for Video Technology|
|Appears in Collections:||Staff Publications|