Please use this identifier to cite or link to this item: https://doi.org/10.1016/j.neucom.2009.12.032
Title: Principal component analysis based on non-parametric maximum entropy
Authors: He, R.
Hu, B.
Yuan, X. 
Zheng, W.-S.
Keywords: Entropy
Information theoretic learning
PCA
Subspace learning
Issue Date: Jun-2010
Citation: He, R., Hu, B., Yuan, X., Zheng, W.-S. (2010-06). Principal component analysis based on non-parametric maximum entropy. Neurocomputing 73 (10-12) : 1840-1852. ScholarBank@NUS Repository. https://doi.org/10.1016/j.neucom.2009.12.032
Abstract: In this paper, we propose an improved principal component analysis based on maximum entropy (MaxEnt) preservation, called MaxEnt-PCA, which is derived from a Parzen window estimation of Renyi's quadratic entropy. Instead of minimizing an L2-norm or L1-norm reconstruction error, MaxEnt-PCA attempts to preserve as much of the data's uncertainty, as measured by entropy, as possible. The optimal solution of MaxEnt-PCA consists of the eigenvectors of a Laplacian probability matrix corresponding to the MaxEnt distribution. MaxEnt-PCA (1) is rotation invariant, (2) is free from any distributional assumption, and (3) is robust to outliers. Extensive experiments on real-world datasets demonstrate the effectiveness of the proposed linear method compared with other related robust PCA methods. © 2010 Elsevier B.V.
Source Title: Neurocomputing
URI: http://scholarbank.nus.edu.sg/handle/10635/82929
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2009.12.032
Appears in Collections:Staff Publications
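The abstract describes MaxEnt-PCA as finding eigenvectors of a Laplacian probability matrix derived from a Parzen-window estimate of Renyi's quadratic entropy of the projected data. A minimal illustrative sketch of that idea follows; the function name, the Gaussian kernel bandwidth, the fixed-point update, and the choice of top-d eigenvectors are assumptions for illustration, not the authors' published algorithm (see the paper at the DOI above for the actual derivation).

```python
import numpy as np

def maxent_pca(X, d, sigma=1.0, n_iter=20, seed=0):
    """Hypothetical sketch of a MaxEnt-PCA-style iteration.

    Alternates between (a) a Gaussian Parzen-window affinity of the
    currently projected data (related to Renyi's quadratic entropy
    estimate) and (b) eigenvectors of the resulting Laplacian-weighted
    scatter matrix. Details differ from He et al. (2010).
    """
    n, D = X.shape
    Xc = X - X.mean(axis=0)                  # center the data
    rng = np.random.default_rng(seed)
    W, _ = np.linalg.qr(rng.standard_normal((D, d)))  # random orthonormal init
    for _ in range(n_iter):
        Y = Xc @ W                           # project to current subspace
        sq = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        K = np.exp(-sq / (2.0 * sigma**2))   # Gaussian Parzen kernel
        P = K / K.sum()                      # normalized probability matrix
        L = np.diag(P.sum(axis=1)) - P       # graph Laplacian of P
        S = Xc.T @ L @ Xc                    # Laplacian-weighted scatter
        _, vecs = np.linalg.eigh(S)          # eigenvectors, ascending order
        W = vecs[:, -d:]                     # keep top-d eigenvectors
    return W
```

Because each update takes eigenvectors of a symmetric matrix, the returned projection matrix is orthonormal by construction, mirroring the rotation-invariance property claimed in the abstract.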

