Please use this identifier to cite or link to this item: https://doi.org/10.1109/TIT.2012.2212415
Title: Outlier-robust PCA: The high-dimensional case
Authors: Xu, H.; Caramanis, C.; Mannor, S.
Keywords: Dimension reduction; outlier; principal component analysis (PCA); robustness; statistical learning
Issue Date: 2013
Citation: Xu, H., Caramanis, C., Mannor, S. (2013). Outlier-robust PCA: The high-dimensional case. IEEE Transactions on Information Theory, 59(1): 546-572. ScholarBank@NUS Repository. https://doi.org/10.1109/TIT.2012.2212415
Abstract: Principal component analysis plays a central role in statistics, engineering, and science. Because of the prevalence of corrupted data in real-world applications, much research has focused on developing robust algorithms. Perhaps surprisingly, these algorithms are unequipped (indeed, unable) to deal with outliers in the high-dimensional setting where the number of observations is of the same magnitude as the number of variables of each observation, and the dataset contains some (arbitrarily) corrupted observations. We propose a high-dimensional robust principal component analysis algorithm that is efficient, robust to contaminated points, and easily kernelizable. In particular, our algorithm achieves maximal robustness: it has a breakdown point of 50% (the best possible), while all existing algorithms have a breakdown point of zero. Moreover, our algorithm recovers the optimal solution exactly in the case where the number of corrupted points grows sublinearly in the dimension. © 2012 IEEE.
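Note: The abstract's claim that classical PCA has a breakdown point of zero can be seen with a short numerical sketch. The Python snippet below is illustrative only (it is not the authors' algorithm from the paper); all variable names, dimensions, and parameters are assumptions chosen for the demonstration. It shows a single arbitrarily placed observation redirecting the leading principal component away from the true direction of variation.

    # Minimal sketch: one adversarial outlier breaks classical PCA.
    # This is NOT the paper's high-dimensional robust PCA algorithm;
    # it only illustrates the "breakdown point of zero" statement.
    import numpy as np

    rng = np.random.default_rng(0)

    n, p = 200, 50                    # observations and variables (comparable magnitude)
    true_direction = np.zeros(p)
    true_direction[0] = 1.0           # clean data varies mostly along the first axis

    # Clean samples: strong variance along true_direction plus small isotropic noise.
    clean = rng.normal(size=(n, 1)) * 5.0 @ true_direction[None, :] \
            + 0.1 * rng.normal(size=(n, p))

    def leading_pc(X):
        """Top principal component of the centered data matrix X."""
        Xc = X - X.mean(axis=0)
        _, _, vt = np.linalg.svd(Xc, full_matrices=False)
        return vt[0]

    # A single corrupted observation, placed far out along an orthogonal axis.
    outlier = np.zeros((1, p))
    outlier[0, 1] = 1e6
    corrupted = np.vstack([clean, outlier])

    pc_clean = leading_pc(clean)
    pc_corrupted = leading_pc(corrupted)

    print("alignment with true direction, clean data:  %.3f" % abs(pc_clean @ true_direction))
    print("alignment with true direction, one outlier: %.3f" % abs(pc_corrupted @ true_direction))
    # Typically ~1.000 on clean data and ~0.000 after adding one corrupted point:
    # an arbitrarily small fraction of outliers destroys the estimate, which is
    # what a breakdown point of zero means.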
Source Title: IEEE Transactions on Information Theory
URI: http://scholarbank.nus.edu.sg/handle/10635/61041
ISSN: 0018-9448
DOI: 10.1109/TIT.2012.2212415
Appears in Collections: Staff Publications
