Please use this identifier to cite or link to this item: https://doi.org/10.1109/ICDMW.2010.64
DC Field: Value
dc.title: Robust low-rank subspace segmentation with semidefinite guarantees
dc.contributor.author: Ni, Y.
dc.contributor.author: Sun, J.
dc.contributor.author: Yuan, X.
dc.contributor.author: Yan, S.
dc.contributor.author: Cheong, L.-F.
dc.date.accessioned: 2013-07-23T09:30:44Z
dc.date.available: 2013-07-23T09:30:44Z
dc.date.issued: 2010
dc.identifier.citation: Ni, Y., Sun, J., Yuan, X., Yan, S., Cheong, L.-F. (2010). Robust low-rank subspace segmentation with semidefinite guarantees. Proceedings - IEEE International Conference on Data Mining, ICDM: 1179-1188. ScholarBank@NUS Repository. https://doi.org/10.1109/ICDMW.2010.64
dc.identifier.isbn: 9780769542577
dc.identifier.issn: 15504786
dc.identifier.uri: http://scholarbank.nus.edu.sg/handle/10635/43314
dc.description.abstract: Recently there is a line of research work proposing to employ Spectral Clustering (SC) to segment (group) high-dimensional structural data such as those (approximately) lying on subspaces or low-dimensional manifolds. By learning the affinity matrix in the form of sparse reconstruction, techniques proposed in this vein often considerably boost the performance in subspace settings where traditional SC can fail. Despite the success, there are fundamental problems that have been left unsolved: the spectrum property of the learned affinity matrix cannot be gauged in advance, and there is often one ugly symmetrization step that post-processes the affinity for SC input. Hence we advocate enforcing the symmetric positive semidefinite constraint explicitly during learning (Low-Rank Representation with Positive Semidefinite constraint, or LRR-PSD), and show that in fact it can be solved efficiently by an exquisite scheme, instead of via general-purpose SDP solvers that usually scale poorly. We provide rigorous mathematical derivations to show that, in its canonical form, LRR-PSD is equivalent to the recently proposed Low-Rank Representation (LRR) scheme [1], and hence offer theoretical and practical insights into both LRR-PSD and LRR, inviting future research. As for computational cost, our proposal is at most comparable to that of LRR, if not less. We validate our theoretical analysis and optimization scheme by experiments on both synthetic and real data sets. © 2010 IEEE.
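The abstract contrasts two ways of obtaining a symmetric PSD affinity for spectral clustering: an ad-hoc symmetrization post-processing step versus enforcing the PSD constraint directly, which amounts to eigenvalue thresholding. The sketch below is an illustration of those two generic operations in NumPy, not the paper's LRR-PSD algorithm; all function names are hypothetical.

```python
# Illustrative sketch only (assumptions, not the paper's method): symmetrizing
# an affinity matrix, then projecting it onto the PSD cone by clipping
# negative eigenvalues to zero (eigenvalue thresholding).
import numpy as np

def symmetrize(A):
    """The common post-processing step: force symmetry of a learned affinity."""
    return 0.5 * (A + A.T)

def project_psd(S):
    """Project a symmetric matrix onto the PSD cone: eigendecompose and
    threshold the eigenvalues at zero."""
    w, V = np.linalg.eigh(S)          # eigenvalues/eigenvectors of symmetric S
    return (V * np.clip(w, 0.0, None)) @ V.T

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))       # a generic (asymmetric) affinity matrix
P = project_psd(symmetrize(A))
# P is now symmetric with all eigenvalues >= 0, i.e. a valid PSD affinity
```

This projection is the Euclidean-nearest PSD matrix to `symmetrize(A)` in Frobenius norm, which is why eigenvalue thresholding appears as a natural primitive when PSD constraints are enforced during learning.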
dc.description.uri: http://libproxy1.nus.edu.sg/login?url=http://dx.doi.org/10.1109/ICDMW.2010.64
dc.source: Scopus
dc.subject: Affinity matrix learning
dc.subject: Eigenvalue thresholding
dc.subject: Rank minimization
dc.subject: Robust estimation
dc.subject: Spectral clustering
dc.type: Conference Paper
dc.contributor.department: COMPUTATIONAL SCIENCE
dc.contributor.department: INTERACTIVE & DIGITAL MEDIA INSTITUTE
dc.contributor.department: ELECTRICAL & COMPUTER ENGINEERING
dc.description.doi: 10.1109/ICDMW.2010.64
dc.description.sourcetitle: Proceedings - IEEE International Conference on Data Mining, ICDM
dc.description.page: 1179-1188
dc.identifier.isiut: NOT_IN_WOS
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.
