Please use this identifier to cite or link to this item:
https://scholarbank.nus.edu.sg/handle/10635/86063
Title: Provable Subspace Clustering: When LRR meets SSC
Authors: Wang, Y.-X.; Xu, H.; Leng, C.
Issue Date: 2013
Citation: Wang, Y.-X., Xu, H., & Leng, C. (2013). Provable Subspace Clustering: When LRR meets SSC. Advances in Neural Information Processing Systems. ScholarBank@NUS Repository.
Abstract: Sparse Subspace Clustering (SSC) and Low-Rank Representation (LRR) are both considered state-of-the-art methods for subspace clustering. The two methods are fundamentally similar in that both are convex optimizations exploiting the intuition of "Self-Expressiveness". The main difference is that SSC minimizes the vector ℓ1 norm of the representation matrix to induce sparsity, while LRR minimizes the nuclear norm (a.k.a. trace norm) to promote a low-rank structure. Because the representation matrix is often simultaneously sparse and low-rank, we propose a new algorithm, termed Low-Rank Sparse Subspace Clustering (LRSSC), which combines SSC and LRR, and we develop theoretical guarantees for when the algorithm succeeds. The results reveal interesting insights into the strengths and weaknesses of SSC and LRR, and demonstrate how LRSSC can take advantage of both methods in preserving the "Self-Expressiveness Property" and "Graph Connectivity" at the same time.
Source Title: Advances in Neural Information Processing Systems
URI: http://scholarbank.nus.edu.sg/handle/10635/86063
ISSN: 1049-5258
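The combination described in the abstract can be read as minimizing a weighted sum of the nuclear norm and the entrywise ℓ1 norm of the self-expression matrix C. The following is a minimal, hypothetical sketch of that reading in Python with CVXPY and scikit-learn; the noiseless constraint X = XC with a zero diagonal, the trade-off parameter lam, and the spectral-clustering step on |C| + |C|ᵀ are illustrative assumptions, not the authors' reference implementation.

    # Hypothetical sketch of the LRSSC idea from the abstract (not the authors' code):
    #   minimize   ||C||_*  +  lam * ||C||_1
    #   subject to X = X C,  diag(C) = 0
    # followed by spectral clustering on the symmetrized affinity |C| + |C|^T.
    import numpy as np
    import cvxpy as cp
    from sklearn.cluster import SpectralClustering

    def lrssc_sketch(X, lam=1.0, n_clusters=2):
        """X is a d-by-n data matrix whose columns lie near a union of subspaces."""
        n = X.shape[1]
        C = cp.Variable((n, n))
        # Nuclear norm promotes low rank (LRR-like); entrywise l1 promotes sparsity (SSC-like).
        objective = cp.Minimize(cp.normNuc(C) + lam * cp.sum(cp.abs(C)))
        constraints = [X == X @ C, cp.diag(C) == 0]
        cp.Problem(objective, constraints).solve(solver=cp.SCS)
        # Build a symmetric affinity graph from the representation matrix.
        W = np.abs(C.value) + np.abs(C.value).T
        labels = SpectralClustering(n_clusters=n_clusters,
                                    affinity="precomputed").fit_predict(W)
        return labels, C.value

Under this reading, lam interpolates between the two methods: a small lam emphasizes the low-rank (LRR-like) term, while a large lam emphasizes sparsity (SSC-like).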
Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.