Please use this identifier to cite or link to this item: https://doi.org/10.1162/NECO_a_00369
Title: Active subspace: Toward scalable low-rank learning
Authors: Liu, G.
Yan, S. 
Issue Date: 2012
Citation: Liu, G., Yan, S. (2012). Active subspace: Toward scalable low-rank learning. Neural Computation, 24(12), 3371-3394. ScholarBank@NUS Repository. https://doi.org/10.1162/NECO_a_00369
Abstract: We address the scalability issues in low-rank matrix learning problems. Usually these problems resort to solving nuclear norm regularized optimization problems (NNROPs), which often suffer from high computational complexity under existing solvers, especially in large-scale settings. Based on the fact that the optimal solution matrix to an NNROP is often low rank, we revisit the classic mechanism of low-rank matrix factorization, based on which we present an active subspace algorithm for efficiently solving NNROPs by transforming large-scale NNROPs into small-scale problems. The transformation is achieved by factorizing the large solution matrix into the product of a small orthonormal matrix (active subspace) and another small matrix. Although such a transformation generally leads to nonconvex problems, we show that a suboptimal solution can be found by the augmented Lagrange alternating direction method. For the robust PCA (RPCA) (Candès, Li, Ma, & Wright, 2009) problem, a typical example of an NNROP, theoretical results verify the suboptimality of the solution produced by our algorithm. For general NNROPs, we empirically show that our algorithm significantly reduces computational complexity without loss of optimality. © 2012 Massachusetts Institute of Technology.
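Note: to make the abstract's mechanism concrete, here is a minimal NumPy sketch of the idea applied to the RPCA instance min ||X||_* + lam*||E||_1 s.t. D = X + E, with the large variable factored as X = Q @ J, where Q has orthonormal columns (the "active subspace") and J is small, so the expensive singular value thresholding runs on an r x n matrix instead of an m x n one. This is an illustrative assumption modeled on standard inexact augmented-Lagrangian alternating-direction solvers, not the authors' exact algorithm; the names (svt, shrink, active_subspace_rpca), the parameter schedule (mu, rho, mu_max), and the update order are all hypothetical.

import numpy as np

def svt(M, tau):
    # Singular value thresholding: proximal operator of tau * ||.||_*.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ (np.maximum(s - tau, 0.0)[:, None] * Vt)

def shrink(M, tau):
    # Soft thresholding: proximal operator of tau * ||.||_1.
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def active_subspace_rpca(D, r, lam=None, mu=1e-3, rho=1.5,
                         mu_max=1e10, iters=500, tol=1e-7):
    # Sketch of RPCA: min ||X||_* + lam * ||E||_1  s.t.  D = X + E,
    # with X = Q @ J, Q (m x r) orthonormal, J (r x n) small.
    # Parameter schedule is an assumption, not the paper's setting.
    m, n = D.shape
    lam = 1.0 / np.sqrt(max(m, n)) if lam is None else lam
    Q = np.linalg.qr(np.random.default_rng(0).standard_normal((m, r)))[0]
    J = Q.T @ D
    E = np.zeros_like(D)
    Y = np.zeros_like(D)
    for _ in range(iters):
        A = D - E + Y / mu
        # Q-step (orthogonal Procrustes): argmin_{Q^T Q = I} ||A - Q J||_F.
        U, _, Vt = np.linalg.svd(A @ J.T, full_matrices=False)
        Q = U @ Vt
        # J-step: ||A - Q J||_F^2 = ||Q^T A - J||_F^2 + const, and
        # ||Q J||_* = ||J||_* for orthonormal Q, so J = SVT_{1/mu}(Q^T A);
        # this SVT is on an r x n matrix, which is the source of the speedup.
        J = svt(Q.T @ A, 1.0 / mu)
        # E-step: soft-threshold the sparse-error residual.
        E = shrink(D - Q @ J + Y / mu, lam / mu)
        # Dual ascent on the multiplier, then increase the penalty.
        R = D - Q @ J - E
        Y += mu * R
        mu = min(mu * rho, mu_max)
        if np.linalg.norm(R) <= tol * np.linalg.norm(D):
            break
    return Q, J, E

# Toy usage: a low-rank matrix plus sparse corruption; the relative
# recovery error printed at the end should be small if recovery succeeds.
rng = np.random.default_rng(1)
L = rng.standard_normal((300, 8)) @ rng.standard_normal((8, 200))
S = np.where(rng.random((300, 200)) < 0.05,
             10 * rng.standard_normal((300, 200)), 0.0)
Q, J, E = active_subspace_rpca(L + S, r=12)
print(np.linalg.norm(Q @ J - L) / np.linalg.norm(L))

The choice r only needs to upper-bound the solution's rank: since the optimal X is low rank, any sufficiently large r leaves the optimum reachable while keeping the per-iteration SVDs at r x n scale.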
Source Title: Neural Computation
URI: http://scholarbank.nus.edu.sg/handle/10635/54880
ISSN: 0899-7667
DOI: 10.1162/NECO_a_00369
Appears in Collections: Staff Publications

