Please use this identifier to cite or link to this item:
https://doi.org/10.1007/s10589-009-9251-8
DC Field | Value | |
---|---|---|
dc.title | A coordinate gradient descent method for l1-regularized convex minimization | |
dc.contributor.author | Yun, S. | |
dc.contributor.author | Toh, K.-C. | |
dc.date.accessioned | 2014-12-02T08:39:20Z | |
dc.date.available | 2014-12-02T08:39:20Z | |
dc.date.issued | 2011-03 | |
dc.identifier.citation | Yun, S., Toh, K.-C. (2011-03). A coordinate gradient descent method for l1-regularized convex minimization. Computational Optimization and Applications 48 (2) : 273-307. ScholarBank@NUS Repository. https://doi.org/10.1007/s10589-009-9251-8 | |
dc.identifier.issn | 0926-6003 | |
dc.identifier.uri | http://scholarbank.nus.edu.sg/handle/10635/114661 | |
dc.description.abstract | In applications such as signal processing and statistics, many problems involve finding sparse solutions to under-determined linear systems of equations. These problems can be formulated as structured nonsmooth optimization problems, i.e., l1-regularized linear least squares problems. In this paper, we propose a block coordinate gradient descent method (abbreviated as CGD) to solve the more general l1-regularized convex minimization problem, i.e., the problem of minimizing an l1-regularized convex smooth function. We establish a Q-linear convergence rate for our method when the coordinate block is chosen by a Gauss-Southwell-type rule to ensure sufficient descent. We propose efficient implementations of the CGD method and report numerical results for solving large-scale l1-regularized linear least squares problems arising in compressed sensing and image deconvolution, as well as large-scale l1-regularized logistic regression problems for feature selection in data classification. Comparison with several state-of-the-art algorithms specifically designed for solving large-scale l1-regularized linear least squares or logistic regression problems suggests that an efficiently implemented CGD method may outperform these algorithms, even though the CGD method is not specifically designed just to solve these special classes of problems. © Springer Science+Business Media, LLC 2009. | |
dc.description.uri | http://libproxy1.nus.edu.sg/login?url=http://dx.doi.org/10.1007/s10589-009-9251-8 | |
dc.source | Scopus | |
dc.subject | Compressed sensing | |
dc.subject | Convex optimization | |
dc.subject | Coordinate gradient descent | |
dc.subject | Image deconvolution | |
dc.subject | L1-regularization | |
dc.subject | Linear least squares | |
dc.subject | Logistic regression | |
dc.subject | Q-linear convergence | |
dc.type | Article | |
dc.contributor.department | SINGAPORE-MIT ALLIANCE | |
dc.description.doi | 10.1007/s10589-009-9251-8 | |
dc.description.sourcetitle | Computational Optimization and Applications | |
dc.description.volume | 48 | |
dc.description.issue | 2 | |
dc.description.page | 273-307 | |
dc.description.coden | CPPPE | |
dc.identifier.isiut | 000288551600006 | |
Appears in Collections: | Staff Publications |
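
For readers unfamiliar with the method summarized in the abstract, the following is a minimal illustrative sketch of coordinate descent with soft-thresholding applied to an l1-regularized least squares (lasso) problem. It is not the paper's CGD implementation: the function names, the simple Gauss-Southwell-type coordinate choice, and the stopping test are assumptions made for this sketch only.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the prox of t * |.|_1 (elementwise)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def cgd_lasso(A, b, lam, n_iters=500):
    """
    Illustrative coordinate descent for
        min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
    The coordinate is picked by a simple Gauss-Southwell-type rule
    (largest proposed coordinate change); this is a sketch, not the
    CGD method of the paper.
    """
    m, n = A.shape
    x = np.zeros(n)
    col_sq = (A ** 2).sum(axis=0)        # per-coordinate curvature of the quadratic
    residual = A @ x - b                 # equals -b at the start
    for _ in range(n_iters):
        grad = A.T @ residual            # gradient of the smooth part
        # Coordinate-wise minimizers of the separable model (exact for a quadratic)
        z = x - grad / col_sq
        x_new = soft_threshold(z, lam / col_sq)
        # Gauss-Southwell-type choice: coordinate with the largest proposed change
        j = int(np.argmax(np.abs(x_new - x)))
        delta = x_new[j] - x[j]
        if abs(delta) < 1e-12:           # no coordinate improves: stop
            break
        residual += delta * A[:, j]      # cheap rank-1 update of the residual
        x[j] = x_new[j]
    return x

# Toy usage: recover a sparse vector from an under-determined system
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 200))
x_true = np.zeros(200)
x_true[[3, 17, 42]] = [1.5, -2.0, 0.8]
b = A @ x_true
x_hat = cgd_lasso(A, b, lam=0.1)
print("nonzero coordinates found:", np.flatnonzero(np.abs(x_hat) > 1e-3))
```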