Please use this identifier to cite or link to this item:
https://doi.org/10.1137/110847081
DC Field | Value
---|---
dc.title | An inexact accelerated proximal gradient method for large scale linearly constrained convex SDP
dc.contributor.author | Jiang, K.
dc.contributor.author | Sun, D.
dc.contributor.author | Toh, K.-C.
dc.date.accessioned | 2014-10-28T02:30:22Z
dc.date.available | 2014-10-28T02:30:22Z
dc.date.issued | 2012
dc.identifier.citation | Jiang, K., Sun, D., Toh, K.-C. (2012). An inexact accelerated proximal gradient method for large scale linearly constrained convex SDP. SIAM Journal on Optimization 22 (3): 1042-1064. ScholarBank@NUS Repository. https://doi.org/10.1137/110847081
dc.identifier.issn | 1052-6234
dc.identifier.uri | http://scholarbank.nus.edu.sg/handle/10635/102841
dc.description.abstract | The accelerated proximal gradient (APG) method, first proposed by Nesterov for minimizing smooth convex functions, later extended by Beck and Teboulle to composite convex objective functions, and studied in a unifying manner by Tseng, has proven to be highly efficient in solving some classes of large scale structured convex optimization (possibly nonsmooth) problems, including nuclear norm minimization problems in matrix completion and ℓ1 minimization problems in compressed sensing. The method has superior worst-case iteration complexity over the classical projected gradient method and usually has good practical performance on problems with appropriate structures. In this paper, we extend the APG method to the inexact setting, where the subproblem in each iteration is solved only approximately, and show that it enjoys the same worst-case iteration complexity as the exact counterpart if the subproblems are progressively solved to sufficient accuracy. We apply our inexact APG method to solve large scale convex quadratic semidefinite programming (QSDP) problems of the form min{(1/2)〈x, Q(x)〉 + 〈c, x〉 | A(x) = b, x ⪰ 0}, where Q, A are given linear maps and b, c are given data. The subproblem in each iteration is solved by a semismooth Newton-CG (SSNCG) method with warm-start using the iterate from the previous iteration. Our APG-SSNCG method is demonstrated to be efficient for QSDP problems whose positive semidefinite linear maps Q are highly ill-conditioned or rank deficient. © 2012 Society for Industrial and Applied Mathematics.
dc.description.uri | http://libproxy1.nus.edu.sg/login?url=http://dx.doi.org/10.1137/110847081
dc.source | Scopus
dc.subject | Convex quadratic SDP
dc.subject | Inexact accelerated proximal gradient
dc.subject | Semismooth Newton-CG
dc.subject | Structured convex optimization
dc.type | Article
dc.contributor.department | MATHEMATICS
dc.description.doi | 10.1137/110847081
dc.description.sourcetitle | SIAM Journal on Optimization
dc.description.volume | 22
dc.description.issue | 3
dc.description.page | 1042-1064
dc.identifier.isiut | 000310214800016
Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.
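
For readers unfamiliar with the accelerated proximal gradient scheme described in the abstract, the following is a minimal illustrative sketch of a generic (exact) APG/FISTA iteration applied to an ℓ1-regularized least-squares problem, one of the structured problems the abstract mentions. It is not the paper's inexact APG-SSNCG method for QSDP; the function names, the problem instance, and all parameter choices are assumptions made purely for illustration.

```python
# Illustrative sketch only: a generic (exact) accelerated proximal gradient /
# FISTA iteration for min_x f(x) + g(x), with f(x) = 0.5*||Ax - b||^2 smooth
# and g(x) = lam * ||x||_1, as in the l1 problems mentioned in the abstract.
# This is NOT the paper's inexact APG-SSNCG method for QSDP.
import numpy as np

def soft_threshold(z, tau):
    """Proximal map of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def apg_lasso(A, b, lam, num_iters=500):
    """Accelerated proximal gradient for min 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad f
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(num_iters):
        grad = A.T @ (A @ y - b)           # gradient of the smooth part at y
        x_new = soft_threshold(y - grad / L, lam / L)    # proximal step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)    # Nesterov extrapolation
        x, t = x_new, t_new
    return x

# Tiny usage example on synthetic data (assumed, for illustration only).
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
x_hat = apg_lasso(A, b, lam=0.1)
```

In the paper's setting the proximal subproblem has no closed form and is solved only approximately by a semismooth Newton-CG method; the soft-thresholding step above stands in for that subproblem solve in this simplified sketch.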