Please use this identifier to cite or link to this item:
https://doi.org/10.1162/089976603762553013
Title: SMO algorithm for least-squares SVM formulations
Authors: Keerthi, S.S.; Shevade, S.K.
Issue Date: Feb-2003
Citation: Keerthi, S.S., Shevade, S.K. (2003-02). SMO algorithm for least-squares SVM formulations. Neural Computation 15 (2): 487-507. ScholarBank@NUS Repository. https://doi.org/10.1162/089976603762553013
Abstract: This article extends the well-known SMO algorithm of support vector machines (SVMs) to least-squares SVM formulations that include LS-SVM classification, kernel ridge regression, and a particular form of regularized kernel Fisher discriminant. The algorithm is shown to be asymptotically convergent. It is also extremely easy to implement. Computational experiments show that the algorithm is fast and scales efficiently (quadratically) as a function of the number of examples.
Source Title: Neural Computation
URI: http://scholarbank.nus.edu.sg/handle/10635/61328
ISSN: 0899-7667
DOI: 10.1162/089976603762553013
Appears in Collections: Staff Publications
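The abstract refers to an SMO-style (coordinate-wise) solver for least-squares SVM formulations. The sketch below is not the paper's algorithm; it is a minimal illustration, in Python, of the same idea applied to the simplest formulation mentioned, kernel ridge regression without a bias term, where each step optimizes a single dual variable in closed form. All names (smo_krr, rbf_kernel) and parameter values (gamma, C, tol) are illustrative assumptions, not taken from the paper.

import numpy as np

def rbf_kernel(X, gamma=0.5):
    # Gram matrix for the Gaussian (RBF) kernel; any PSD kernel would do.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def smo_krr(K, y, C=10.0, tol=1e-6, max_steps=5000):
    # Coordinate-wise solver for the kernel ridge regression dual:
    #   minimize 0.5*a^T K a + (1/(2C))*a^T a - y^T a.
    # Each step picks the variable with the largest gradient magnitude
    # and minimizes the objective over that variable analytically.
    n = len(y)
    a = np.zeros(n)
    f = np.zeros(n)                      # cached expansion f_i = sum_j K_ij * a_j
    for _ in range(max_steps):
        grad = f + a / C - y             # gradient of the dual objective
        i = int(np.argmax(np.abs(grad)))
        if abs(grad[i]) < tol:
            break                        # optimality conditions satisfied
        delta = -grad[i] / (K[i, i] + 1.0 / C)   # exact one-variable minimizer
        a[i] += delta
        f += delta * K[:, i]             # keep the cached expansion consistent
    return a

# Example usage on a toy 1-D regression problem (hypothetical data).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.linspace(-3, 3, 60).reshape(-1, 1)
    y = np.sin(X).ravel() + 0.1 * rng.standard_normal(60)
    K = rbf_kernel(X)
    alpha = smo_krr(K, y)
    pred = K @ alpha
    print("training RMSE:", np.sqrt(np.mean((pred - y) ** 2)))

The first-order conditions of this objective give (K + I/C) a = y, i.e. ordinary kernel ridge regression with ridge parameter 1/C. The LS-SVM classification formulation with a bias term adds an equality constraint on the dual variables, which, roughly speaking, is why an SMO step there updates two variables at a time, as in the original SVM SMO algorithm.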