Please use this identifier to cite or link to this item: https://doi.org/10.1162/089976603762553013
Title: SMO algorithm for least-squares SVM formulations
Authors: Keerthi, S.S. 
Shevade, S.K.
Issue Date: Feb-2003
Source: Keerthi, S.S., Shevade, S.K. (2003-02). SMO algorithm for least-squares SVM formulations. Neural Computation 15 (2) : 487-507. ScholarBank@NUS Repository. https://doi.org/10.1162/089976603762553013
Abstract: This article extends the well-known SMO algorithm for support vector machines (SVMs) to least-squares SVM formulations that include LS-SVM classification, kernel ridge regression, and a particular form of regularized kernel Fisher discriminant. The algorithm is shown to be asymptotically convergent. It is also extremely easy to implement. Computational experiments show that the algorithm is fast and scales efficiently (quadratically) as a function of the number of examples.
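To give a concrete (but simplified) feel for the kind of sequential update the abstract refers to, here is a minimal sketch in Python. It is not the paper's algorithm: it assumes the unconstrained dual of kernel ridge regression, (1/2) a'(K + lam I)a - y'a, and updates one dual variable at a time by exact line search; the LS-SVM formulations treated in the paper carry a bias term with an equality constraint and therefore update pairs of variables per step. The function names (rbf_kernel, smo_like_krr) and all parameter values are illustrative.

import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix of a Gaussian (RBF) kernel, k(x, z) = exp(-gamma * ||x - z||^2).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def smo_like_krr(K, y, lam=0.1, tol=1e-6, max_updates=200000):
    # Greedy single-coordinate descent on the (unconstrained) kernel ridge
    # regression dual 0.5 * a^T (K + lam*I) a - y^T a, whose minimizer is
    # a = (K + lam*I)^{-1} y.  Illustrative only, not the paper's method.
    n = len(y)
    alpha = np.zeros(n)
    grad = -y.astype(float)            # gradient (K + lam*I) a - y at a = 0
    curv = np.diag(K) + lam            # per-coordinate second derivative
    for _ in range(max_updates):
        i = int(np.argmax(np.abs(grad)))   # most-violating coordinate
        if abs(grad[i]) < tol:
            break
        delta = -grad[i] / curv[i]         # exact one-dimensional minimizer
        alpha[i] += delta
        grad += delta * K[:, i]            # keep the gradient up to date
        grad[i] += delta * lam
    return alpha

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
    K = rbf_kernel(X, gamma=0.5)
    a = smo_like_krr(K, y, lam=0.1)
    a_exact = np.linalg.solve(K + 0.1 * np.eye(len(y)), y)
    print("max |a - a_exact| =", np.max(np.abs(a - a_exact)))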
Source Title: Neural Computation
URI: http://scholarbank.nus.edu.sg/handle/10635/61328
ISSN: 0899-7667
DOI: 10.1162/089976603762553013
Appears in Collections: Staff Publications

Citations: 88 (Scopus™, checked on Dec 7, 2017); 60 (Web of Science™, checked on Nov 22, 2017)
Page view(s): 38 (checked on Dec 11, 2017)
