Please use this identifier to cite or link to this item: https://doi.org/10.1109/TNN.2006.875989
Title: Parallel sequential minimal optimization for the training of support vector machines
Authors: Cao, L.J.
Keerthi, S.S. 
Ong, C.-J. 
Zhang, J.Q.
Periyathamby, U.
Fu, X.J.
Lee, H.P.
Keywords: Message passing interface (MPI)
Parallel algorithm
Sequential minimal optimization (SMO)
Support vector machine (SVM)
Issue Date: Jul-2006
Source: Cao, L.J., Keerthi, S.S., Ong, C.-J., Zhang, J.Q., Periyathamby, U., Fu, X.J., Lee, H.P. (2006-07). Parallel sequential minimal optimization for the training of support vector machines. IEEE Transactions on Neural Networks 17 (4) : 1039-1049. ScholarBank@NUS Repository. https://doi.org/10.1109/TNN.2006.875989
Abstract: Sequential minimal optimization (SMO) is a popular algorithm for training support vector machines (SVMs), but it still requires a large amount of computation time for solving large-scale problems. This paper proposes a parallel implementation of SMO for training SVMs. The parallel SMO is developed using the message passing interface (MPI). Specifically, the parallel SMO first partitions the entire training data set into smaller subsets and then runs multiple CPU processors simultaneously, each dealing with one of the partitioned subsets. Experiments show a great speedup on the adult data set and the Modified National Institute of Standards and Technology (MNIST) data set when many processors are used, as well as satisfactory results on the Web data set. © 2006 IEEE.
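
The scheme the abstract describes reduces to a simple communication pattern: each processor scans only its own partition of the training data for the extrema of the SMO optimality criteria, and one global reduction per iteration combines the per-partition results. The following is a minimal sketch of that pattern in Python with mpi4py; it is an illustration under stated assumptions, not the authors' implementation, and the names (subset, local_b_up, local_b_low) and the toy stand-in data are hypothetical.

    # Minimal sketch of the partition-and-reduce pattern (assumptions:
    # mpi4py is available; random values stand in for the SMO error
    # cache terms from which b_up and b_low are derived).
    # Run with, e.g.:  mpiexec -n 4 python parallel_smo_sketch.py
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    # Rank 0 partitions the full training set; each rank gets one subset.
    chunks = None
    if rank == 0:
        full = np.random.default_rng(0).normal(size=10_000)
        chunks = np.array_split(full, size)
    subset = comm.scatter(chunks, root=0)

    # Each processor computes the extrema of its local optimality criteria.
    local_b_up = float(subset.min())   # stand-in for the local b_up value
    local_b_low = float(subset.max())  # stand-in for the local b_low value

    # One global reduction per iteration replaces a serial scan of all data.
    b_up = comm.allreduce(local_b_up, op=MPI.MIN)
    b_low = comm.allreduce(local_b_low, op=MPI.MAX)

    tol = 1e-3
    if rank == 0:
        # SMO-style stopping test: optimality once b_low <= b_up + 2*tol.
        print(f"b_up={b_up:.4f}, b_low={b_low:.4f}, "
              f"converged={b_low <= b_up + 2 * tol}")

Note the design point this sketch highlights: per iteration only a few scalars cross the network while the expensive per-example work stays local to each processor, which is consistent with the speedup the abstract reports as more processors are added.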
Source Title: IEEE Transactions on Neural Networks
URI: http://scholarbank.nus.edu.sg/handle/10635/61045
ISSN: 1045-9227
DOI: 10.1109/TNN.2006.875989
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.

Scopus™ citations: 92 (checked on Dec 13, 2017)
Web of Science™ citations: 69 (checked on Dec 13, 2017)
Page view(s): 13 (checked on Dec 9, 2017)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.