Please use this identifier to cite or link to this item: https://doi.org/10.1016/S0925-2312(02)00601-X
Title: Evaluation of simple performance measures for tuning SVM hyperparameters
Authors: Duan, K.
Keerthi, S.S. 
Poo, A.N. 
Keywords: Generalization error bound
Model selection
SVM
Issue Date: Apr-2003
Source: Duan, K., Keerthi, S.S., Poo, A.N. (2003-04). Evaluation of simple performance measures for tuning SVM hyperparameters. Neurocomputing 51 : 41-59. ScholarBank@NUS Repository. https://doi.org/10.1016/S0925-2312(02)00601-X
Abstract: Choosing optimal hyperparameter values for support vector machines is an important step in SVM design. This is usually done by minimizing either an estimate of generalization error or some other related performance measure. In this paper, we empirically study the usefulness of several simple performance measures that are inexpensive to compute (in the sense that they do not require expensive matrix operations involving the kernel matrix). The results point out which of these measures are adequate functionals for tuning SVM hyperparameters. For SVMs with L1 soft-margin formulation, none of the simple measures yields a performance uniformly as good as k-fold cross validation; Joachims' Xi-Alpha bound and the GACV of Wahba et al. come next and perform reasonably well. For SVMs with L2 soft-margin formulation, the radius margin bound gives a very good prediction of optimal hyperparameter values. © 2002 Elsevier Science B.V. All rights reserved.
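The baseline criterion mentioned in the abstract, k-fold cross validation over a grid of hyperparameter values, can be illustrated with a minimal sketch. The snippet below assumes scikit-learn's SVC with an RBF kernel and uses a hypothetical dataset and grid; it is not the experimental setup of the paper, only an illustration of grid search with k-fold cross validation as the tuning functional.

```python
# Minimal sketch: tuning SVM hyperparameters (C, gamma) by k-fold
# cross validation, the baseline measure discussed in the abstract.
# The dataset, kernel, and grid values are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, StratifiedKFold
from sklearn.svm import SVC

# Synthetic binary classification data (stand-in for a real benchmark).
X, y = make_classification(n_samples=300, n_features=20, random_state=0)

# Hypothetical logarithmic grid over the regularization parameter C
# and the RBF kernel width gamma.
param_grid = {
    "C": np.logspace(-2, 3, 6),
    "gamma": np.logspace(-4, 1, 6),
}

# 5-fold cross-validation accuracy serves as the model-selection criterion.
search = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid,
    cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=0),
    scoring="accuracy",
)
search.fit(X, y)

print("best hyperparameters:", search.best_params_)
print("cross-validation accuracy:", search.best_score_)
```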
Source Title: Neurocomputing
URI: http://scholarbank.nus.edu.sg/handle/10635/60217
ISSN: 0925-2312
DOI: 10.1016/S0925-2312(02)00601-X
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.

SCOPUS™ Citations: 350 (checked on Dec 7, 2017)
Web of Science™ Citations: 282 (checked on Nov 23, 2017)
