Please use this identifier to cite or link to this item: https://doi.org/10.1109/TNN.2003.820830
Title: Bayesian support vector regression using a unified loss function
Authors: Chu, W.
Keerthi, S.S. 
Ong, C.J. 
Keywords: Automatic relevance determination
Bayesian inference
Gaussian processes
Model selection
Nonquadratic loss function
Support vector regression
Issue Date: Jan-2004
Citation: Chu, W., Keerthi, S.S., Ong, C.J. (2004-01). Bayesian support vector regression using a unified loss function. IEEE Transactions on Neural Networks 15 (1) : 29-44. ScholarBank@NUS Repository. https://doi.org/10.1109/TNN.2003.820830
Abstract: In this paper, we use a unified loss function, called the soft insensitive loss function, for Bayesian support vector regression. We follow standard Gaussian processes for regression to set up the Bayesian framework, in which the unified loss function is used in the likelihood evaluation. Under this framework, the maximum a posteriori estimate of the function values corresponds to the solution of an extended support vector regression problem. The overall approach retains the merits of support vector regression, such as convex quadratic programming and sparsity in the solution representation, while also gaining the advantages of Bayesian methods, namely model adaptation and error bars on predictions. Experimental results on simulated and real-world data sets indicate that the approach works well even on large data sets.
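
The soft insensitive loss function described in the abstract unifies the epsilon-insensitive loss of standard SVR with a smooth quadratic transition. The sketch below is a minimal illustration of such a piecewise loss; the parameter names `eps` and `beta` and their default values are illustrative, not taken from the paper's experiments.

```python
def silf(delta, eps=0.1, beta=0.3):
    """Soft insensitive loss on a residual `delta`.

    Zero inside the inner insensitive zone, quadratic in a smoothing
    band of half-width beta*eps around the boundary, and linear beyond
    it, so the function is continuously differentiable everywhere.
    """
    a = abs(delta)
    if a <= (1 - beta) * eps:
        return 0.0                                    # insensitive zone
    if a <= (1 + beta) * eps:
        return (a - (1 - beta) * eps) ** 2 / (4 * beta * eps)  # quadratic band
    return a - eps                                    # linear tail


# Small residuals incur no loss; large ones grow linearly, as in SVR.
print(silf(0.05))  # inside the insensitive zone -> 0.0
print(silf(0.5))   # linear tail -> 0.4
```

With `beta -> 0` the quadratic band vanishes and the loss reduces to the usual epsilon-insensitive loss; with `eps` small and `beta` large it approaches a Huber-style loss, which is what makes a single likelihood cover both regimes.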
Source Title: IEEE Transactions on Neural Networks
URI: http://scholarbank.nus.edu.sg/handle/10635/59618
ISSN: 1045-9227
DOI: 10.1109/TNN.2003.820830
Appears in Collections:Staff Publications

Files in This Item:
There are no files associated with this item.

Scopus™ Citations: 74 (checked on Jul 11, 2018)

Web of Science™ Citations: 61 (checked on Jun 20, 2018)

Page view(s): 31 (checked on Jun 1, 2018)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.