Title: Bayesian support vector regression using a unified loss function
Authors: Chu, W.
Keerthi, S.S. 
Ong, C.J. 
Keywords: Automatic relevance determination
Bayesian inference
Gaussian processes
Model selection
Nonquadratic loss function
Support vector regression
Issue Date: Jan-2004
Citation: Chu, W., Keerthi, S.S., Ong, C.J. (2004-01). Bayesian support vector regression using a unified loss function. IEEE Transactions on Neural Networks 15 (1) : 29-44. ScholarBank@NUS Repository.
Abstract: In this paper, we use a unified loss function, called the soft insensitive loss function, for Bayesian support vector regression. We follow standard Gaussian processes for regression to set up the Bayesian framework, in which the unified loss function is used in the likelihood evaluation. Under this framework, the maximum a posteriori estimate of the function values corresponds to the solution of an extended support vector regression problem. The overall approach retains the merits of support vector regression, such as a convex quadratic programming formulation and sparsity in the solution representation. It also offers the advantages of Bayesian methods, such as model adaptation and error bars on its predictions. Experimental results on simulated and real-world data sets indicate that the approach works well even on large data sets.
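The soft insensitive loss mentioned in the abstract blends the epsilon-insensitive loss of standard support vector regression with a quadratic segment, which smooths the likelihood for Bayesian inference. A minimal sketch of such a loss (parameter names `eps` and `beta` are chosen here for illustration): it is zero inside an inner tube of half-width (1-beta)*eps, quadratic in a transition band, and linear beyond it.

```python
def silf(delta, eps=0.1, beta=0.5):
    """Soft insensitive loss sketch: zero inside the inner tube,
    quadratic in a band of width 2*beta*eps, linear outside it.
    (eps and beta defaults are illustrative, not from the paper.)"""
    a = abs(delta)
    if a <= (1 - beta) * eps:
        return 0.0                                    # flat "insensitive" zone
    if a <= (1 + beta) * eps:
        return (a - (1 - beta) * eps) ** 2 / (4 * beta * eps)  # smooth quadratic band
    return a - eps                                    # linear tail, as in epsilon-insensitive loss
```

As beta shrinks toward 0 the transition band vanishes and the function approaches the plain epsilon-insensitive loss; the quadratic band is what makes the negative log-likelihood differentiable, which the Bayesian treatment relies on.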
Source Title: IEEE Transactions on Neural Networks
ISSN: 1045-9227
DOI: 10.1109/TNN.2003.820830
Appears in Collections: Staff Publications