Please use this identifier to cite or link to this item: http://scholarbank.nus.edu.sg/handle/10635/36112
Title: Kernel methods for the incorporation of prior-knowledge into support vector machines
Authors: VEILLARD ANTOINE PAUL MITSUMASA
Keywords: machine learning, support vector machine, kernel methods, prior-knowledge, breast cancer
Issue Date: 16-Aug-2012
Source: VEILLARD ANTOINE PAUL MITSUMASA (2012-08-16). Kernel methods for the incorporation of prior-knowledge into support vector machines. ScholarBank@NUS Repository.
Abstract: SVMs are a class of state-of-the-art supervised learning algorithms implementing the structural risk minimization principle first proposed by the mathematician Vladimir N. Vapnik. In combination with the general-purpose RBF kernel, they have been successfully applied to many complex, real-life problems. However, the amount of training data required can be very high, making SVMs impractical in many situations. Often, prior knowledge about the task is available and could be used together with labeled data for training. This requires specific methods, since by design the SVM takes only labeled data points as input. The knowledge-enhanced RBF (KE-RBF) framework is a set of original kernel methods for incorporating prior knowledge into SVMs. It comprises three new kernels based on transformations of the widely used RBF kernel, and provides systematic methods for incorporating problem-specific properties while retaining the versatility that makes the RBF kernel popular.
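To make the abstract's setting concrete, the following is a minimal sketch of the baseline the thesis builds on: the Gaussian RBF kernel, plus one illustrative (hypothetical, not from the thesis) way of folding prior knowledge into it by reweighting features according to their known relevance. The actual KE-RBF kernels are defined in the thesis itself; this only shows the general shape of such a kernel transformation.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian RBF kernel: k(x, y) = exp(-gamma * ||x - y||^2)
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def knowledge_weighted_rbf(X, Y, gamma=1.0, weights=None):
    # Hypothetical example of a prior-knowledge transformation:
    # rescale each feature by a known relevance weight before applying
    # the RBF kernel. Rescaling the input space this way yields another
    # valid (positive semi-definite) kernel.
    if weights is not None:
        w = np.sqrt(np.asarray(weights, dtype=float))
        X, Y = X * w, Y * w
    return rbf_kernel(X, Y, gamma)

X = np.array([[0.0, 0.0], [1.0, 1.0]])
K = rbf_kernel(X, X, gamma=0.5)
# diagonal entries are exp(0) = 1; the off-diagonal entry is
# exp(-0.5 * ||[0,0] - [1,1]||^2) = exp(-1)
```

Such a precomputed kernel matrix can be passed directly to a kernel SVM solver; the point of the KE-RBF framework is that the transformation itself, rather than extra labeled data, carries the task-specific knowledge.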
URI: http://scholarbank.nus.edu.sg/handle/10635/36112
Appears in Collections: Ph.D Theses (Open)

Files in This Item:
full_thesis.pdf (22.89 MB, Adobe PDF, Open Access)

Page view(s): 183 (checked on Dec 11, 2017)
Download(s): 108 (checked on Dec 11, 2017)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.