Please use this identifier to cite or link to this item: https://doi.org/10.1016/j.neucom.2010.11.001
Title: Non-uniform multiple kernel learning with cluster-based gating functions
Authors: Mu, Y. 
Zhou, B.
Keywords: Graph embedding
Kernel based learning
Multi-kernel learning
Issue Date: Mar-2011
Citation: Mu, Y., Zhou, B. (2011-03). Non-uniform multiple kernel learning with cluster-based gating functions. Neurocomputing 74 (7) : 1095-1101. ScholarBank@NUS Repository. https://doi.org/10.1016/j.neucom.2010.11.001
Abstract: Recently, multiple kernel learning (MKL) has gained increasing attention due to its empirical superiority over traditional single-kernel methods. However, most state-of-the-art MKL methods are "uniform" in the sense that the relative weights of the kernels are fixed across all data. Here we propose a "non-uniform" MKL method with a data-dependent gating mechanism, i.e., the kernel weights are determined adaptively for each sample. We apply a soft clustering algorithm and then tune the weights for each cluster under the graph embedding (GE) framework. The idea of exploiting cluster structure rests on the observation that data from the same cluster tend to behave consistently, which increases robustness to noise and yields more reliable estimates. Moreover, out-of-sample data are handled with little computational cost: their implicit RKHS representations are modulated by the posterior probabilities of the clusters. Quantitative comparisons between the proposed method and several representative MKL methods are conducted on both synthetic and widely used public data sets. The experimental results validate its superiority. © 2010 Elsevier B.V.
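The gating mechanism sketched in the abstract can be illustrated in a few lines: fit a soft clustering model, assign each cluster its own kernel weights, and gate each sample's weights by its cluster posteriors. A minimal sketch follows; the cluster count, kernel parameters, and the `beta` weight matrix are illustrative assumptions (in the paper the per-cluster weights would be learned under the graph embedding framework, not fixed by hand).

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2))
X[30:] += 3.0  # two loose clusters in the toy data

# Base kernels K_m evaluated on the training samples
kernels = [rbf_kernel(X, gamma=0.5), polynomial_kernel(X, degree=2)]

# Soft clustering: posterior p(c | x_i) for each sample
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
post = gmm.predict_proba(X)                       # shape (n, C)

# Hypothetical per-cluster kernel weights beta[c, m]; the paper learns
# these under the GE framework -- fixed here purely for illustration
beta = np.array([[0.8, 0.2],
                 [0.3, 0.7]])

# Sample-wise gating: eta[i, m] = sum_c p(c | x_i) * beta[c, m].
# Out-of-sample points get weights the same way, via their posteriors.
eta = post @ beta                                 # shape (n, M)

# Non-uniform combined kernel: K[i, j] = sum_m eta[i, m] eta[j, m] K_m[i, j]
K = sum(np.outer(eta[:, m], eta[:, m]) * kernels[m]
        for m in range(len(kernels)))
```

Because each base kernel is scaled by the rank-one factor `eta[:, m] eta[:, m]^T`, the combined `K` remains symmetric positive semidefinite, so it can be plugged into any standard kernel machine.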
Source Title: Neurocomputing
URI: http://scholarbank.nus.edu.sg/handle/10635/56828
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2010.11.001
Appears in Collections:Staff Publications
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.