Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/78215
Title: Learning with invariance via linear functionals on reproducing kernel Hilbert space
Authors: Zhang, X.
Lee, W.S. 
Teh, Y.W.
Issue Date: 2013
Citation: Zhang, X., Lee, W.S., Teh, Y.W. (2013). Learning with invariance via linear functionals on reproducing kernel Hilbert space. Advances in Neural Information Processing Systems. ScholarBank@NUS Repository.
Abstract: Incorporating invariance information is important for many learning problems. To exploit invariances, most existing methods resort to approximations that either lead to expensive optimization problems such as semi-definite programming, or rely on separation oracles to retain tractability. Some methods further limit the space of functions and settle for non-convex models. In this paper, we propose a framework for learning in reproducing kernel Hilbert spaces (RKHS) using local invariances that explicitly characterize the behavior of the target function around data instances. These invariances are compactly encoded as linear functionals whose values are penalized by a loss function. Based on a representer theorem that we establish, our formulation can be efficiently optimized via a convex program. For the representer theorem to hold, the linear functionals are required to be bounded in the RKHS, and we show that this is true for a variety of commonly used RKHSs and invariances. Experiments on learning with unlabeled data and transform invariances show that the proposed method yields better or similar results compared with the state of the art.
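The abstract's idea can be illustrated with a minimal sketch, not taken from the paper itself: encode a local invariance as a bounded linear functional (here, a derivative-evaluation functional L_j f = f'(z_j) in a Gaussian-kernel RKHS), and, following the representer-theorem structure the abstract describes, expand the solution over the kernel sections at the labeled points plus the representers of those functionals, so the penalized objective reduces to a convex (linear-least-squares) problem. All point locations, kernel width, and penalty weights below are illustrative assumptions.

```python
import numpy as np

def gauss_k(a, b, s):
    # Gaussian kernel k(a, b) = exp(-(a - b)^2 / (2 s^2)); also return the
    # pairwise differences, which the derivative formulas below reuse.
    d = a[:, None] - b[None, :]
    return np.exp(-d**2 / (2 * s**2)), d

def gram_blocks(x, z, s):
    # Full Gram matrix over the augmented basis:
    #   k(x_i, .)            for labeled points x_i
    #   d/da k(a, .)|_{a=z_j} (representer of the derivative functional at z_j)
    Kxx, _ = gauss_k(x, x, s)
    Kzx, dzx = gauss_k(z, x, s)
    Kzz, dzz = gauss_k(z, z, s)
    A = -dzx / s**2 * Kzx                   # <k(x_i,.), representer at z_j>
    B = (1.0 / s**2 - dzz**2 / s**4) * Kzz  # mixed second derivative of k
    return np.vstack([np.hstack([Kxx, A.T]),
                      np.hstack([A, B])])

def fit(x, y, z, s=0.6, lam=1e-4, mu=10.0):
    # Minimize  ||E c - y||^2 + mu ||D c||^2 + lam c' G c, where
    # E evaluates f at the labeled x and D evaluates the invariance
    # functionals f'(z_j); both are row blocks of the Gram matrix G.
    n = len(x)
    G = gram_blocks(x, z, s)
    E, D = G[:n, :], G[n:, :]
    M = E.T @ E + mu * (D.T @ D) + lam * G + 1e-10 * np.eye(G.shape[0])
    c = np.linalg.solve(M, E.T @ y)
    return c, E, D
```

With a target consistent with the invariance (e.g. y = x^2 and a zero-slope constraint at z = 0), the fitted function both matches the labels and drives the penalized functional values toward zero; the whole problem is a single linear solve, in line with the convexity claim in the abstract.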
Source Title: Advances in Neural Information Processing Systems
URI: http://scholarbank.nus.edu.sg/handle/10635/78215
ISSN: 10495258
Appears in Collections:Staff Publications

Files in This Item:
There are no files associated with this item.

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.