DC Field: Value

dc.title: Gradient-based kernel method for feature extraction and variable selection
dc.contributor.author: Fukumizu, K.
dc.contributor.author: Leng, C.
dc.identifier.citation: Fukumizu, K., Leng, C. (2012). Gradient-based kernel method for feature extraction and variable selection. Advances in Neural Information Processing Systems 3: 2114-2122. ScholarBank@NUS Repository.
dc.description.abstract: We propose a novel kernel approach to dimension reduction for supervised learning: feature extraction and variable selection; the former constructs a small number of features from the predictors, and the latter finds a subset of the predictors. First, a method of linear feature extraction is proposed using the gradient of the regression function, building on recent developments in kernel methods. Compared with existing methods, the proposed one is widely applicable without strong assumptions on the regressor or the type of variables, and uses a computationally simple eigendecomposition, making it applicable to large data sets. Second, in combination with a sparsity penalty, the method is extended to variable selection, following the approach of Chen et al. [2]. Experimental results show that the proposed methods successfully find effective features and variables without parametric models.
dc.type: Conference Paper
dc.contributor.department: STATISTICS & APPLIED PROBABILITY
dc.description.sourcetitle: Advances in Neural Information Processing Systems
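The linear feature-extraction step described in the abstract — estimating the gradient of the regression function with kernels and eigendecomposing the resulting gradient outer-product matrix — can be sketched roughly as follows. This is a minimal NumPy illustration, not the authors' code: it assumes Gaussian kernels, a median-heuristic bandwidth, and a small ridge regularizer; the function name `gkdr`, parameter names, and default values are our own choices for illustration.

```python
import numpy as np

def gkdr(X, Y, dim, sigma_x=None, sigma_y=None, eps=1e-4):
    """Sketch of gradient-based kernel dimension reduction.

    Returns a d x dim matrix whose columns (top eigenvectors of an
    estimated gradient outer-product matrix) span the extracted
    linear feature subspace.
    """
    n, d = X.shape
    Y = np.asarray(Y).reshape(n, -1)

    # Median heuristic for Gaussian kernel bandwidths (a common default,
    # assumed here rather than taken from the paper).
    def median_bw(Z):
        dists = np.sqrt(((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1))
        return np.median(dists[dists > 0])

    if sigma_x is None:
        sigma_x = median_bw(X)
    if sigma_y is None:
        sigma_y = median_bw(Y)

    # Gaussian Gram matrices for predictors and responses.
    sqx = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    Kx = np.exp(-sqx / (2.0 * sigma_x ** 2))
    sqy = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    Ky = np.exp(-sqy / (2.0 * sigma_y ** 2))

    # Regularized inverse (Kx + n*eps*I)^{-1}, then the symmetric
    # middle factor H = (Kx + n*eps*I)^{-1} Ky (Kx + n*eps*I)^{-1}.
    F = np.linalg.solve(Kx + n * eps * np.eye(n), np.eye(n))
    H = F @ Ky @ F

    # Gradient of k(x, X_j) at x = X_i is -(X_i - X_j)/sigma^2 * Kx[i, j];
    # accumulate the gradient outer-product matrix M.
    M = np.zeros((d, d))
    for i in range(n):
        Psi = -(Kx[:, i, None] * (X[i] - X)) / sigma_x ** 2  # n x d
        M += Psi.T @ H @ Psi
    M /= n

    # Eigenvectors of the symmetric PSD matrix M, largest eigenvalues first.
    _, V = np.linalg.eigh(M)
    return V[:, ::-1][:, :dim]
```

For example, if Y depends on a 5-dimensional X only through its first coordinate, the single extracted direction returned by `gkdr(X, Y, dim=1)` should align closely with the first coordinate axis, without fitting any parametric regression model.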
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.