Please use this identifier to cite or link to this item: http://scholarbank.nus.edu.sg/handle/10635/16272
Title: Dimensionality reduction by kernel CCA in reproducing kernel Hilbert spaces
Authors: ZHU XIAOFENG
Keywords: Dimensionality Reduction, Kernel Canonical Correlation Analysis, Reproducing Kernel Hilbert Spaces, Principal Component Analysis, Kernel Methods
Issue Date: 1-Jul-2009
Source: ZHU XIAOFENG (2009-07-01). Dimensionality reduction by kernel CCA in reproducing kernel Hilbert spaces. ScholarBank@NUS Repository.
Abstract: In this thesis, we employ a multi-modal method, kernel canonical correlation analysis, named RKCCA to implement dimensionality reduction for high-dimensional data. Our RKCCA method first maps the original data into the Reproducing Kernel Hilbert Space (RKHS) by explicit kernel functions, whereas the traditional KCCA (referred to as spectrum KCCA) projects the input into a high-dimensional Hilbert space by implicit kernel functions. This makes the RKCCA method more amenable to theoretical development. Furthermore, we prove the equivalence between our RKCCA and spectrum KCCA. In the RKHS, we prove that the RKCCA method can be decomposed into two separate steps: principal component analysis (PCA) followed by canonical correlation analysis (CCA). We also prove that this decomposition rule is preserved when implementing dimensionality reduction in the RKHS. Experimental results on real-world datasets show that the presented method yields better performance than state-of-the-art algorithms in terms of classification accuracy and the effectiveness of dimensionality reduction.
URI: http://scholarbank.nus.edu.sg/handle/10635/16272
Appears in Collections:Master's Theses (Open)

Files in This Item:
File: ZhuXF.pdf (289.18 kB, Adobe PDF, Open Access)

Page view(s): 321 (checked on Dec 11, 2017)
Download(s): 276 (checked on Dec 11, 2017)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.