Title: A comparison of PCA, KPCA and ICA for dimensionality reduction in support vector machine
Authors: Cao, L.J.
Chua, K.S. 
Chong, W.K.
Lee, H.P.
Gu, Q.M.
Keywords: Independent component analysis
Kernel principal component analysis
Principal component analysis
Support vector machines
Issue Date: Sep-2003
Source: Cao, L.J., Chua, K.S., Chong, W.K., Lee, H.P., Gu, Q.M. (2003-09). A comparison of PCA, KPCA and ICA for dimensionality reduction in support vector machine. Neurocomputing 55 (1-2) : 321-336. ScholarBank@NUS Repository.
Abstract: Recently, the support vector machine (SVM) has become a popular tool in time series forecasting. In developing a successful SVM forecaster, the first step is feature extraction. This paper proposes applying principal component analysis (PCA), kernel principal component analysis (KPCA) and independent component analysis (ICA) to SVM for feature extraction. PCA linearly transforms the original inputs into new, uncorrelated features. KPCA is a nonlinear extension of PCA developed using the kernel method. In ICA, the original inputs are linearly transformed into features that are mutually statistically independent. Experiments on the sunspot data, Santa Fe data set A and five real futures contracts show that an SVM with feature extraction by PCA, KPCA or ICA performs better than one without feature extraction. Among the three methods, KPCA feature extraction gives the best performance, followed by ICA feature extraction. © 2003 Elsevier B.V. All rights reserved.
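The pipeline the abstract describes — transform lagged time-series inputs with PCA, KPCA or ICA, then train an SVM on the extracted features — can be sketched as follows. This is a minimal illustration using scikit-learn (an assumption; the paper predates this library and its exact experimental settings are not reproduced here), with a synthetic series standing in for the sunspot data:

```python
# Sketch of the paper's setup: extract features with PCA, KPCA or ICA,
# then fit an SVM regressor on them. All parameter choices (5 components,
# RBF kernels, 10 lags) are illustrative assumptions, not the paper's.
import numpy as np
from sklearn.decomposition import PCA, KernelPCA, FastICA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
t = np.arange(300, dtype=float)
series = np.sin(0.3 * t) + 0.1 * rng.standard_normal(300)  # toy series

# Build lagged input windows: predict x[t] from the previous 10 values.
lags = 10
X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
y = series[lags:]
X_train, X_test = X[:250], X[250:]
y_train, y_test = y[:250], y[250:]

# The three feature extractors compared in the paper.
extractors = {
    "PCA": PCA(n_components=5),
    "KPCA": KernelPCA(n_components=5, kernel="rbf"),
    "ICA": FastICA(n_components=5, random_state=0),
}
for name, extractor in extractors.items():
    model = make_pipeline(StandardScaler(), extractor, SVR(kernel="rbf"))
    model.fit(X_train, y_train)
    print(f"{name}: R^2 = {model.score(X_test, y_test):.3f}")
```

Each pipeline standardizes the lagged inputs, projects them onto five extracted components, and fits the SVM on those components rather than on the raw window, mirroring the comparison reported in the abstract.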
Source Title: Neurocomputing
ISSN: 0925-2312
DOI: 10.1016/S0925-2312(03)00433-8
Appears in Collections:Staff Publications

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.