Please use this identifier to cite or link to this item:
Title: Asymptotic error bounds for kernel-based Nyström low-rank approximation matrices
Authors: Chang, L.-B.
Bai, Z. 
Huang, S.-Y.
Hwang, C.-R.
Keywords: Asymptotic error bound
Kernel Gram matrix
Nyström approximation
Spectrum decomposition
Wishart random matrix
Issue Date: Sep-2013
Citation: Chang, L.-B., Bai, Z., Huang, S.-Y., Hwang, C.-R. (2013-09). Asymptotic error bounds for kernel-based Nyström low-rank approximation matrices. Journal of Multivariate Analysis 120: 102-119. ScholarBank@NUS Repository.
Abstract: Many kernel-based learning algorithms have a computational load that scales with the sample size n, owing to the column size of a full kernel Gram matrix K. This article considers the Nyström low-rank approximation. It uses a reduced kernel K̂, which is n × m, consisting of m columns (say columns i1, i2, …, im) randomly drawn from K. This approximation takes the form K ≈ K̂U⁻¹K̂ᵀ, where U is the reduced m × m matrix formed by rows i1, i2, …, im of K̂. Since m is often much smaller than the sample size n, the reduced kernel is a thin rectangular matrix, and the resulting learning algorithms scale with the column size m. The quality of a matrix approximation can be assessed by the closeness of its eigenvalues and eigenvectors to those of the original matrix. In this article, asymptotic error bounds on eigenvalues and eigenvectors are derived for the Nyström low-rank approximation matrix. © 2013 Elsevier Inc.
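The construction described in the abstract can be sketched in a few lines of numpy. This is an illustrative toy example, not the paper's analysis: the data, the RBF kernel, the sizes n and m, and the use of a pseudoinverse for U (to guard against ill-conditioning) are all choices made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data and an RBF kernel Gram matrix (illustrative choices only).
n, m = 500, 50
X = rng.standard_normal((n, 2))

def rbf_kernel(A, B, gamma=1.0):
    # Squared Euclidean distances between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

K = rbf_kernel(X, X)                 # full n x n Gram matrix

# Draw m column indices i1, ..., im at random without replacement.
idx = rng.choice(n, size=m, replace=False)
K_hat = K[:, idx]                    # reduced n x m kernel
U = K_hat[idx, :]                    # m x m matrix: rows i1, ..., im of K_hat

# Nyström approximation K ≈ K̂ U⁻¹ K̂ᵀ; pinv stands in for the inverse
# in case U is ill-conditioned.
K_nys = K_hat @ np.linalg.pinv(U) @ K_hat.T

rel_err = np.linalg.norm(K - K_nys) / np.linalg.norm(K)
print(rel_err)
```

Note that the approximation reproduces the sampled block exactly: restricting K_nys to rows and columns i1, …, im gives U · U⁻¹ · U = U.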
Source Title: Journal of Multivariate Analysis
ISSN: 0047-259X
DOI: 10.1016/j.jmva.2013.05.006
Appears in Collections: Staff Publications
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.