Title: Parallel Gaussian process regression with low-rank covariance matrix approximations
Authors: Chen, J.; Cao, N.; Low, K.H.; Ouyang, R.; Tan, C.K.-Y.; Jaillet, P.
Issue Date: 2013
Citation: Chen, J., Cao, N., Low, K.H., Ouyang, R., Tan, C.K.-Y., Jaillet, P. (2013). Parallel Gaussian process regression with low-rank covariance matrix approximations. Uncertainty in Artificial Intelligence - Proceedings of the 29th Conference, UAI 2013: 152-161. ScholarBank@NUS Repository.
Abstract: Gaussian processes (GPs) are Bayesian non-parametric models widely used for probabilistic regression. Unfortunately, they cannot scale to large data nor support real-time prediction because of their cubic time cost in the data size. This paper presents two parallel GP regression methods that exploit low-rank covariance matrix approximations to distribute the computational load among parallel machines, thereby achieving time efficiency and scalability. We theoretically guarantee that the predictive performance of each proposed parallel GP is equivalent to that of a centralized approximate GP regression method: the computation of the centralized counterpart can be distributed among parallel machines, achieving greater time efficiency and scalability. We analytically compare the properties of our parallel GPs, such as their time, space, and communication complexity. Empirical evaluation on two real-world datasets on a cluster of 20 computing nodes shows that our parallel GPs are significantly more time-efficient and scalable than their centralized counterparts and the exact/full GP, while achieving predictive performance comparable to the full GP.
Source Title: Uncertainty in Artificial Intelligence - Proceedings of the 29th Conference, UAI 2013
URI: http://scholarbank.nus.edu.sg/handle/10635/78277
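The abstract's key idea, distributing a low-rank GP approximation so that each machine contributes only local sufficient statistics, can be illustrated with a standard Subset-of-Regressors (Nystrom-style) approximation. This is a minimal sketch, not the paper's actual algorithms: the `rbf` kernel, the inducing-point set `Z`, and the three-way data split are all illustrative assumptions. Each "machine" computes the small matrices `K_um K_um^T` and `K_um y_m` from its data shard; only these low-rank statistics need to be communicated and summed before prediction.

```python
import numpy as np

def rbf(A, B, ls=1.0):
    """Squared-exponential kernel matrix between row-vector sets A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-0.5 * d2 / ls**2)

def local_stats(Xm, ym, Z, ls=1.0):
    """Per-machine sufficient statistics computed from one data shard.

    Returns (K_um K_um^T, K_um y_m), both of size O(|Z|), regardless of
    how large the local shard (Xm, ym) is.
    """
    Kum = rbf(Z, Xm, ls)
    return Kum @ Kum.T, Kum @ ym

def sor_predict(blocks, Z, Xs, noise=0.1, ls=1.0):
    """Sum the per-machine statistics and form the SoR predictive mean at Xs."""
    Kuu = rbf(Z, Z, ls)
    A = sum(b[0] for b in blocks)   # sum of K_um K_um^T over machines
    r = sum(b[1] for b in blocks)   # sum of K_um y_m over machines
    Sigma = Kuu + A / noise**2 + 1e-8 * np.eye(len(Z))  # jitter for stability
    return rbf(Xs, Z, ls) @ np.linalg.solve(Sigma, r) / noise**2
```

Usage: split the training set into shards, call `local_stats` on each (in parallel, in a real system), then combine once centrally. The cost per machine is linear in its shard size and the combination step depends only on the number of inducing points, which is what makes this style of approximation attractive for parallelization.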
Appears in Collections: Staff Publications