Please use this identifier to cite or link to this item:
Title: ONLINE GAUSSIAN PROCESS FILTERING FOR PERSISTENT ROBOT LOCALIZATION WITH ARBITRARY SENSOR MODALITIES
Authors: XU NUO
Keywords: Localization, Gaussian Process, Robot, Filtering
Issue Date: 19-Aug-2016
Citation: XU NUO (2016-08-19). ONLINE GAUSSIAN PROCESS FILTERING FOR PERSISTENT ROBOT LOCALIZATION WITH ARBITRARY SENSOR MODALITIES. ScholarBank@NUS Repository.
Abstract: Robot localization is widely recognized as one of the most fundamental problems in mobile robotics. This thesis describes a persistent, general-purpose localization framework that can efficiently and effectively localize a robot in a dynamically changing environment using multiple sensor modalities. In the robotics community, the localization problem is usually addressed with a probabilistic state estimation framework known as the Bayes filter. The Bayes filter repeatedly updates the belief of a robot's location/state by assimilating, through its observation model, the field measurements taken during the robot's exploration. To preserve time efficiency, the observation model is usually assumed to be well-trained with prior training data, which is not always feasible for some sensor types due to a limited sampling budget, changing environments, etc. To resolve this problem, we propose the GP-Localize algorithm, which can utilize past observations to learn the observation model online during localization while incurring constant time and memory (in the size of the data) per localization step. We empirically demonstrate that GP-Localize outperforms existing Gaussian process localization algorithms in terms of localization performance. In practice, measurements from certain types of sensors are not informative if they are very similar across a large space (exceedingly high spatial correlation) or suffer from the perceptual aliasing problem, where the same measurements are observed at two different locations.
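The Bayes filter update described above can be sketched as a simple particle filter in which a predictive mean function stands in for the learned observation model. This is a minimal illustrative sketch, not the thesis's actual GP-Localize implementation: the 1-D state space, the monotone measurement field, and all function names are assumptions made here for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def obs_likelihood(z, x, mu_fn, sigma=0.5):
    # Gaussian likelihood of measurement z given the observation model's
    # predictive mean mu_fn(x) at candidate location x (predictive
    # variance folded into the fixed sigma for simplicity).
    mu = mu_fn(x)
    return np.exp(-0.5 * ((z - mu) / sigma) ** 2)

def bayes_filter_step(particles, u, z, mu_fn, motion_noise=0.1):
    # Prediction: propagate each particle through a noisy motion model.
    particles = particles + u + rng.normal(0.0, motion_noise, len(particles))
    # Correction: weight particles by the observation likelihood, resample.
    w = obs_likelihood(z, particles, mu_fn)
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

# Hypothetical measurement field: value grows linearly with location.
field = lambda x: 0.5 * x
particles = rng.uniform(0.0, 6.0, 500)
for t in range(20):
    true_x = 1.0 + 0.1 * t                      # robot drifts rightward
    particles = bayes_filter_step(particles, 0.1, field(true_x), field)
```

In GP-Localize, `mu_fn` would be the GP posterior learned online from past observations rather than a fixed field, with the data summarized so each step stays constant-time.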
Thus, it is potentially helpful to fuse multiple types of sensor measurements to resolve these issues, since there is a higher chance that some of them can disambiguate the robot's location. To demonstrate the feasibility of extending the range of environments and application domains for robot localization, we present the CMOGP-Localize algorithm, which can fuse multiple types of sensory measurements and efficiently exploit the correlation among them through our novel online sparse convolutional multi-output Gaussian process (CMOGP). A theoretical analysis demonstrates the equivalence of our online sparse CMOGP to the online learning variant of our proposed offline generalized partially independent training conditional approximation. Through three real-world robot experiments, we demonstrate the scalability of CMOGP-Localize, which incurs constant time and memory (in the size of the data) per localization step, and show that its performance improves significantly by exploiting the correlation among multiple types of measurements. GP-Localize and CMOGP-Localize scale very well in the size of the data per time step because they summarize and assimilate the data into a compact structure for efficient prediction. However, this high scalability comes at the price of information loss in the training data of the observation model. To resolve this problem, we integrate a local augmentation approximation into both algorithms, which significantly improves their performance. In particular, we develop a more sophisticated online sparse GP/CMOGP model that exploits local observations together with the summary information used in GP-Localize and CMOGP-Localize, and we prove that both algorithms still incur constant time and memory after incorporating local augmentation, with a performance guarantee.
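The motivation for fusing modalities can be illustrated with a toy example: a periodic field alone is perceptually aliased, but combining its likelihood with a second, monotone field singles out the true location. This sketch assumes independent per-modality Gaussian likelihoods purely for brevity; the thesis's CMOGP instead models the cross-modality correlation explicitly, and both fields here are hypothetical.

```python
import numpy as np

# Two hypothetical sensor fields over a 1-D space: the first is periodic
# (aliased, since sin(x) == sin(pi - x)), the second is monotone.
field_a = lambda x: np.sin(x)
field_b = lambda x: 0.2 * x

def fused_likelihood(x, z_a, z_b, sigma=0.3):
    # Product of per-modality Gaussian likelihoods (independence assumed
    # here for simplicity; CMOGP would exploit their correlation instead).
    la = np.exp(-0.5 * ((z_a - field_a(x)) / sigma) ** 2)
    lb = np.exp(-0.5 * ((z_b - field_b(x)) / sigma) ** 2)
    return la * lb

xs = np.linspace(0.0, np.pi, 1000)
true_x = 0.5
# Field a alone admits two candidate locations (0.5 and pi - 0.5);
# fusing in field b leaves a single peak at the true location.
post = fused_likelihood(xs, field_a(true_x), field_b(true_x))
est = xs[np.argmax(post)]
```

The same mechanism underlies the empirical gains reported for CMOGP-Localize: modalities that are individually ambiguous become jointly informative.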
Real-world robot experiments demonstrate significant improvements in the performance and robustness of GP-Localize and CMOGP-Localize after incorporating local augmentation.
URI: http://scholarbank.nus.edu.sg/handle/10635/135600
Appears in Collections: Ph.D Theses (Open)
Files in This Item: XuN.pdf (7.62 MB, Adobe PDF)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.