Title: COMPUTATIONAL METHODS FOR MULTIPLE TARGET TRACKING
Authors: ZENG JIAJIE
Keywords: Multiple Target Tracking, MCMC, Bayesian Inference, Parameter Estimation, Filtering, Smoothing
Issue Date: 23-Jan-2020
Citation: ZENG JIAJIE (2020-01-23). COMPUTATIONAL METHODS FOR MULTIPLE TARGET TRACKING. ScholarBank@NUS Repository.
Abstract: This thesis provides comprehensive solutions to the multi-target tracking (MTT) problem. We first build a Markov chain Monte Carlo (MCMC) method on the set of data associations, with the objective of identifying the optimal association. The novelty of the proposed MCMC algorithm is that an approximate multi-target filtering method is used to propose candidate associations. We then extend the MCMC method to a larger space that includes the objects' trajectories. Since the dimension of the targets' states changes across MCMC proposals, reversible jump MCMC (RJ-MCMC) is applied to estimate the distribution on the extended space. For this part, we divide the discussion into two cases: linear and non-linear models. In the linear model, the posterior distribution can be derived in closed form, so we can sample trajectories directly using backward sampling (a sketch of this step follows below). In the non-linear case, however, the posterior is intractable in closed form, and conditional particle filters are used to target the correct distribution. Finally, we discuss parameter estimation in our model using Gibbs sampling, covering all the static parameters of the model. We further solve a widespread application problem known as sensor registration based on our method.
URI: https://scholarbank.nus.edu.sg/handle/10635/171654
Appears in Collections: Ph.D Theses (Open)
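
The backward-sampling step mentioned in the abstract for the linear-Gaussian case is commonly realized as forward filtering backward sampling (FFBS): a Kalman filter pass followed by sampling the trajectory from back to front. The sketch below is a minimal, illustrative implementation under the standard model x_t = F x_{t-1} + N(0, Q), y_t = H x_t + N(0, R); the function name ffbs, the variable names, and the model conventions are assumptions for illustration, not taken from the thesis itself.

import numpy as np

def ffbs(y, F, H, Q, R, m0, P0, rng=None):
    """Draw one trajectory from p(x_{0:T-1} | y_{0:T-1}) for the
    linear-Gaussian model x_t = F x_{t-1} + N(0, Q), y_t = H x_t + N(0, R).
    y has shape (T, obs_dim); m0, P0 are the prior moments of the state."""
    rng = np.random.default_rng() if rng is None else rng
    T, d = len(y), m0.shape[0]
    ms, Ps = np.empty((T, d)), np.empty((T, d, d))
    m, P = m0, P0
    # Forward pass: standard Kalman filter, storing the filtered moments.
    for t in range(T):
        m_pred, P_pred = F @ m, F @ P @ F.T + Q
        S = H @ P_pred @ H.T + R                 # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
        m = m_pred + K @ (y[t] - H @ m_pred)
        P = P_pred - K @ S @ K.T
        ms[t], Ps[t] = m, P
    # Backward pass: sample the last state from the filter,
    # then x_t | x_{t+1}, y_{0:t} recursively back to t = 0.
    x = np.empty((T, d))
    x[-1] = rng.multivariate_normal(ms[-1], Ps[-1])
    for t in range(T - 2, -1, -1):
        P_pred = F @ Ps[t] @ F.T + Q
        G = Ps[t] @ F.T @ np.linalg.inv(P_pred)  # backward (smoothing) gain
        mean = ms[t] + G @ (x[t + 1] - F @ ms[t])
        cov = Ps[t] - G @ P_pred @ G.T
        x[t] = rng.multivariate_normal(mean, cov)
    return x

Each call returns one exact draw of the full trajectory, which is what an RJ-MCMC sweep over associations and trajectories would consume per iteration. In the non-linear case this exact backward pass is unavailable, which is why the abstract turns to conditional particle filters to target the same distribution approximately.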

Files in This Item:
File: Thesis_final_ZengJiajie.pdf
Size: 4.89 MB
Format: Adobe PDF
Access Settings: OPEN

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.