Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/242648
Title: ROBUSTIFYING SEQUENTIAL NEURAL POSTERIOR ESTIMATION
Authors: AMIT SHARMA
ORCID iD:   orcid.org/0009-0009-7336-7376
Keywords: ABC, Bayesian, SBI, Neural, Sequential, SNPE
Issue Date: 17-Mar-2023
Citation: AMIT SHARMA (2023-03-17). ROBUSTIFYING SEQUENTIAL NEURAL POSTERIOR ESTIMATION. ScholarBank@NUS Repository.
Abstract: For complex statistical models, calculation of the likelihood function can sometimes be impractical. However, if data can be simulated from the model for any value of the model parameter, then Bayesian inference can be performed using likelihood-free inference methods, which avoid likelihood calculations. Because likelihood-free methods are used in complex situations where model understanding is difficult, there is often a risk of model misspecification that can negatively impact model-based inference and decision-making. It is therefore important to develop likelihood-free methods that are robust to model misspecification. There has been much recent work on handling model misspecification for traditional likelihood-free inference approaches such as approximate Bayesian computation (ABC) and synthetic likelihood, but only limited discussion of the problem in the context of the increasingly popular neural methods. The purpose of this thesis is to investigate ways to robustify likelihood-free inference for sequential neural posterior estimation (SNPE) algorithms. A new approach to this problem is proposed, and its performance is investigated on benchmark simulated and real examples.
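As a rough illustration of the simulation-only idea described in the abstract (and not the SNPE robustification approach developed in the thesis), the following minimal rejection-ABC sketch infers the mean of a toy Gaussian model. The prior, simulator, summary statistics, tolerance, and data are all hypothetical choices made purely for the example.

    # Minimal rejection-ABC sketch (illustration only, not the thesis's method).
    # Toy setup: theta ~ N(0, 1) prior, data simulated as N(theta, 1).
    import numpy as np

    rng = np.random.default_rng(0)
    observed = rng.normal(1.5, 1.0, size=50)   # hypothetical "observed" dataset

    def summary(x):
        # Reduce a dataset to low-dimensional summary statistics.
        return np.array([x.mean(), x.std()])

    def simulate(theta, n=50):
        # Simulator: data can be generated, but no likelihood is ever evaluated.
        return rng.normal(theta, 1.0, size=n)

    obs_summary = summary(observed)
    tolerance = 0.2
    accepted = []
    for _ in range(20000):
        theta = rng.normal(0.0, 1.0)                 # draw a parameter from the prior
        sim_summary = summary(simulate(theta))
        if np.linalg.norm(sim_summary - obs_summary) < tolerance:
            accepted.append(theta)                   # keep parameters whose simulations
                                                     # resemble the observed data

    posterior_samples = np.array(accepted)
    print(f"accepted {posterior_samples.size} draws, "
          f"approximate posterior mean {posterior_samples.mean():.2f}")

Neural methods such as SNPE replace the accept/reject step with a learned conditional density estimator trained on simulated (parameter, data) pairs, but the same principle applies: inference relies only on the ability to simulate from the model.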
URI: https://scholarbank.nus.edu.sg/handle/10635/242648
Appears in Collections:Master's Theses (Open)

Files in This Item:
File: Amit_Sharma_Thesis_Submission_v2.0.pdf
Size: 640.96 kB
Format: Adobe PDF
Access Settings: OPEN
Version: None


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.