Please use this identifier to cite or link to this item:
https://scholarbank.nus.edu.sg/handle/10635/231424
Title: DEEP GENERATIVE MODELING FOR IMAGES AND TIME SERIES
Authors: ABDUL FATIR ANSARI
ORCID iD: orcid.org/0000-0003-0263-9142
Keywords: Generative Models, Generative Adversarial Networks, Variational Inference, Gradient Flows, State Space Models, Machine Learning
Issue Date: 23-Jan-2022
Citation: ABDUL FATIR ANSARI (2022-01-23). DEEP GENERATIVE MODELING FOR IMAGES AND TIME SERIES. ScholarBank@NUS Repository.
Abstract: Deep generative models (DGMs) have excelled at modeling high-dimensional data. In this thesis, we propose new DGMs and methods for image and time series data that address research challenges associated with DGMs. We begin with DGMs for images, offering a fresh perspective on the problem of learning in generative adversarial networks (GANs) by formulating it as minimizing the distance between characteristic functions. Further, since sample quality is generally inconsistent for any given generative model, we introduce a technique that refines samples using the gradient flow of f-divergences. In the second part, we focus on DGMs for time series. We propose a flexible state space model, parameterized by neural networks, that can identify both state- and time-dependent switching dynamics in time series data. State-dependent switching is enabled by a recurrent state-to-switch connection, and explicit duration count variables are used to improve the time-dependent switching behavior.
URI: https://scholarbank.nus.edu.sg/handle/10635/231424
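The abstract frames GAN training as minimizing a distance between characteristic functions. The exact estimator and weighting distribution used in the thesis are given in the full text; purely as an illustrative sketch (PyTorch, assuming an isotropic Gaussian weighting over frequencies and batches of real and generated samples), an empirical characteristic function distance could be computed as follows:

```python
import torch

def empirical_cf(x, t):
    """Empirical characteristic function of samples x evaluated at frequencies t.

    x: (n, d) batch of samples; t: (k, d) frequency vectors.
    Returns a (k,) complex tensor approximating E_x[exp(i * t^T x)].
    """
    proj = t @ x.T                      # (k, n) inner products t^T x
    return torch.complex(torch.cos(proj).mean(dim=1),
                         torch.sin(proj).mean(dim=1))

def cf_distance(real, fake, num_freqs=128, scale=1.0):
    """Monte-Carlo estimate of a weighted characteristic function distance.

    Frequencies are drawn from an isotropic Gaussian weighting distribution
    (an assumption of this sketch); the distance is the mean squared modulus
    of the difference between the two empirical characteristic functions.
    """
    t = scale * torch.randn(num_freqs, real.shape[1], device=real.device)
    diff = empirical_cf(real, t) - empirical_cf(fake, t)
    return (diff.abs() ** 2).mean()

# usage: real and fake are (batch, dim) tensors, e.g. flattened images
# loss = cf_distance(real, generator(noise).flatten(1))
```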
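The sample-refinement technique mentioned in the abstract follows the gradient flow of an f-divergence; the specific divergences and density-ratio estimators are detailed in the thesis. As a rough sketch only (assuming a pretrained discriminator whose logit approximates the log data-to-model density ratio, and specializing to the KL case, where the drift of the flow reduces to the gradient of that logit), a discretized refinement loop might look like:

```python
import torch

def refine_samples(x, discriminator, steps=25, step_size=0.01, noise_scale=0.01):
    """Refine generated samples by simulating a discretized gradient flow.

    Assumes `discriminator` returns per-sample logits approximating the log
    density ratio log(p_data / p_model) (an assumption of this sketch). Each
    Euler-Maruyama step ascends the estimated log ratio and adds a small
    amount of Gaussian noise.
    """
    x = x.detach().clone().requires_grad_(True)
    for _ in range(steps):
        logits = discriminator(x)
        drift = torch.autograd.grad(logits.sum(), x)[0]
        with torch.no_grad():
            x = x + step_size * drift + noise_scale * torch.randn_like(x)
        x.requires_grad_(True)
    return x.detach()
```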
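Finally, the switching state space model described in the abstract uses a recurrent state-to-switch connection, so the switch distribution at each step depends on the previous continuous state as well as the previous switch. The toy sketch below (PyTorch; all module and variable names are hypothetical, and the explicit duration count variables from the thesis are omitted for brevity) illustrates only that connection during ancestral sampling:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SwitchingSSMSketch(nn.Module):
    """Toy switching state space model with a recurrent state-to-switch
    connection (names are hypothetical; duration variables omitted)."""

    def __init__(self, num_switches=3, state_dim=4, obs_dim=2, hidden=32):
        super().__init__()
        # switch distribution conditioned on previous switch and previous state
        self.switch_net = nn.Sequential(
            nn.Linear(num_switches + state_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, num_switches))
        # one linear state-transition map per switch value
        self.trans_nets = nn.ModuleList(
            nn.Linear(state_dim, state_dim) for _ in range(num_switches))
        self.emit_net = nn.Linear(state_dim, obs_dim)
        self.num_switches, self.state_dim = num_switches, state_dim

    @torch.no_grad()
    def sample(self, T=10):
        z = torch.zeros(self.state_dim)         # continuous state z_0
        s = torch.zeros(self.num_switches)      # one-hot switch s_0
        obs = []
        for _ in range(T):
            # state-to-switch connection: switch depends on z_{t-1} and s_{t-1}
            logits = self.switch_net(torch.cat([s, z]))
            idx = int(torch.distributions.Categorical(logits=logits).sample())
            s = F.one_hot(torch.tensor(idx), self.num_switches).float()
            # the sampled switch selects which dynamics drive the state
            z = self.trans_nets[idx](z) + 0.1 * torch.randn(self.state_dim)
            obs.append(self.emit_net(z))
        return torch.stack(obs)                 # (T, obs_dim) sequence
```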
Appears in Collections: Ph.D Theses (Open)
Files in This Item:
| File | Description | Size | Format | Access Settings | Version |
|---|---|---|---|---|---|
| AnsariAF.pdf | | 19.38 MB | Adobe PDF | OPEN | None |