Please use this identifier to cite or link to this item: http://scholarbank.nus.edu.sg/handle/10635/134437
Title: RISK-AVERSE AND AMBIGUITY-AVERSE MARKOV DECISION PROCESSES
Authors: YU PENGQIAN
Keywords: Markov decision processes, Parameter uncertainty, Risk measures, Dynamic programming, Approximation algorithms, Reinforcement learning
Issue Date: 8-Aug-2016
Source: YU PENGQIAN (2016-08-08). RISK-AVERSE AND AMBIGUITY-AVERSE MARKOV DECISION PROCESSES. ScholarBank@NUS Repository.
Abstract: Markov decision processes (MDPs) are powerful tools for planning tasks and sequential decision-making problems. Many real-world decision problems involve uncertainty, arising either from parameter uncertainty in the model or from noise in the system dynamics. In this work, we investigate two approaches to handling such uncertainty. The first approach deals with an MDP model with parameter uncertainty, termed ambiguity-averse (robust) MDPs; we generalize distributionally robust MDPs to a broader class of models. The second approach considers static and dynamic risk measures of the return, termed risk-averse MDPs. For static risk measures, we first solve probabilistic goal MDPs approximately using a machine-learning technique. We then provide a dynamic programming (DP) framework that solves risk-averse MDPs with general risk measures exactly, and further develop an approximate algorithm with the help of a central limit theorem. For dynamic risk measures, we extend the latest DP-based method and propose simulation-based algorithms.
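
For readers unfamiliar with the ambiguity-averse (robust) MDP setting mentioned in the abstract, the following is a minimal illustrative sketch, not taken from the thesis: robust value iteration on a finite MDP, where the Bellman backup takes the worst case over a small, hypothetical finite set of candidate transition kernels. All names and the toy problem data are assumptions made for illustration.

```python
import numpy as np

# Minimal robust value iteration sketch (illustrative only, not the thesis's algorithm).
# States and actions are indices; ambiguity is a finite set of candidate transition kernels.
# P_set[k][a][s, s'] = probability of moving s -> s' under action a in candidate model k.

def robust_value_iteration(P_set, R, gamma=0.95, tol=1e-8, max_iter=10_000):
    """Compute the worst-case (robust) optimal value function and a greedy policy."""
    n_states, n_actions = R.shape
    V = np.zeros(n_states)
    for _ in range(max_iter):
        # Q[k, a, s]: value of action a in state s under candidate model k
        Q = np.array([[P_set[k][a] @ V for a in range(n_actions)]
                      for k in range(len(P_set))])           # shape (K, A, S)
        Q = R.T[None, :, :] + gamma * Q                      # add immediate reward
        Q_worst = Q.min(axis=0)                               # worst case over models
        V_new = Q_worst.max(axis=0)                            # best action per state
        if np.max(np.abs(V_new - V)) < tol:
            V = V_new
            break
        V = V_new
    policy = Q_worst.argmax(axis=0)
    return V, policy

# Toy example: 2 states, 2 actions, 2 candidate transition models (all data made up).
R = np.array([[1.0, 0.0],     # R[s, a]
              [0.0, 2.0]])
P1 = [np.array([[0.9, 0.1], [0.2, 0.8]]),    # action 0
      np.array([[0.5, 0.5], [0.6, 0.4]])]    # action 1
P2 = [np.array([[0.7, 0.3], [0.4, 0.6]]),
      np.array([[0.3, 0.7], [0.8, 0.2]])]
V, policy = robust_value_iteration([P1, P2], R)
print("robust values:", V, "greedy policy:", policy)
```

The design point the sketch illustrates is the nested min-max in the backup: the agent maximizes over actions while nature minimizes over the candidate models, which is the basic structure that distributionally robust MDPs generalize.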
URI: http://scholarbank.nus.edu.sg/handle/10635/134437
Appears in Collections:Ph.D Theses (Open)

Files in This Item:
File: YuPQ.pdf
Size: 1.92 MB
Format: Adobe PDF
Access Settings: Open
Version: None

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.