Please use this identifier to cite or link to this item:
Title: A LEARNABLE BEHAVIOURAL MODEL FOR EMERGENCY EVACUATIONS
Authors: MUHAMMAD SHALIHIN BIN OTHMAN
ORCID iD: orcid.org/0000-0003-0069-2329
Keywords: Modelling and Simulation, Neural Networks, Conscious Movement Model, Behavioural Modelling, Behavioural Learning, Computer Vision
Issue Date: 16-Jul-2021
Citation: MUHAMMAD SHALIHIN BIN OTHMAN (2021-07-16). A LEARNABLE BEHAVIOURAL MODEL FOR EMERGENCY EVACUATIONS. ScholarBank@NUS Repository.
Abstract:
Modelling and simulation of crises is a research topic of practical importance to the world at large. Various studies have sought to optimize simulation performance in terms of accuracy and speed, so as to replicate real-life scenarios in real time. Recent work has also introduced intelligence into Multi-Agent Systems (MAS) for crisis simulation by incorporating psychological behaviours from the social sciences and by using data-driven machine learning models with predictive capabilities. Through comprehensive background research and takeaways from related work, we identified existing issues in crisis simulation and attempted to bridge pertinent knowledge gaps. This thesis delineates the technologies available today and the possibilities for improving crisis simulation, focusing particularly on emergency evacuations. As machine learning has advanced considerably in recent years, numerous attempts have been made to integrate its methodologies into various disciplines. We investigated state-of-the-art machine learning techniques to identify methods that can empower a behavioural model and thereby improve the realism of human behaviour and reactions during emergencies. We also explored the social science behind human characteristics and reactions to changing environments, and how memory and attention to surrounding forces can affect the decision-making process.

Our main contribution is a novel Conscious Movement Model (CMM) that dynamically reflects human behaviours in and out of an emergency situation. To supplement the CMM, we attached a Conscious Movement Memory-Attention (CMMA) mechanism that maps a pedestrian's attention to their surroundings and captures how prior decisions affect their next move. We then proposed an efficient methodology for training the CMMA mechanism through pedestrian tracking in video footage using the CMM itself; this method allows continuous, incremental learning of the CMMA model from real-life video. The trained behavioural model can then be attached to intelligent agents in a crisis simulation to achieve realistic outcomes. Finally, we introduced a simulation framework that automates strategy management and planning for different emergency-evacuation scenarios. The framework architecture leverages the proposed CMM for improved realism of human behaviour and, based on pre-set operational rules, uses a single-objective method to generate prescriptive analytics of effective strategy options. We evaluated our work through two case studies: a classroom evacuation and a theatre evacuation. In essence, this interdisciplinary work produced an effective simulation framework for crisis management, focused on pedestrians evacuating in emergencies, to derive optimal strategies for authorities carrying out rescue and evacuation operations. To the best of our knowledge, this thesis is the first to explore learning a pedestrian behavioural model from real-life video and reflecting realistic dynamic behaviours directly in a simulation model, with the aim of producing robust recommendations for strategies to adopt in emergency evacuation operations.

URI: https://scholarbank.nus.edu.sg/handle/10635/215058
Appears in Collections: Ph.D Theses (Open)
Files in This Item:
Thesis_FINAL.pdf (27.68 MB, Adobe PDF)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.