Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/233976
Title: MACHINE LEARNING FOR CONSTITUTIVE MODELLING
Authors: DENG HAOXIANG
ORCID iD: orcid.org/0000-0003-0344-008X
Keywords: Recurrent neural network, error correction method, knowledge transfer, computational homogenization, multi-scale modelling, history-dependent behavior
Issue Date: 12-Jul-2022
Citation: DENG HAOXIANG (2022-07-12). MACHINE LEARNING FOR CONSTITUTIVE MODELLING. ScholarBank@NUS Repository.
Abstract: Machine learning (ML) techniques have been applied as surrogate models at the micro scale of multi-scale analysis to reduce computational cost. These surrogate models provide the constitutive law for the micro-scale simulations after a training process, called off-line learning. For history-dependent materials, the recurrent neural network (RNN) is adopted to learn path-dependent mechanical responses, owing to its ability to capture history-dependent behavior. However, the large data-generation time required to train an RNN surrogate model limits the efficiency of off-line learning. This thesis aims to address this limitation through two sequential training strategies. (i) An Error Correction (EC) strategy, in which a reference surrogate model is first trained on limited data and then refined by a second model trained on the error correction. (ii) A Knowledge Transfer (KT) strategy, in which a source surrogate model is first trained on data from a simplified representative volume element (RVE) and then refined with target data generated from detailed RVEs. The thesis shows that the two proposed strategies, EC and KT, improve off-line learning efficiency.
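The following is a minimal, hypothetical sketch (not taken from the thesis) of the two sequential training ideas the abstract describes, written in PyTorch. The SurrogateRNN class, the fit helper, the tensor shapes, and all hyper-parameters are illustrative assumptions, and random tensors stand in for the strain-stress sequences that would come from RVE simulations.

import torch
import torch.nn as nn

class SurrogateRNN(nn.Module):
    """GRU-based surrogate mapping a strain path to a stress path (hypothetical)."""
    def __init__(self, n_in=6, n_hidden=64, n_out=6):
        super().__init__()
        self.rnn = nn.GRU(n_in, n_hidden, batch_first=True)
        self.head = nn.Linear(n_hidden, n_out)

    def forward(self, strain_path):           # (batch, steps, n_in)
        h, _ = self.rnn(strain_path)
        return self.head(h)                   # (batch, steps, n_out)

def fit(model, x, y, epochs=100, lr=1e-3):
    """Plain MSE training loop; stands in for the off-line learning stage."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    return model

# Synthetic stand-ins for micro-scale data (random here, RVE simulations in practice).
x_small, y_small = torch.randn(32, 50, 6), torch.randn(32, 50, 6)   # limited reference data
x_corr,  y_corr  = torch.randn(32, 50, 6), torch.randn(32, 50, 6)   # data for the correction stage
x_src,   y_src   = torch.randn(64, 50, 6), torch.randn(64, 50, 6)   # simplified-RVE (source) data
x_tgt,   y_tgt   = torch.randn(16, 50, 6), torch.randn(16, 50, 6)   # detailed-RVE (target) data

# (i) Error Correction: train a reference model on limited data, then a second model
#     on the residual error; the final prediction is the sum of the two models.
ref = fit(SurrogateRNN(), x_small, y_small)
with torch.no_grad():
    residual = y_corr - ref(x_corr)
corr = fit(SurrogateRNN(), x_corr, residual)
ec_prediction = ref(x_tgt) + corr(x_tgt)

# (ii) Knowledge Transfer: pre-train on cheap simplified-RVE data, then fine-tune the
#      same network on a smaller set of detailed-RVE data.
kt = fit(SurrogateRNN(), x_src, y_src)                  # source training
kt = fit(kt, x_tgt, y_tgt, epochs=50, lr=1e-4)          # refinement on target data
kt_prediction = kt(x_tgt)

In this sketch the efficiency gain comes from the same place in both strategies: most of the training signal is drawn from data that are cheap to generate (limited reference paths, or simplified-RVE simulations), and only a smaller amount of expensive detailed-RVE data is needed for the refinement stage.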
URI: https://scholarbank.nus.edu.sg/handle/10635/233976
Appears in Collections:Master's Theses (Open)

Files in This Item:
File: DengHX.pdf
Size: 7.17 MB
Format: Adobe PDF
Access Settings: OPEN

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.