Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/239056
Title: APPLICATIONS OF DOMAIN DIVERGENCES FOR DOMAIN ADAPTATION IN NLP
Authors: ABHINAV RAMESH KASHYAP
ORCID iD: orcid.org/0000-0002-0505-4521
Keywords: NLP; domain adaptation; divergence measures; parameter-efficiency; style transfer; robustness
Issue Date: 23-Dec-2022
Citation: ABHINAV RAMESH KASHYAP (2022-12-23). APPLICATIONS OF DOMAIN DIVERGENCES FOR DOMAIN ADAPTATION IN NLP. ScholarBank@NUS Repository.
Abstract: The robustness of machine learning models matters for deployment because models may not perform well when the input distribution differs from the one they were trained on. Domain divergence is a mathematical tool for quantifying the difference between two input distributions, and understanding and applying it can make machine learning models more useful. This thesis explores applications of divergence measures, particularly in Natural Language Processing (NLP). It first provides a taxonomy of divergence measures, and then makes contributions in three areas: predicting the drop in model performance under new distributions, aligning source- and target-domain representations for novel applications, and understanding the robustness of models across domains. The thesis also proposes a parameter-efficient method for applying divergence measures to domain adaptation in NLP, and concludes with limitations and avenues for future research.
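To make the notion of domain divergence concrete, the following sketch (an illustration only, not taken from the thesis) computes the Jensen-Shannon divergence, one common divergence measure, between the unigram word distributions of two toy text domains; the domain samples and helper names are hypothetical.

import numpy as np
from collections import Counter

def unigram_dist(tokens, vocab):
    # Unigram probability distribution over a fixed vocabulary,
    # with add-one smoothing so no word has zero probability.
    counts = Counter(tokens)
    freqs = np.array([counts[w] + 1 for w in vocab], dtype=float)
    return freqs / freqs.sum()

def js_divergence(p, q):
    # Jensen-Shannon divergence (in bits) between two discrete
    # distributions: the average KL divergence of each to their midpoint.
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log2(a / b))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Hypothetical samples from a source domain (clinical text)
# and a target domain (movie reviews).
source = "the patient was administered the prescribed dose".split()
target = "the film was a delight from start to finish".split()
vocab = sorted(set(source) | set(target))

p = unigram_dist(source, vocab)
q = unigram_dist(target, vocab)
print(f"JS divergence between domains: {js_divergence(p, q):.4f}")

A value near 0 indicates the two domains use words similarly; values approaching 1 indicate highly dissimilar vocabularies, which is one signal that a model trained on the source domain may degrade on the target domain.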
URI: https://scholarbank.nus.edu.sg/handle/10635/239056
Appears in Collections:Ph.D Theses (Open)

Files in This Item:
File: AbhinavRameshKashyap.pdf
Size: 27.76 MB
Format: Adobe PDF
Access Settings: Open

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.