Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/218203
Title: CROSS-LINGUAL LANGUAGE MODELING: METHODS AND APPLICATIONS
Authors: GRANDEE LEE
ORCID iD: orcid.org/0000-0001-9603-5141
Keywords: Cross-lingual learning, code-switching, word embedding, language modeling
Issue Date: 8-Oct-2021
Citation: GRANDEE LEE (2021-10-08). CROSS-LINGUAL LANGUAGE MODELING: METHODS AND APPLICATIONS. ScholarBank@NUS Repository.
Abstract: Cross-lingual learning aims to connect the world's many languages by bringing monolingual systems into a shared multilingual space, through which a system can realize cross-lingual transfer in tasks such as zero-shot POS tagging, retrieval, or classification. It also enables applications such as the modeling of code-switching languages. These applications present different challenges: the available data is often small, and in the code-switching domain it is sparse. A more systematic challenge is the performance degradation associated with language distance and with differences in language structure and domain. This thesis seeks to address some of these questions. Firstly, we propose an information-theoretic framework for understanding how unsupervised cross-lingual learning works. Secondly, we delve into linguistically motivated data augmentation methods. Next, we propose a novel neural back-off scheme for the language model. Lastly, we validate our methods in downstream tasks such as speech recognition and synthesis.
URI: https://scholarbank.nus.edu.sg/handle/10635/218203
Appears in Collections: Ph.D. Theses (Open)
Files in This Item:
File: LeeGrandee.pdf | Size: 2.07 MB | Format: Adobe PDF | Access: Open (View/Download)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.