Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/162736
Title: LABEL EFFICIENT LEARNING BEYOND MANUAL ANNOTATIONS
Authors: LI JUNNAN
Keywords: image classification, label noise learning, unsupervised learning, action recognition, transfer learning, domain adaptation
Issue Date: 28-Jun-2019
Citation: LI JUNNAN (2019-06-28). LABEL EFFICIENT LEARNING BEYOND MANUAL ANNOTATIONS. ScholarBank@NUS Repository.
Abstract: The supervised learning paradigm, paired with the capacity of Deep Neural Networks (DNNs), has significantly advanced the state-of-the-art performance for many computer vision tasks. However, an enormous amount of labeled data is required for supervised learning to achieve good generalization performance. Curating such large-scale datasets is extremely expensive, labor-intensive, and time-consuming. Furthermore, certain domains such as medical imaging are inherently data-sparse. Therefore, it is important to study label-efficient learning paradigms for training deep networks. In this thesis, we contribute to three major directions that alleviate the dependence on manual annotations. First, we present an unsupervised method which leverages unlabeled RGB-D videos to learn action representations. Then, we propose a meta-learning based noise-tolerant training method to learn from data with noisy labels. We also study transfer learning, where we address the domain shift problem by transferring the knowledge embodied in attention maps. Finally, we propose a gradient-based method for multi-source cross-domain transfer.
URI: https://scholarbank.nus.edu.sg/handle/10635/162736
Appears in Collections: Ph.D Theses (Open)

Files in This Item:
File: thesis.pdf
Size: 8.84 MB
Format: Adobe PDF
Access Settings: OPEN

