Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/155292
Title: Certainty-Driven Consistency Loss for Semi-supervised Learning
Authors: Li, Yiting 
Liu, Lu
Tan, Robby T 
Keywords: cs.CV
Issue Date: 2019
Citation: Li, Yiting, Liu, Lu, Tan, Robby T (2019). Certainty-Driven Consistency Loss for Semi-supervised Learning. ScholarBank@NUS Repository.
Abstract: Recently proposed semi-supervised learning methods exploit a consistency loss between different predictions under random perturbations. Typically, a student model is trained to predict consistently with the targets generated by a noisy teacher. However, these methods ignore the fact that not all training data provide meaningful and reliable information in terms of consistency. For misclassified data, blindly minimizing the consistency loss around them can hinder learning. In this paper, we propose a novel certainty-driven consistency loss (CCL) to dynamically select data samples that have relatively low uncertainty. Specifically, we measure the variance or entropy of multiple predictions under random augmentations and dropout as an estimate of uncertainty. We then introduce two approaches, i.e., Filtering CCL and Temperature CCL, to guide the student to learn more meaningful and certain/reliable targets, and hence improve the quality of the gradients backpropagated to the student. Experiments demonstrate the advantages of the proposed method over state-of-the-art semi-supervised deep learning methods on three benchmark datasets: SVHN, CIFAR10, and CIFAR100. Our method also shows robustness to noisy labels.
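A minimal sketch of the idea described in the abstract, assuming a PyTorch-style setup: per-sample uncertainty is estimated as the entropy of the mean prediction over several stochastic forward passes (dropout kept active), and only low-uncertainty samples contribute to the consistency loss. The function names, the mean-squared-error consistency term, and the threshold value are illustrative assumptions, not the authors' implementation.

    # Illustrative sketch only; hypothetical names (teacher, student, threshold).
    import torch
    import torch.nn.functional as F

    def predictive_entropy(teacher, x, num_passes=5):
        """Uncertainty estimate: entropy of the mean prediction over
        multiple stochastic forward passes (dropout left active)."""
        teacher.train()  # keep dropout on for Monte Carlo sampling
        with torch.no_grad():
            probs = torch.stack(
                [F.softmax(teacher(x), dim=1) for _ in range(num_passes)]
            )                                                   # (K, B, C)
        mean_p = probs.mean(dim=0)                              # (B, C)
        entropy = -(mean_p * torch.log(mean_p + 1e-8)).sum(dim=1)  # (B,)
        return mean_p, entropy

    def filtering_consistency_loss(student_logits, teacher_probs, entropy,
                                   threshold=0.5):
        """Filtering variant: mask out high-uncertainty samples before
        computing the student-teacher consistency (MSE) loss."""
        mask = (entropy < threshold).float()                    # 1 for certain samples
        student_probs = F.softmax(student_logits, dim=1)
        per_sample = ((student_probs - teacher_probs) ** 2).mean(dim=1)
        return (mask * per_sample).sum() / mask.sum().clamp(min=1.0)

The Temperature CCL variant described in the abstract would instead soften or reweight the teacher targets as a function of uncertainty rather than hard-filtering them; the hard threshold above is only one of the two approaches.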
URI: https://scholarbank.nus.edu.sg/handle/10635/155292
Appears in Collections: Staff Publications
Elements

Files in This Item:
File: 1901.05657v1.pdf | Size: 1.71 MB | Format: Adobe PDF | Access Settings: OPEN | Version: Post-print