Title: Learning to Self-Train for Semi-Supervised Few-Shot Classification
Authors: Xinzhe Li 
Qianru Sun 
Yaoyao Liu
Qin Zhou
Shibao Zheng
Tat-Seng Chua 
Bernt Schiele
Keywords: Few-shot classification
deep neural network
learning to self-train
Issue Date: 8-Dec-2019
Citation: Xinzhe Li, Qianru Sun, Yaoyao Liu, Qin Zhou, Shibao Zheng, Tat-Seng Chua, Bernt Schiele (2019-12-08). Learning to Self-Train for Semi-Supervised Few-Shot Classification. NeurIPS 2019. ScholarBank@NUS Repository.
Abstract: Few-shot classification (FSC) is challenging due to the scarcity of labeled training data (e.g., only one labeled data point per class). Meta-learning has been shown to achieve promising results by learning to initialize a classification model for FSC. In this paper we propose a novel semi-supervised meta-learning method called learning to self-train (LST) that leverages unlabeled data and specifically meta-learns how to cherry-pick and label such unsupervised data to further improve performance. To this end, we train the LST model through a large number of semi-supervised few-shot tasks. On each task, we train a few-shot model to predict pseudo labels for unlabeled data, and then iterate the self-training steps on labeled and pseudo-labeled data, with each step followed by fine-tuning. We additionally learn a soft weighting network (SWN) to optimize the self-training weights of pseudo labels so that better ones can contribute more to gradient descent optimization. We evaluate our LST method on two ImageNet benchmarks for semi-supervised few-shot classification and achieve large improvements over the state-of-the-art method. Code is at
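The self-training loop described in the abstract (fit on labeled data, predict pseudo labels and weights for unlabeled data, then fine-tune on the weighted union) can be sketched as a toy 1-D logistic-regression example. This is only an illustration of the general recipe: the model, the confidence-based weights (standing in for the learned soft weighting network, SWN), and all function names are assumptions, not the authors' implementation.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_step(w, data, lr=0.5):
    """One gradient step on a weighted logistic loss over (x, y, weight) triples."""
    grad = 0.0
    for x, y, wt in data:
        grad += wt * (sigmoid(w * x) - y) * x
    return w - lr * grad / len(data)

def self_train(labeled, unlabeled, steps=50):
    w = 0.0
    # 1) Fit the few-shot model on the labeled support set.
    for _ in range(steps):
        w = train_step(w, [(x, y, 1.0) for x, y in labeled])
    # 2) Predict pseudo labels for the unlabeled data; a crude confidence
    #    score stands in for the soft weighting network of the paper.
    pseudo = []
    for x in unlabeled:
        p = sigmoid(w * x)
        y_hat = 1.0 if p >= 0.5 else 0.0
        conf = abs(p - 0.5) * 2.0  # in [0, 1]; higher = more confident
        pseudo.append((x, y_hat, conf))
    # 3) Fine-tune on labeled plus weighted pseudo-labeled data.
    for _ in range(steps):
        w = train_step(w, [(x, y, 1.0) for x, y in labeled] + pseudo)
    return w

# Two labeled points, three unlabeled points on a 1-D line.
w = self_train(labeled=[(-2.0, 0.0), (2.0, 1.0)], unlabeled=[-1.5, 1.5, 3.0])
```

In the actual method this inner loop runs inside meta-training over many few-shot tasks, and the SWN's weighting is meta-learned rather than hand-set as above.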
Source Title: NeurIPS 2019
arXiv: 1906.00562
Appears in Collections:Elements
Staff Publications

Files in This Item:
Learning to Self-Train for Semi-Supervised Few-Shot Classification.pdf (3.7 MB, Adobe PDF)



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.