Please use this identifier to cite or link to this item: https://doi.org/10.1016/j.patcog.2011.03.008
Title: Active learning with adaptive regularization
Authors: Wang, Z.
Yan, S. 
Zhang, C.
Keywords: Active learning
Adaptive regularization
SVM
TSVM
Issue Date: Oct-2011
Citation: Wang, Z., Yan, S., Zhang, C. (2011). Active learning with adaptive regularization. Pattern Recognition, 44(10-11), 2375-2383. ScholarBank@NUS Repository. https://doi.org/10.1016/j.patcog.2011.03.008
Abstract: In classification problems, active learning is often adopted to alleviate the laborious human labeling effort by finding the most informative samples for which to query labels. One of the most popular query strategies is to select the samples about which the current classifier is most uncertain. The performance of such an active learning process relies heavily on the classifier learned before each query. Stepwise classifier model/parameter selection is therefore quite critical, yet it is rarely studied in the literature. In this paper, we propose a novel active learning support vector machine algorithm with adaptive model selection. In this algorithm, before each new query, we trace the full solution path of the base classifier and then perform efficient model selection using the unlabeled samples. This strategy significantly improves active learning efficiency at comparatively inexpensive computational cost. Empirical results on both artificial and real-world benchmark data sets show the encouraging gains brought by the proposed algorithm in terms of both classification accuracy and computational cost. © 2011 Elsevier Ltd. All rights reserved.
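Illustration: the following Python sketch shows the general query loop the abstract describes, i.e. pool-based uncertainty sampling with a linear SVM in which the regularization parameter C is re-selected before every query. It is not the paper's algorithm: the function names (select_C, uncertainty_sampling), the use of scikit-learn, and the choice of plain cross-validation on the labeled set are illustrative assumptions, whereas the paper traces the full SVM solution path and performs model selection with the unlabeled samples.

    # Hypothetical sketch: uncertainty-sampling active learning with per-query
    # re-selection of the SVM regularization parameter C. A stand-in for the
    # paper's solution-path method, not a reproduction of it.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def select_C(X_lab, y_lab, grid=(0.01, 0.1, 1.0, 10.0, 100.0)):
        """Pick C from a small grid by cross-validated accuracy on the labeled set."""
        scores = [cross_val_score(SVC(kernel="linear", C=C), X_lab, y_lab, cv=3).mean()
                  for C in grid]
        return grid[int(np.argmax(scores))]

    def uncertainty_sampling(X_pool, y_pool, labeled_idx, n_queries=20):
        """labeled_idx: indices of the initial seed set whose labels are known."""
        labeled = set(labeled_idx)
        for _ in range(n_queries):
            lab = sorted(labeled)
            C = select_C(X_pool[lab], y_pool[lab])            # adaptive regularization (stand-in)
            clf = SVC(kernel="linear", C=C).fit(X_pool[lab], y_pool[lab])
            unlab = [i for i in range(len(X_pool)) if i not in labeled]
            margins = np.abs(clf.decision_function(X_pool[unlab]))
            query = unlab[int(np.argmin(margins))]            # most uncertain sample
            labeled.add(query)                                # oracle reveals y_pool[query]
        # fit the final classifier on all labels gathered so far
        lab = sorted(labeled)
        C = select_C(X_pool[lab], y_pool[lab])
        return SVC(kernel="linear", C=C).fit(X_pool[lab], y_pool[lab]), lab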
Source Title: Pattern Recognition
URI: http://scholarbank.nus.edu.sg/handle/10635/54877
ISSN: 0031-3203
DOI: 10.1016/j.patcog.2011.03.008
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.
