Semi-Supervised Active Learning with Cross-Class Sample Transfer
Abstract
To reduce the labeling effort for training a classification model, we can simultaneously adopt Active Learning (AL) to select the most informative samples for human labeling, and Semi-supervised Learning (SSL) to construct effective classifiers using a few labeled samples and a large number of unlabeled samples. Recently, using Transfer Learning (TL) to enhance AL and SSL, i.e., T-SS-AL, has gained considerable attention. However, existing T-SS-AL methods mostly focus on the situation where the source domain and the target domain share the same classes. In this paper, we consider a more practical and challenging setting where the source domain and the target domain have different but related classes. We propose a novel cross-class sample transfer based T-SS-AL method, called CC-SS-AL, to exploit the information from the source domain. Our key idea is to select samples from the source domain which are very similar to the target domain classes and assign pseudo labels to them for classifier training. Extensive experiments on three datasets verify the efficacy of the proposed method.
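The core idea in the abstract, selecting source-domain samples that closely resemble the target classes and assigning them pseudo labels, can be illustrated with a minimal sketch. This is not the paper's exact formulation: the function name, the use of class-mean prototypes, cosine similarity, and the similarity threshold are all illustrative assumptions.

```python
import numpy as np

def transfer_pseudo_label(source_X, target_X, target_y, threshold=0.8):
    """Illustrative sketch of cross-class sample transfer: keep source
    samples whose cosine similarity to a target class prototype exceeds
    a threshold, and give them that class as a pseudo label.
    (Assumed details, not the CC-SS-AL algorithm itself.)"""
    classes = np.unique(target_y)
    # Class prototypes: mean feature vector of the labeled target samples.
    prototypes = np.stack(
        [target_X[target_y == c].mean(axis=0) for c in classes]
    )
    # Cosine similarity between every source sample and every prototype.
    s_norm = source_X / np.linalg.norm(source_X, axis=1, keepdims=True)
    p_norm = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    sim = s_norm @ p_norm.T              # shape: (n_source, n_classes)
    best = sim.argmax(axis=1)            # most similar target class
    keep = sim.max(axis=1) >= threshold  # only highly similar samples
    return source_X[keep], classes[best[keep]]
```

The selected samples and their pseudo labels could then be pooled with the labeled target data for classifier training, while AL continues to query human labels for the genuinely informative target samples.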
Cite
Text
Guo et al. "Semi-Supervised Active Learning with Cross-Class Sample Transfer." International Joint Conference on Artificial Intelligence, 2016.
Markdown
[Guo et al. "Semi-Supervised Active Learning with Cross-Class Sample Transfer." International Joint Conference on Artificial Intelligence, 2016.](https://mlanthology.org/ijcai/2016/guo2016ijcai-semi/)
BibTeX
@inproceedings{guo2016ijcai-semi,
title = {{Semi-Supervised Active Learning with Cross-Class Sample Transfer}},
author = {Guo, Yuchen and Ding, Guiguang and Gao, Yue and Wang, Jianmin},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2016},
pages = {1526--1532},
url = {https://mlanthology.org/ijcai/2016/guo2016ijcai-semi/}
}