A Soft Nearest-Neighbor Framework for Continual Semi-Supervised Learning
Abstract
Despite significant advances, the performance of state-of-the-art continual learning approaches hinges on the unrealistic scenario of fully labeled data. In this paper, we tackle this challenge and propose an approach for continual semi-supervised learning, a setting where not all the data samples are labeled. A primary issue in this scenario is the model forgetting representations of unlabeled data and overfitting the labeled samples. We leverage the power of nearest-neighbor classifiers to nonlinearly partition the feature space and flexibly model the underlying data distribution thanks to their non-parametric nature. This enables the model to learn a strong representation for the current task, and to distill relevant information from previous tasks. We perform a thorough experimental evaluation and show that our method outperforms all the existing approaches by large margins, setting a solid state of the art on the continual semi-supervised learning paradigm. For example, on CIFAR-100 we surpass several others even when using at least 30 times less supervision (0.8% vs. 25% of annotations). Finally, our method works well on both low- and high-resolution images and scales seamlessly to more complex datasets such as ImageNet-100.
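The soft nearest-neighbor classification idea underlying the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the use of cosine similarity, and the temperature parameter are assumptions. A query feature is softly assigned to labeled support features via a temperature-scaled softmax, and class probabilities are obtained by summing the weights of same-class neighbors.

```python
import numpy as np

def soft_nn_predict(query, support_feats, support_labels, num_classes, tau=0.1):
    """Soft nearest-neighbor prediction (illustrative sketch).

    query: (D,) feature vector; support_feats: (N, D) labeled features;
    support_labels: (N,) integer class labels; tau: softmax temperature.
    Returns a (num_classes,) probability vector.
    """
    # Cosine similarity between the query and each labeled support feature
    q = query / np.linalg.norm(query)
    s = support_feats / np.linalg.norm(support_feats, axis=1, keepdims=True)
    sims = s @ q  # shape (N,)

    # Soft neighbor weights: temperature-scaled softmax over similarities.
    # Lower tau -> sharper weighting, approaching hard nearest-neighbor.
    w = np.exp(sims / tau)
    w /= w.sum()

    # Class probability = total soft weight carried by neighbors of that class
    probs = np.zeros(num_classes)
    for c in range(num_classes):
        probs[c] = w[support_labels == c].sum()
    return probs

# Toy usage: two support points of class 0 near the query, one of class 1 far away
support = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]])
labels = np.array([0, 0, 1])
p = soft_nn_predict(np.array([1.0, 0.05]), support, labels, num_classes=2)
```

Because the classifier is non-parametric, updating it for a new task only requires updating the support set, which is what makes it attractive for modeling shifting, partially labeled data distributions.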
Cite
Text
Kang et al. "A Soft Nearest-Neighbor Framework for Continual Semi-Supervised Learning." International Conference on Computer Vision, 2023. doi:10.1109/ICCV51070.2023.01090
Markdown
[Kang et al. "A Soft Nearest-Neighbor Framework for Continual Semi-Supervised Learning." International Conference on Computer Vision, 2023.](https://mlanthology.org/iccv/2023/kang2023iccv-soft/) doi:10.1109/ICCV51070.2023.01090
BibTeX
@inproceedings{kang2023iccv-soft,
title = {{A Soft Nearest-Neighbor Framework for Continual Semi-Supervised Learning}},
author = {Kang, Zhiqi and Fini, Enrico and Nabi, Moin and Ricci, Elisa and Alahari, Karteek},
booktitle = {International Conference on Computer Vision},
year = {2023},
pages = {11868--11877},
doi = {10.1109/ICCV51070.2023.01090},
url = {https://mlanthology.org/iccv/2023/kang2023iccv-soft/}
}