On the Soft-Subnetwork for Few-Shot Class Incremental Learning

Abstract

Inspired by the Regularized Lottery Ticket Hypothesis, which states that competitive smooth (non-binary) subnetworks exist within a dense network, we propose a few-shot class-incremental learning method referred to as Soft-SubNetworks (SoftNet). Our objective is to learn a sequence of sessions incrementally, where each session includes only a few training instances per class, while preserving the knowledge of previously learned sessions. SoftNet jointly learns the model weights and adaptive non-binary soft masks at the base training session, where each mask consists of a major and a minor subnetwork; the former aims to minimize catastrophic forgetting during training, and the latter aims to avoid overfitting to the few samples available in each new training session. We provide comprehensive empirical validation demonstrating that SoftNet effectively tackles the few-shot incremental learning problem, surpassing state-of-the-art baselines on benchmark datasets.
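The abstract's central construct, a soft (non-binary) mask split into a major and a minor subnetwork, can be illustrated with a minimal sketch. Assuming the major subnetwork is selected as the top-c fraction of weights by magnitude (receiving mask value 1) and the minor subnetwork receives random values in (0, 1), which is an illustrative simplification rather than the paper's exact formulation, the idea looks roughly like this:

```python
# Hedged sketch of a soft subnetwork mask: top-c weights by magnitude form
# the "major" subnetwork (mask = 1); the rest form the "minor" subnetwork
# and receive random non-binary mask values. The sparsity level `c` and the
# uniform sampling are illustrative assumptions.
import torch

def soft_mask(weight: torch.Tensor, c: float = 0.5) -> torch.Tensor:
    """Return a soft (non-binary) mask keeping the top-c fraction of
    entries by magnitude as the major (binary) subnetwork."""
    k = max(1, int(c * weight.numel()))
    threshold = weight.abs().flatten().topk(k).values.min()
    major = (weight.abs() >= threshold).float()      # binary part: mask = 1
    minor = torch.rand_like(weight) * (1.0 - major)  # soft part: U(0, 1)
    return major + minor

w = torch.randn(4, 4)
m = soft_mask(w, c=0.5)
masked_w = w * m  # a forward pass would use the masked weights
```

In spirit, the binary major part protects consolidated knowledge from being overwritten across sessions, while the soft minor part leaves regularized capacity for adapting to the few new samples.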

Cite

Text

Kang et al. "On the Soft-Subnetwork for Few-Shot Class Incremental Learning." International Conference on Learning Representations, 2023.

Markdown

[Kang et al. "On the Soft-Subnetwork for Few-Shot Class Incremental Learning." International Conference on Learning Representations, 2023.](https://mlanthology.org/iclr/2023/kang2023iclr-softsubnetwork/)

BibTeX

@inproceedings{kang2023iclr-softsubnetwork,
  title     = {{On the Soft-Subnetwork for Few-Shot Class Incremental Learning}},
  author    = {Kang, Haeyong and Yoon, Jaehong and Madjid, Sultan Rizky Hikmawan and Hwang, Sung Ju and Yoo, Chang D.},
  booktitle = {International Conference on Learning Representations},
  year      = {2023},
  url       = {https://mlanthology.org/iclr/2023/kang2023iclr-softsubnetwork/}
}