Meta Module Generation for Fast Few-Shot Incremental Learning

Abstract

There are two challenging problems in applying standard Deep Neural Networks (DNNs) to incremental learning with a few examples: (i) DNNs do not perform well when little training data is available; (ii) DNNs suffer from catastrophic forgetting when used for incremental class learning. To address both problems simultaneously, we propose Meta Module Generation (MetaMG), a meta-learning method that enables a module generator to rapidly generate a category module from a few examples so that a scalable classification network can recognize a new category. Old categories are not forgotten after new categories are added. Comprehensive experiments conducted on four datasets show that our method is promising for fast incremental learning in the few-shot setting. Further experiments on the miniImageNet dataset show that even though it is not specifically designed for the N-way K-shot learning problem, MetaMG can still perform relatively well, especially in the 20-way K-shot setting.
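The abstract's core idea can be illustrated with a minimal sketch: a generator maps a handful of support embeddings for a new category to that category's module (here reduced to a single linear classifier weight vector), and the classification network grows by appending modules without touching existing ones, so old categories are not overwritten. All names, the fixed random projection standing in for the meta-learned generator, and the one-vector-per-module simplification are assumptions for illustration, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
EMBED_DIM = 64  # assumed feature dimension of a pre-trained embedding network

# Hypothetical module generator: in MetaMG this mapping is meta-learned;
# a fixed random projection stands in for it in this sketch.
GEN_W = rng.standard_normal((EMBED_DIM, EMBED_DIM)) / np.sqrt(EMBED_DIM)

def generate_module(support_embeddings):
    """Generate a category module from a few support-example embeddings."""
    prototype = support_embeddings.mean(axis=0)  # average the few examples
    return GEN_W @ prototype                     # weight vector of the new module

class ScalableClassifier:
    """Classifier that grows by appending per-category modules.

    Existing modules are never modified when a category is added,
    so previously learned categories are not forgotten.
    """
    def __init__(self):
        self.modules = []  # one generated weight vector per category

    def add_category(self, support_embeddings):
        self.modules.append(generate_module(support_embeddings))

    def predict(self, embedding):
        # Score the query against every category module; highest score wins.
        scores = [w @ embedding for w in self.modules]
        return int(np.argmax(scores))

# Incrementally add two categories from 5 support examples each (5-shot).
clf = ScalableClassifier()
cat_a = rng.standard_normal((5, EMBED_DIM)) + 3.0
cat_b = rng.standard_normal((5, EMBED_DIM)) - 3.0
clf.add_category(cat_a)
clf.add_category(cat_b)
```

Adding a third category later only appends a module; the weight vectors of the first two categories are left byte-for-byte intact, which is the structural reason this style of incremental learning avoids catastrophic forgetting.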

Cite

Text

Xie et al. "Meta Module Generation for Fast Few-Shot Incremental Learning." IEEE/CVF International Conference on Computer Vision Workshops, 2019. doi:10.1109/ICCVW.2019.00174

Markdown

[Xie et al. "Meta Module Generation for Fast Few-Shot Incremental Learning." IEEE/CVF International Conference on Computer Vision Workshops, 2019.](https://mlanthology.org/iccvw/2019/xie2019iccvw-meta/) doi:10.1109/ICCVW.2019.00174

BibTeX

@inproceedings{xie2019iccvw-meta,
  title     = {{Meta Module Generation for Fast Few-Shot Incremental Learning}},
  author    = {Xie, Shudong and Li, Yiqun and Lin, Dongyun and Nwe, Tin Lay and Dong, Sheng},
  booktitle = {IEEE/CVF International Conference on Computer Vision Workshops},
  year      = {2019},
  pages     = {1381--1390},
  doi       = {10.1109/ICCVW.2019.00174},
  url       = {https://mlanthology.org/iccvw/2019/xie2019iccvw-meta/}
}