Density Map Distillation for Incremental Object Counting

Abstract

We investigate the problem of incremental learning for object counting, where a method must learn to count a variety of object classes from a sequence of datasets. A naïve approach to incremental object counting suffers from catastrophic forgetting: a dramatic performance drop on previous tasks. In this paper, we propose a new exemplar-free functional regularization method, called Density Map Distillation (DMD). During training, we introduce a new counter head for each task and a distillation loss to prevent forgetting of previous tasks. Additionally, we introduce a cross-task adaptor that projects the features of the current backbone to the previous backbone. This projector allows new features to be learned while the backbone retains the features relevant to previous tasks. Finally, we set up incremental-learning experiments for counting new objects. Results confirm that our method greatly reduces catastrophic forgetting and outperforms existing methods.
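To make the mechanism concrete, here is a minimal numpy sketch of the training objective the abstract describes: a frozen previous-task counter head is applied both to saved-style old-backbone features and to current-backbone features passed through a cross-task adaptor, and the discrepancy between the two density maps serves as the distillation term. All names, shapes, the 1×1-conv-style heads, and the loss weighting are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: feature maps are (C, H, W), density maps are (H, W).
C, H, W = 8, 16, 16

def counter_head(features, weights):
    """Toy 1x1-conv counter head: mixes C channels into one density map."""
    return np.tensordot(weights, features, axes=([0], [0]))  # -> (H, W)

def adaptor(features, proj):
    """Cross-task adaptor: channel-wise projection of current-backbone
    features into the previous backbone's feature space."""
    return np.tensordot(proj, features, axes=([1], [0]))  # -> (C, H, W)

def mse(a, b):
    return float(np.mean((a - b) ** 2))

# Frozen previous-task components (assumed kept from task t-1).
old_head_w = rng.normal(size=C)
old_features = rng.normal(size=(C, H, W))   # previous backbone's output
new_features = rng.normal(size=(C, H, W))   # current backbone's output
proj = np.eye(C)                            # adaptor, identity-initialised

# Distillation target: old head on old-backbone features.
target_density = counter_head(old_features, old_head_w)
# Distillation prediction: old head on adapted current features.
adapted = adaptor(new_features, proj)
distill_loss = mse(counter_head(adapted, old_head_w), target_density)

# Current-task counting loss for the new head against a ground-truth density map.
gt_density = np.abs(rng.normal(size=(H, W)))
new_head_w = rng.normal(size=C)
count_loss = mse(counter_head(new_features, new_head_w), gt_density)

# Total objective; the trade-off weight is a placeholder.
total_loss = count_loss + 1.0 * distill_loss
```

In an actual training loop the distillation term would backpropagate through both the adaptor and the current backbone, while the previous head and the saved targets stay frozen, which is what lets the backbone keep old-task features without stored exemplars.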

Cite

Text

Wu and van de Weijer. "Density Map Distillation for Incremental Object Counting." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2023. doi:10.1109/CVPRW59228.2023.00249

Markdown

[Wu and van de Weijer. "Density Map Distillation for Incremental Object Counting." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2023.](https://mlanthology.org/cvprw/2023/wu2023cvprw-density/) doi:10.1109/CVPRW59228.2023.00249

BibTeX

@inproceedings{wu2023cvprw-density,
  title     = {{Density Map Distillation for Incremental Object Counting}},
  author    = {Wu, Chenshen and van de Weijer, Joost},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
  year      = {2023},
  pages     = {2506--2515},
  doi       = {10.1109/CVPRW59228.2023.00249},
  url       = {https://mlanthology.org/cvprw/2023/wu2023cvprw-density/}
}