Knowledge Restore and Transfer for Multi-Label Class-Incremental Learning
Abstract
Current class-incremental learning research mainly focuses on single-label classification, while multi-label class-incremental learning (MLCIL), which covers more practical application scenarios, is rarely studied. Although many anti-forgetting methods address catastrophic forgetting in single-label class-incremental learning, they struggle with MLCIL due to the label-absence and information-dilution problems. To solve these problems, we propose a Knowledge Restore and Transfer (KRT) framework consisting of a dynamic pseudo-label (DPL) module, which solves the label-absence problem by restoring old-class knowledge to the new data, and an incremental cross-attention (ICA) module, which solves the information-dilution problem with session-specific knowledge retention tokens that store knowledge and a unified knowledge transfer token that transfers it. Comprehensive experimental results on the MS-COCO and PASCAL VOC datasets demonstrate the effectiveness of our method in improving recognition performance and mitigating forgetting on multi-label class-incremental learning tasks.
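
The two modules described in the abstract lend themselves to a compact sketch. The PyTorch code below is a minimal illustration, not the authors' implementation: all names, tensor shapes, and the fixed pseudo-label threshold are assumptions (the paper's DPL uses a dynamic threshold, and the exact ICA wiring may differ).

```python
# Minimal sketch of DPL-style pseudo-labeling and an ICA-style module.
# Names, shapes, and the fixed threshold are illustrative assumptions.
import torch
import torch.nn as nn


def dynamic_pseudo_labels(old_model, images, old_num_classes, threshold=0.5):
    """Restore old-class knowledge on new-session data (DPL, simplified).

    The frozen model from the previous session scores the new images; old
    classes whose sigmoid score exceeds a cutoff become pseudo-positive
    labels. The paper adapts this cutoff dynamically; a fixed ``threshold``
    is used here purely for illustration.
    """
    old_model.eval()
    with torch.no_grad():
        logits = old_model(images)[:, :old_num_classes]  # assumed (B, C_old) logits
        return (logits.sigmoid() > threshold).float()    # pseudo-label matrix


class IncrementalCrossAttention(nn.Module):
    """Sketch of the ICA idea: per-session knowledge retention tokens plus a
    single shared knowledge transfer token, cross-attending over patch features."""

    def __init__(self, dim=768, num_heads=8):
        super().__init__()
        self.dim = dim
        self.retention_tokens = nn.ParameterList()  # one token per past session
        self.transfer_token = nn.Parameter(torch.zeros(1, 1, dim))
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def add_session(self):
        # Freeze earlier sessions' tokens so their stored knowledge is kept,
        # then allocate a fresh, trainable token for the new session.
        for tok in self.retention_tokens:
            tok.requires_grad_(False)
        self.retention_tokens.append(nn.Parameter(torch.zeros(1, 1, self.dim)))

    def forward(self, patch_feats):
        # patch_feats: (B, N, dim) image token features from the backbone.
        B = patch_feats.size(0)
        queries = torch.cat(
            [self.transfer_token.expand(B, -1, -1)]
            + [t.expand(B, -1, -1) for t in self.retention_tokens], dim=1)
        out, _ = self.attn(queries, patch_feats, patch_feats)
        return out[:, 0]  # transfer-token output feeds the classifier head
```

In this sketch, each incremental session would call `add_session()` once and then train only the new retention token and the shared transfer token, so that knowledge from earlier sessions stays fixed while the transfer token aggregates it for classification.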
Cite
Text
Dong et al. "Knowledge Restore and Transfer for Multi-Label Class-Incremental Learning." International Conference on Computer Vision, 2023. doi:10.1109/ICCV51070.2023.01715
Markdown
[Dong et al. "Knowledge Restore and Transfer for Multi-Label Class-Incremental Learning." International Conference on Computer Vision, 2023.](https://mlanthology.org/iccv/2023/dong2023iccv-knowledge/) doi:10.1109/ICCV51070.2023.01715
BibTeX
@inproceedings{dong2023iccv-knowledge,
title = {{Knowledge Restore and Transfer for Multi-Label Class-Incremental Learning}},
author = {Dong, Songlin and Luo, Haoyu and He, Yuhang and Wei, Xing and Cheng, Jie and Gong, Yihong},
booktitle = {International Conference on Computer Vision},
year = {2023},
pages = {18711--18720},
doi = {10.1109/ICCV51070.2023.01715},
url = {https://mlanthology.org/iccv/2023/dong2023iccv-knowledge/}
}