Boosting Multi-Label Image Classification with Complementary Parallel Self-Distillation
Abstract
Multi-Label Image Classification (MLIC) approaches usually exploit label correlations to achieve good performance. However, emphasizing correlations such as co-occurrence may overlook discriminative features and lead to model overfitting. In this study, we propose a generic framework named Parallel Self-Distillation (PSD) for boosting MLIC models. PSD decomposes the original MLIC task into several simpler MLIC sub-tasks via two elaborated complementary task decomposition strategies named Co-occurrence Graph Partition (CGP) and Dis-occurrence Graph Partition (DGP). Then, MLIC models covering fewer categories are trained on these sub-tasks in parallel to learn the joint patterns and the category-specific patterns of labels, respectively. Finally, knowledge distillation is leveraged to learn a compact global ensemble over the full set of categories from these learned patterns, reconciling label correlation exploitation with model overfitting. Extensive results on the MS-COCO and NUS-WIDE datasets demonstrate that our framework can be easily plugged into many MLIC approaches and improves the performance of recent state-of-the-art methods. The source code is released at https://github.com/Robbie-Xu/CPSD.
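The pipeline sketched in the abstract (partition the label set, run per-subset teachers, distill their soft predictions into one full-category student) can be illustrated with a minimal toy sketch. This is not the paper's implementation: `partition_labels` is a naive stand-in for the CGP/DGP graph-partition strategies, and the binary cross-entropy distillation loss is one common choice for multi-label distillation, assumed here for illustration.

```python
import math

def partition_labels(num_labels, num_groups):
    """Toy stand-in for CGP/DGP: split label indices into roughly equal
    disjoint groups. The paper instead partitions a label co-occurrence
    (or dis-occurrence) graph, which this sketch does not reproduce."""
    size = math.ceil(num_labels / num_groups)
    idx = list(range(num_labels))
    return [idx[i:i + size] for i in range(0, num_labels, size)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def assemble_soft_targets(teacher_logits, groups, num_labels):
    """Merge each sub-task teacher's logits back into one full-length
    vector of soft targets for the global student."""
    targets = [0.0] * num_labels
    for logits, group in zip(teacher_logits, groups):
        for logit, label in zip(logits, group):
            targets[label] = sigmoid(logit)
    return targets

def distill_loss(student_logits, soft_targets, eps=1e-8):
    """Binary cross-entropy of student predictions against the teachers'
    soft targets; lower when the student matches the parallel teachers."""
    total = 0.0
    for z, t in zip(student_logits, soft_targets):
        p = sigmoid(z)
        total += -(t * math.log(p + eps) + (1.0 - t) * math.log(1.0 - p + eps))
    return total / len(student_logits)
```

For example, with six labels split into two sub-tasks, a student whose logits agree with the teachers incurs a lower distillation loss than one whose logits disagree, which is the signal that transfers the sub-task patterns into the compact global model.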
Cite
Text
Xu et al. "Boosting Multi-Label Image Classification with Complementary Parallel Self-Distillation." International Joint Conference on Artificial Intelligence, 2022. doi:10.24963/IJCAI.2022/208

Markdown

[Xu et al. "Boosting Multi-Label Image Classification with Complementary Parallel Self-Distillation." International Joint Conference on Artificial Intelligence, 2022.](https://mlanthology.org/ijcai/2022/xu2022ijcai-boosting/) doi:10.24963/IJCAI.2022/208

BibTeX
@inproceedings{xu2022ijcai-boosting,
title = {{Boosting Multi-Label Image Classification with Complementary Parallel Self-Distillation}},
author = {Xu, Jiazhi and Huang, Sheng and Zhou, Fengtao and Huangfu, Luwen and Zeng, Daniel and Liu, Bo},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2022},
pages = {1495-1501},
doi = {10.24963/IJCAI.2022/208},
url = {https://mlanthology.org/ijcai/2022/xu2022ijcai-boosting/}
}