Leveraging Topological Guidance for Improved Knowledge Distillation
Abstract
Deep learning has proven proficient at extracting useful features for a wide range of computer vision tasks. However, when the structure of the data is complex and noisy, capturing information that actually improves performance is difficult. To this end, topological data analysis (TDA) has been used to provide features that improve both performance and robustness to perturbations. Despite its effectiveness, the large computational resources and long processing times required to extract topological features make TDA impractical on small devices. To address this issue, we propose Topological Guidance-based Knowledge Distillation (TGD), a framework that uses topological features in knowledge distillation (KD) for image classification. We employ KD to train a superior lightweight model, with multiple teachers simultaneously supplying topological features. We introduce a mechanism that integrates features from the different teachers and reduces the knowledge gap between teachers and the student, thereby improving performance. We demonstrate the effectiveness of our approach through diverse empirical evaluations.
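The abstract references two technical components that a short sketch can make concrete. First, topological features are commonly extracted from an image by building a cubical complex and computing persistent homology; the abstract does not specify the paper's exact TDA pipeline, so the snippet below, using the GUDHI library, is only an illustrative assumption:

```python
# Illustrative sketch of TDA feature extraction from a grayscale image
# via a cubical complex, assuming the GUDHI library. The paper's actual
# pipeline may differ.
import numpy as np
import gudhi

image = np.random.rand(28, 28)  # stand-in for a grayscale input image
cc = gudhi.CubicalComplex(top_dimensional_cells=image)
cc.persistence()  # persistent homology of the sublevel-set filtration
h1_bars = cc.persistence_intervals_in_dimension(1)  # (birth, death) pairs for loops
```

Second, the multi-teacher distillation step can be sketched as a weighted combination of a supervised loss and softened KL terms toward each teacher (one trained on raw images, one on topological features). The function name, temperature, and weights below are assumptions, not the paper's TGD formulation:

```python
import torch.nn.functional as F

def multi_teacher_kd_loss(student_logits, image_teacher_logits,
                          topo_teacher_logits, labels,
                          T=4.0, alpha=0.5, beta=0.25):
    """Cross-entropy plus softened KL divergence toward two teachers.

    Hypothetical helper: the weighting scheme and temperature are
    illustrative, not taken from the paper.
    """
    ce = F.cross_entropy(student_logits, labels)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    # T**2 rescales the soft-target gradients, as in standard KD
    kl_image = F.kl_div(log_p_student,
                        F.softmax(image_teacher_logits / T, dim=1),
                        reduction="batchmean") * T * T
    kl_topo = F.kl_div(log_p_student,
                       F.softmax(topo_teacher_logits / T, dim=1),
                       reduction="batchmean") * T * T
    return (1 - alpha - beta) * ce + alpha * kl_image + beta * kl_topo
```

If the teachers are used only during training, the student avoids the TDA computation entirely at inference time, which matches the small-device deployment motivation given in the abstract.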
Cite
Text
Jeon et al. "Leveraging Topological Guidance for Improved Knowledge Distillation." ICML 2024 Workshops: GRaM, 2024.
Markdown
[Jeon et al. "Leveraging Topological Guidance for Improved Knowledge Distillation." ICML 2024 Workshops: GRaM, 2024.](https://mlanthology.org/icmlw/2024/jeon2024icmlw-leveraging/)
BibTeX
@inproceedings{jeon2024icmlw-leveraging,
  title = {{Leveraging Topological Guidance for Improved Knowledge Distillation}},
  author = {Jeon, Eun Som and Khurana, Rahul and Pathak, Aishani and Turaga, Pavan K.},
  booktitle = {ICML 2024 Workshops: GRaM},
  year = {2024},
  url = {https://mlanthology.org/icmlw/2024/jeon2024icmlw-leveraging/}
}