UNIC: Universal Classification Models via Multi-Teacher Distillation
Abstract
Pretrained models have become a commodity and offer strong results on a broad range of tasks. In this work, we focus on classification and seek to learn a unique encoder able to draw from several complementary pretrained models. We aim at even stronger generalization across a variety of classification tasks. We propose to learn such an encoder via multi-teacher distillation. We first thoroughly analyze standard distillation when driven by multiple strong teachers with complementary strengths. Guided by this analysis, we gradually propose improvements to the basic distillation setup. Among those, we enrich the architecture of the encoder with a ladder of expendable projectors, which increases the impact of intermediate features during distillation, and we introduce teacher dropping, a regularization mechanism that better balances the teachers’ influence. Our final distillation strategy leads to student models of the same capacity as any of the teachers, while retaining or improving upon the performance of the best teacher for each task.
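Below is a minimal PyTorch sketch of the two ideas named in the abstract, multi-teacher distillation and teacher dropping: a student encoder is trained to match several frozen teachers through per-teacher projectors, and each teacher's loss term is randomly masked per step. All module names, dimensions, loss choices, and the drop probability are illustrative assumptions, not the paper's exact implementation.

```python
# Illustrative sketch of multi-teacher distillation with teacher dropping.
# Names, dimensions, and hyperparameters are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Projector(nn.Module):
    """Maps student features into one teacher's embedding space."""

    def __init__(self, student_dim, teacher_dim, hidden_dim=2048):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(student_dim, hidden_dim),
            nn.GELU(),
            nn.Linear(hidden_dim, teacher_dim),
        )

    def forward(self, x):
        return self.net(x)


def distill_step(student, teachers, projectors, images, optimizer, drop_p=0.5):
    """One optimization step distilling several frozen teachers into the student.

    Teacher dropping: each teacher's loss term is masked with probability
    drop_p (at least one teacher is always kept), so that no single teacher
    dominates the gradient signal.
    """
    feats = student(images)  # (B, student_dim)

    losses = []
    for teacher, proj in zip(teachers, projectors):
        with torch.no_grad():
            target = teacher(images)  # (B, teacher_dim), teacher stays frozen
        pred = proj(feats)
        # Cosine distance between projected student features and teacher features.
        losses.append(1.0 - F.cosine_similarity(pred, target, dim=-1).mean())

    # Teacher dropping mask: keep each teacher with probability 1 - drop_p.
    keep = torch.rand(len(losses)) > drop_p
    if not keep.any():
        keep[torch.randint(len(losses), (1,))] = True
    loss = torch.stack(losses)[keep].mean()

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In practice the projectors are auxiliary: they shape the distillation signal during training and can be discarded afterwards, leaving a single student encoder with the same capacity as any one teacher.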
Cite
Text
Kalantidis et al. "UNIC: Universal Classification Models via Multi-Teacher Distillation." Proceedings of the European Conference on Computer Vision (ECCV), 2024. doi:10.1007/978-3-031-73235-5_20
Markdown
[Kalantidis et al. "UNIC: Universal Classification Models via Multi-Teacher Distillation." Proceedings of the European Conference on Computer Vision (ECCV), 2024.](https://mlanthology.org/eccv/2024/kalantidis2024eccv-unic/) doi:10.1007/978-3-031-73235-5_20
BibTeX
@inproceedings{kalantidis2024eccv-unic,
title = {{UNIC: Universal Classification Models via Multi-Teacher Distillation}},
author = {Kalantidis, Yannis and Larlus, Diane and Sariyildiz, Mert Bulent and Weinzaepfel, Philippe and Lucas, Thomas},
booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
year = {2024},
doi = {10.1007/978-3-031-73235-5_20},
url = {https://mlanthology.org/eccv/2024/kalantidis2024eccv-unic/}
}