Improving Diversity in Black-Box Few-Shot Knowledge Distillation
Abstract
Knowledge distillation (KD) is a well-known technique for effectively compressing a large network (teacher) into a smaller network (student) with little sacrifice in performance. However, most KD methods require a large training set and internal access to the teacher, which are rarely available due to various restrictions. These challenges have given rise to a more practical setting known as black-box few-shot KD, where the student is trained with few images and a black-box teacher. Recent approaches typically generate additional synthetic images but lack an active strategy to promote their diversity, a crucial factor for student learning. To address these problems, we propose a novel training scheme for generative adversarial networks, where we adaptively select high-confidence images under the teacher's supervision and introduce them to the adversarial learning on-the-fly. Our approach helps expand and improve the diversity of the distillation set, significantly boosting student accuracy. Through extensive experiments, we achieve state-of-the-art results among other few-shot KD methods on seven image datasets. The code is available at https://github.com/votrinhan88/divbfkd.
Cite
Text
Vo et al. "Improving Diversity in Black-Box Few-Shot Knowledge Distillation." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2024. doi:10.1007/978-3-031-70344-7_11
Markdown
[Vo et al. "Improving Diversity in Black-Box Few-Shot Knowledge Distillation." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2024.](https://mlanthology.org/ecmlpkdd/2024/vo2024ecmlpkdd-improving/) doi:10.1007/978-3-031-70344-7_11
BibTeX
@inproceedings{vo2024ecmlpkdd-improving,
title = {{Improving Diversity in Black-Box Few-Shot Knowledge Distillation}},
author = {Vo, Tri-Nhan and Nguyen, Dang and Do, Kien and Gupta, Sunil},
booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
year = {2024},
pages = {178--196},
doi = {10.1007/978-3-031-70344-7_11},
url = {https://mlanthology.org/ecmlpkdd/2024/vo2024ecmlpkdd-improving/}
}