Aligning Logits Generatively for Principled Black-Box Knowledge Distillation
Abstract
Black-Box Knowledge Distillation (B2KD) is a formulated problem for cloud-to-edge model compression with invisible data and models hosted on the server. B2KD faces challenges such as limited Internet exchange and the edge-cloud disparity of data distributions. In this paper, we formalize a two-step workflow consisting of deprivatization and distillation, and theoretically provide a new optimization direction, from logits to cell boundary, that differs from direct logits alignment. With its guidance, we propose a new method, Mapping-Emulation KD (MEKD), that distills a black-box cumbersome model into a lightweight one. Our method does not differentiate between treating soft or hard responses, and consists of: 1) deprivatization: emulating the inverse mapping of the teacher function with a generator, and 2) distillation: aligning the low-dimensional logits of the teacher and student models by reducing the distance of high-dimensional image points. For different teacher-student pairs, our method yields inspiring distillation performance on various benchmarks and outperforms previous state-of-the-art approaches.
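For a concrete reading of the two steps above, the following is a minimal PyTorch sketch, not the authors' released implementation: the generator architecture, losses, and the `teacher`, `student`, and optimizer objects are all illustrative assumptions, and a local differentiable module stands in for the black-box teacher API so the deprivatization loss can backpropagate (the paper itself addresses the genuinely black-box case).

```python
# Illustrative sketch of the two-step MEKD workflow (deprivatization, then
# distillation). All names and design choices here are assumptions for
# exposition; they do not reproduce the paper's exact architecture or losses.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Generator(nn.Module):
    """Maps low-dimensional logits back to image space, emulating the
    inverse mapping of the teacher function (deprivatization)."""

    def __init__(self, num_classes: int, img_shape=(3, 32, 32)):
        super().__init__()
        self.img_shape = img_shape
        out_dim = img_shape[0] * img_shape[1] * img_shape[2]
        self.net = nn.Sequential(
            nn.Linear(num_classes, 512), nn.ReLU(inplace=True),
            nn.Linear(512, 1024), nn.ReLU(inplace=True),
            nn.Linear(1024, out_dim), nn.Tanh(),
        )

    def forward(self, logits: torch.Tensor) -> torch.Tensor:
        return self.net(logits).view(-1, *self.img_shape)


def deprivatization_step(generator, teacher, opt_g, num_classes, batch_size=64):
    """Train the generator so that teacher(G(z)) ~= z, i.e. G emulates the
    inverse of the teacher mapping.

    NOTE: a truly black-box teacher is not differentiable; a local nn.Module
    stand-in is assumed here so the loss can backpropagate through it.
    """
    z = torch.randn(batch_size, num_classes)      # sampled pseudo-logits
    recovered = teacher(generator(z))             # teacher logits of generated images
    loss = F.mse_loss(recovered, z)
    opt_g.zero_grad()
    loss.backward()
    opt_g.step()
    return loss.item()


def distillation_step(generator, teacher, student, opt_s, images):
    """Align teacher and student logits indirectly: map both through the
    generator (not updated in this step) and reduce the distance of the
    resulting high-dimensional image points, instead of matching logits
    directly."""
    with torch.no_grad():
        teacher_points = generator(teacher(images))
    student_points = generator(student(images))
    loss = F.mse_loss(student_points, teacher_points)
    opt_s.zero_grad()
    loss.backward()
    opt_s.step()
    return loss.item()
```

In a full training loop, one would presumably alternate or sequence the two steps: fit the generator first so its image points are meaningful, then run distillation steps in which only the student's optimizer updates weights.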
Cite
Text
Ma et al. "Aligning Logits Generatively for Principled Black-Box Knowledge Distillation." Conference on Computer Vision and Pattern Recognition, 2024. doi:10.1109/CVPR52733.2024.02184

Markdown
[Ma et al. "Aligning Logits Generatively for Principled Black-Box Knowledge Distillation." Conference on Computer Vision and Pattern Recognition, 2024.](https://mlanthology.org/cvpr/2024/ma2024cvpr-aligning/) doi:10.1109/CVPR52733.2024.02184

BibTeX
@inproceedings{ma2024cvpr-aligning,
title = {{Aligning Logits Generatively for Principled Black-Box Knowledge Distillation}},
author = {Ma, Jing and Xiang, Xiang and Wang, Ke and Wu, Yuchuan and Li, Yongbin},
booktitle = {Conference on Computer Vision and Pattern Recognition},
year = {2024},
  pages = {23148--23157},
doi = {10.1109/CVPR52733.2024.02184},
url = {https://mlanthology.org/cvpr/2024/ma2024cvpr-aligning/}
}