Preserving Angles Improves Feature Distillation
Abstract
Knowledge distillation methods compress models by training a student network using the classification outputs of a high-quality teacher model, but they can fail to effectively transfer the properties of computer vision foundation models from teacher to student. While it has recently been shown that feature distillation, where a teacher model's output features are replicated instead, can reproduce the performance of foundation models across numerous downstream tasks, these approaches fall short in matching critical properties such as robustness and out-of-distribution (OOD) detection performance. This paper overcomes this shortcoming by introducing Cosine-similarity Preserving Compression (CosPress), a feature distillation technique that learns a mapping which compresses the latent space of the teacher model into the smaller latent space of the student while preserving the cosine similarities between image embeddings. This enables direct optimisation of the student network and produces a more faithful reproduction of the teacher's properties. It is shown that distillation with CosPress on a variety of datasets, including ImageNet, produces more accurate models with greater performance on generalisability, robustness and OOD detection benchmarks, and that this technique provides a competitive pathway for training highly performant lightweight models on small datasets. Code is available at github.com/emannix/cospress.
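As a rough illustration of the idea described in the abstract, the sketch below shows one way a cosine-similarity-preserving compressor and a feature-matching objective could be written in PyTorch. It is a minimal sketch, not the authors' implementation (see github.com/emannix/cospress for the released code); the function names, embedding dimensions and two-stage training outline are assumptions made for illustration only.

import torch
import torch.nn as nn
import torch.nn.functional as F

def cosine_preservation_loss(compressed: torch.Tensor, teacher: torch.Tensor) -> torch.Tensor:
    # Penalise differences between the pairwise cosine-similarity (angle)
    # matrices of the compressed embeddings and the original teacher embeddings.
    c = F.normalize(compressed, dim=-1)
    t = F.normalize(teacher, dim=-1)
    return F.mse_loss(c @ c.T, t @ t.T)

d_teacher, d_student = 1024, 384                            # illustrative dimensions
compressor = nn.Linear(d_teacher, d_student, bias=False)    # teacher-to-student mapping (assumed form)

# Assumed training outline, stage 1: fit the compressor so that angles between
# teacher embeddings are preserved in the smaller student space.
#   for feats_t in teacher_feature_batches:
#       loss = cosine_preservation_loss(compressor(feats_t), feats_t)
#       loss.backward(); optimiser.step(); optimiser.zero_grad()
#
# Assumed stage 2: optimise the student directly against the compressed teacher
# features, e.g. with a cosine feature-matching loss.
#   target = compressor(teacher(images)).detach()
#   distill_loss = 1 - F.cosine_similarity(student(images), target, dim=-1).mean()

One motivation for matching the cosine-similarity (Gram) matrix rather than raw features is that the distillation target then lives in the student's lower-dimensional space, so the student can be optimised directly without having to reproduce teacher dimensions it does not have.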
Cite
Text
Mannix et al. "Preserving Angles Improves Feature Distillation." Transactions on Machine Learning Research, 2025.
Markdown
[Mannix et al. "Preserving Angles Improves Feature Distillation." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/mannix2025tmlr-preserving/)
BibTeX
@article{mannix2025tmlr-preserving,
  title = {{Preserving Angles Improves Feature Distillation}},
  author = {Mannix, Evelyn and Hodgkinson, Liam and Bondell, Howard},
  journal = {Transactions on Machine Learning Research},
  year = {2025},
  url = {https://mlanthology.org/tmlr/2025/mannix2025tmlr-preserving/}
}