LumiNet: Perception-Driven Knowledge Distillation via Statistical Logit Calibration

Abstract

In the knowledge distillation literature, feature-based methods have dominated due to their ability to exploit the rich intermediate representations of large teacher models. In contrast, logit-based approaches, which aim to distill 'dark knowledge' from teachers, typically exhibit inferior performance compared to feature-based methods. To bridge this gap, we present LumiNet, a novel knowledge distillation algorithm designed to enhance logit-based distillation. We introduce the concept of 'perception', which calibrates logits based on the model's representation capability. This concept addresses overconfidence issues in logit-based distillation while also providing a novel way to distill knowledge from the teacher: it reconstructs the logits of each sample by considering its relationships with the other samples in the batch. LumiNet excels on benchmarks such as CIFAR-100, ImageNet, and MSCOCO, outperforming leading feature-based methods; for example, on ImageNet it improves over KD by 1.5% with ResNet18 and 2.05% with MobileNetV2.
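To make the batch-relational idea concrete, below is a minimal, hypothetical sketch of perception-style logit calibration in PyTorch. It assumes calibration via per-class batch statistics (mean and standard deviation over the batch) followed by a standard temperature-scaled KD loss; the function names, the temperature value, and the exact normalization are illustrative assumptions, not the paper's verbatim implementation.

```python
import torch
import torch.nn.functional as F

def perception(logits: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    # Hypothetical sketch: express each sample's logits relative to the
    # other samples in the batch by standardizing each class dimension
    # with per-class batch statistics.
    mean = logits.mean(dim=0, keepdim=True)  # per-class mean over the batch
    std = logits.std(dim=0, keepdim=True)    # per-class std over the batch
    return (logits - mean) / (std + eps)

def luminet_kd_loss(student_logits: torch.Tensor,
                    teacher_logits: torch.Tensor,
                    T: float = 4.0) -> torch.Tensor:
    # KL divergence between softened, calibrated distributions,
    # scaled by T^2 as in standard knowledge distillation.
    log_p_student = F.log_softmax(perception(student_logits) / T, dim=1)
    p_teacher = F.softmax(perception(teacher_logits) / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T ** 2)

# Usage: given a batch of logits from teacher and student,
# add this loss to the usual cross-entropy on ground-truth labels.
student_out = torch.randn(32, 100)  # e.g., CIFAR-100 batch of 32
teacher_out = torch.randn(32, 100)
loss = luminet_kd_loss(student_out, teacher_out)
```

Standardizing per class across the batch tempers overconfident logits (large outliers shrink toward the batch statistics) while preserving each sample's relative standing, which is the intuition the abstract describes.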

Cite

Text

Hossain et al. "LumiNet: Perception-Driven Knowledge Distillation via Statistical Logit Calibration." Transactions on Machine Learning Research, 2025.

Markdown

[Hossain et al. "LumiNet: Perception-Driven Knowledge Distillation via Statistical Logit Calibration." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/hossain2025tmlr-luminet/)

BibTeX

@article{hossain2025tmlr-luminet,
  title     = {{LumiNet: Perception-Driven Knowledge Distillation via Statistical Logit Calibration}},
  author    = {Hossain, Md. Ismail and Elahi, M M Lutfe and Ramasinghe, Sameera and Cheraghian, Ali and Rahman, Fuad and Mohammed, Nabeel and Rahman, Shafin},
  journal   = {Transactions on Machine Learning Research},
  year      = {2025},
  url       = {https://mlanthology.org/tmlr/2025/hossain2025tmlr-luminet/}
}