Efficient ANN-Guided Distillation: Aligning Rate-Based Features of Spiking Neural Networks Through Hybrid Block-Wise Replacement

Abstract

Spiking Neural Networks (SNNs) have garnered considerable attention as a promising alternative to Artificial Neural Networks (ANNs), and recent studies have highlighted their potential on large-scale datasets. For SNN training, two main approaches exist: direct training and ANN-to-SNN (ANN2SNN) conversion. To fully leverage existing ANN models for guiding SNN learning, one can employ either direct ANN2SNN conversion or ANN-SNN distillation. In this paper, we propose an ANN-SNN distillation framework from the ANN2SNN perspective, built around a block-wise replacement strategy for ANN-guided learning. By generating intermediate hybrid models that progressively align the SNN's feature space with that of the ANN through rate-based features, the framework naturally incorporates rate-based backpropagation as its training method. Our approach achieves results comparable to or better than state-of-the-art SNN distillation methods while remaining efficient in both training and learning.
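As a rough illustration of the block-wise replacement idea described above, the sketch below builds a hybrid network in PyTorch in which the first `k` blocks are trainable SNN blocks and the remaining blocks are frozen ANN blocks from the teacher; the SNN output is collapsed into a rate-based feature (a time-averaged firing rate) before entering the ANN tail. This is a minimal sketch under stated assumptions, not the paper's implementation: the names (`HybridModel`, `spike_rate`), the `[T, B, ...]` input layout, and the front-to-back replacement order are all illustrative choices.

```python
import torch
import torch.nn as nn


def spike_rate(spikes: torch.Tensor) -> torch.Tensor:
    """Rate-based feature: mean firing rate over the time dimension.

    Assumes spikes are laid out as [T, B, C, H, W] (time first).
    """
    return spikes.mean(dim=0)


class HybridModel(nn.Module):
    """Hybrid ANN/SNN network for one stage of block-wise replacement.

    The first `k` blocks come from the SNN under training; the remaining
    blocks are frozen ANN blocks that act as the teacher tail. `snn_blocks`
    and `ann_blocks` are assumed to be stage-aligned lists of modules with
    matching feature shapes (a hypothetical setup for illustration).
    """

    def __init__(self, snn_blocks, ann_blocks, k: int):
        super().__init__()
        self.snn_part = nn.ModuleList(snn_blocks[:k])
        self.ann_part = nn.ModuleList(ann_blocks[k:])
        # Freeze the ANN tail: it only guides learning, it is not updated.
        for p in self.ann_part.parameters():
            p.requires_grad_(False)

    def forward(self, x_seq: torch.Tensor) -> torch.Tensor:
        # x_seq: [T, B, ...] input repeated (or encoded) over T time steps.
        for blk in self.snn_part:
            x_seq = blk(x_seq)
        # Collapse the spike train into a rate-based feature map so it can
        # be consumed by the analog ANN blocks downstream.
        feat = spike_rate(x_seq)
        for blk in self.ann_part:
            feat = blk(feat)
        return feat
```

Under this reading, training would sweep `k` from 1 to the number of blocks, at each stage optimizing the hybrid model (for example, with a task loss on its output plus an alignment loss such as an MSE between `spike_rate` of the newly inserted SNN block and the corresponding ANN block's features), so that the SNN's rate-based features are progressively pulled toward the ANN's feature space.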

Cite

Text

Yang et al. "Efficient ANN-Guided Distillation: Aligning Rate-Based Features of Spiking Neural Networks Through Hybrid Block-Wise Replacement." Conference on Computer Vision and Pattern Recognition, 2025. doi:10.1109/CVPR52734.2025.00937

Markdown

[Yang et al. "Efficient ANN-Guided Distillation: Aligning Rate-Based Features of Spiking Neural Networks Through Hybrid Block-Wise Replacement." Conference on Computer Vision and Pattern Recognition, 2025.](https://mlanthology.org/cvpr/2025/yang2025cvpr-efficient/) doi:10.1109/CVPR52734.2025.00937

BibTeX

@inproceedings{yang2025cvpr-efficient,
  title     = {{Efficient ANN-Guided Distillation: Aligning Rate-Based Features of Spiking Neural Networks Through Hybrid Block-Wise Replacement}},
  author    = {Yang, Shu and Yu, Chengting and Liu, Lei and Ma, Hanzhi and Wang, Aili and Li, Erping},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2025},
  pages     = {10025--10035},
  doi       = {10.1109/CVPR52734.2025.00937},
  url       = {https://mlanthology.org/cvpr/2025/yang2025cvpr-efficient/}
}