Yu, Chengting

5 publications

CVPR 2025. Efficient ANN-Guided Distillation: Aligning Rate-Based Features of Spiking Neural Networks Through Hybrid Block-Wise Replacement. Shu Yang, Chengting Yu, Lei Liu, Hanzhi Ma, Aili Wang, Erping Li.

ICML 2025. Efficient Logit-Based Knowledge Distillation of Deep Spiking Neural Networks for Full-Range Timestep Deployment. Chengting Yu, Xiaochen Zhao, Lei Liu, Shu Yang, Gaoang Wang, Erping Li, Aili Wang.

NeurIPS 2025. Enhanced Self-Distillation Framework for Efficient Spiking Neural Network Training. Xiaochen Zhao, Chengting Yu, Kairong Yu, Lei Liu, Aili Wang.

CVPR 2025. Temporal Separation with Entropy Regularization for Knowledge Distillation in Spiking Neural Networks. Kairong Yu, Chengting Yu, Tianqing Zhang, Xiaochen Zhao, Shu Yang, Hongwei Wang, Qiang Zhang, Qi Xu.

NeurIPS 2024. Advancing Training Efficiency of Deep Spiking Neural Networks Through Rate-Based Backpropagation. Chengting Yu, Lei Liu, Gaoang Wang, Erping Li, Aili Wang.