Adaptive Fission: Post-Training Encoding for Low-Latency Spike Neural Networks
Abstract
Spiking Neural Networks (SNNs) often rely on rate coding, where high-precision inference depends on long time-steps, leading to significant latency and energy cost—especially for ANN-to-SNN conversions. To address this, we propose Adaptive Fission, a post-training encoding technique that selectively splits high-sensitivity neurons into groups with varying scales and weights. This enables neuron-specific, on-demand precision and threshold allocation while introducing minimal spatial overhead. As a generalized form of population coding, it seamlessly applies to a wide range of pretrained SNN architectures without requiring additional training or fine-tuning. Experiments on neuromorphic hardware demonstrate up to 80% reductions in latency and power consumption without degrading accuracy.
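The core idea of splitting one neuron into a group with geometrically scaled thresholds can be illustrated with a toy sketch. This is only an assumption-laden illustration of the general fission/population-coding principle described in the abstract, not the paper's actual sensitivity-based allocation rule; the function name `fission_encode` and the power-of-two threshold schedule are hypothetical choices made here for clarity.

```python
import numpy as np

def fission_encode(x, theta, k):
    """Toy sketch: split one IF neuron (threshold theta) into k copies
    with geometrically scaled thresholds theta / 2**i. Each copy emits a
    binary spike, re-weighted by its scale, so the group approximates the
    input x at finer precision within a single time-step.
    NOTE: illustrative only; the actual method allocates precision per
    neuron based on sensitivity, which this sketch does not model."""
    scales = theta / 2.0 ** np.arange(k)  # per-copy thresholds
    approx = 0.0
    residual = x
    for s in scales:
        spike = float(residual >= s)  # binary spike of copy i
        approx += spike * s           # spike re-weighted by its scale
        residual -= spike * s
    return approx

# A single neuron (k = 1) quantizes 0.8 coarsely to 0.0 in one step;
# a 4-way fission group recovers 0.75 in the same single step.
theta = 1.0
print(fission_encode(0.8, theta, 1))  # 0.0
print(fission_encode(0.8, theta, 4))  # 0.75
```

The sketch shows why a fission group trades a small amount of spatial overhead (k copies) for precision that rate coding would otherwise buy with many time-steps.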
Cite
Text
Jiang et al. "Adaptive Fission: Post-Training Encoding for Low-Latency Spike Neural Networks." Advances in Neural Information Processing Systems, 2025.
Markdown
[Jiang et al. "Adaptive Fission: Post-Training Encoding for Low-Latency Spike Neural Networks." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/jiang2025neurips-adaptive/)
BibTeX
@inproceedings{jiang2025neurips-adaptive,
  title = {{Adaptive Fission: Post-Training Encoding for Low-Latency Spike Neural Networks}},
  author = {Jiang, Yizhou and Chen, Feng and Li, Yihan and Liu, Yuqian and Gao, Haichuan and Zhang, Tianren and Fang, Ying},
  booktitle = {Advances in Neural Information Processing Systems},
  year = {2025},
  url = {https://mlanthology.org/neurips/2025/jiang2025neurips-adaptive/}
}