S$^2$NN: Sub-Bit Spiking Neural Networks
Abstract
Spiking Neural Networks (SNNs) offer an energy-efficient paradigm for machine intelligence, but their continued scaling poses challenges for resource-limited deployment. Despite recent advances in binary SNNs, the storage and computational demands remain substantial for large-scale networks. To further explore the compression and acceleration potential of SNNs, we propose Sub-bit Spiking Neural Networks (S$^2$NNs) that represent weights with less than one bit. Specifically, we first establish an S$^2$NN baseline by leveraging the clustering patterns of kernels in well-trained binary SNNs. This baseline is highly efficient but suffers from \textit{outlier-induced codeword selection bias} during training. To mitigate this issue, we propose an \textit{outlier-aware sub-bit weight quantization} (OS-Quant) method, which optimizes codeword selection by identifying and adaptively scaling outliers. Furthermore, we propose a \textit{membrane potential-based feature distillation} (MPFD) method, which improves the performance of highly compressed S$^2$NNs via more precise guidance from a teacher model. Extensive results on vision tasks reveal that S$^2$NN outperforms existing quantized SNNs in both performance and efficiency, making it promising for edge computing applications.
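The core compression idea, representing many binary kernels as indices into a small shared codebook so that each weight costs less than one bit of storage, can be illustrated with a minimal sketch. The function names (`build_codebook`, `assign_kernels`), the k-means-style clustering, and all shapes below are illustrative assumptions for exposition only, not the paper's OS-Quant procedure.

```python
import numpy as np

# Hypothetical sketch of sub-bit weight storage via a shared codebook of
# binary 3x3 kernels. With K codewords, each kernel costs log2(K) bits in
# total, i.e. log2(K)/9 bits per weight -- below one bit whenever K < 2**9.
# This illustrates the general codebook idea, not the authors' method.

def build_codebook(kernels: np.ndarray, num_codewords: int, iters: int = 20) -> np.ndarray:
    """Cluster flattened binary kernels (values in {-1, +1}) into codewords."""
    flat = kernels.reshape(len(kernels), -1).astype(np.float32)
    rng = np.random.default_rng(0)
    centers = flat[rng.choice(len(flat), num_codewords, replace=False)]
    for _ in range(iters):
        # Assign each kernel to its nearest centroid, then recompute centroids.
        assign = np.argmin(((flat[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for k in range(num_codewords):
            members = flat[assign == k]
            if len(members):
                centers[k] = members.mean(0)
    # Re-binarize centroids so every codeword remains a valid binary kernel.
    return np.sign(centers) + (centers == 0)  # map exact zeros to +1

def assign_kernels(kernels: np.ndarray, codebook: np.ndarray) -> np.ndarray:
    """Replace each kernel by the index of its closest binary codeword."""
    flat = kernels.reshape(len(kernels), -1).astype(np.float32)
    return np.argmin(((flat[:, None] - codebook[None]) ** 2).sum(-1), axis=1)

# Usage: 1024 binary 3x3 kernels stored as 5-bit indices into 32 codewords,
# roughly 0.56 bits per weight instead of 1 bit for a plain binary SNN.
kernels = np.sign(np.random.randn(1024, 3, 3))
codebook = build_codebook(kernels, num_codewords=32)
indices = assign_kernels(kernels, codebook)
reconstructed = codebook[indices].reshape(1024, 3, 3)
```

In this sketch the only stored parameters are the small codebook and the per-kernel indices; the outlier-aware scaling and distillation components described in the abstract would sit on top of such a representation during training.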
Cite
Text
Wei et al. "S$^2$NN: Sub-Bit Spiking Neural Networks." Advances in Neural Information Processing Systems, 2025.
Markdown
[Wei et al. "S$^2$NN: Sub-Bit Spiking Neural Networks." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/wei2025neurips-2nn/)
BibTeX
@inproceedings{wei2025neurips-2nn,
title = {{S$^2$NN: Sub-Bit Spiking Neural Networks}},
author = {Wei, Wenjie and Zhang, Malu and Zhang, Jieyuan and Belatreche, Ammar and Wang, Shuai and Shan, Yimeng and Liu, Hanwen and Cao, Honglin and Wang, Guoqing and Yang, Yang and Li, Haizhou},
booktitle = {Advances in Neural Information Processing Systems},
year = {2025},
url = {https://mlanthology.org/neurips/2025/wei2025neurips-2nn/}
}