SpikeZIP-TF: Conversion Is All You Need for Transformer-Based SNN
Abstract
Spiking neural networks (SNNs) have attracted great attention due to their high efficiency and accuracy. Current ANN-to-SNN conversion methods can produce SNNs whose accuracy is on par with their ANN counterparts at ultra-low latency (8 time-steps) for CNN architectures on computer vision (CV) tasks. However, while Transformer-based networks have achieved prevailing accuracy on both CV and natural language processing (NLP) tasks, Transformer-based SNNs still suffer lower accuracy than their ANN counterparts. In this work, we introduce a novel ANN-to-SNN conversion method called SpikeZIP-TF, in which the ANN and the SNN are exactly equivalent, thus incurring no accuracy degradation. SpikeZIP-TF achieves 83.82% accuracy on a CV dataset (ImageNet) and 93.79% accuracy on an NLP dataset (SST-2), which are higher than SOTA Transformer-based SNNs. The code is available on GitHub: https://github.com/Intelligent-Computing-Research-Group/SpikeZIP_transformer
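As background for the "exactly equivalent" claim, conversion methods of this family typically pair a T-level quantized activation in the ANN with an integrate-and-fire neuron run for T time-steps, so that the neuron's rate-coded output reproduces the quantized activation. A minimal sketch of that standard equivalence (illustrative only, not SpikeZIP-TF's exact formulation; the threshold `theta`, step count `T`, and half-threshold initialization are assumptions):

```python
import math

def quantized_relu(x, theta=1.0, T=8):
    # T-level quantized ReLU (round half up): the ANN-side activation.
    q = math.floor(x * T / theta + 0.5)
    return min(max(q, 0), T) * theta / T

def if_neuron_output(x, theta=1.0, T=8):
    # Soft-reset integrate-and-fire neuron driven by a constant input x
    # for T time-steps; returns the rate-coded output (spikes * theta / T).
    v = theta / 2.0              # half-threshold init acts like rounding
    spikes = 0
    for _ in range(T):
        v += x                   # integrate the input
        if v >= theta:           # fire and soft-reset (subtract threshold)
            spikes += 1
            v -= theta
    return spikes * theta / T
```

For any scalar input, the two functions agree, which is the lossless correspondence that makes conversion (rather than surrogate-gradient training) attractive.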
Cite
Text
You et al. "SpikeZIP-TF: Conversion Is All You Need for Transformer-Based SNN." International Conference on Machine Learning, 2024.
Markdown
[You et al. "SpikeZIP-TF: Conversion Is All You Need for Transformer-Based SNN." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/you2024icml-spikeziptf/)
BibTeX
@inproceedings{you2024icml-spikeziptf,
title = {{SpikeZIP-TF: Conversion Is All You Need for Transformer-Based SNN}},
author = {You, Kang and Xu, Zekai and Nie, Chen and Deng, Zhijie and Guo, Qinghai and Wang, Xiang and He, Zhezhi},
booktitle = {International Conference on Machine Learning},
year = {2024},
pages = {57367--57383},
volume = {235},
url = {https://mlanthology.org/icml/2024/you2024icml-spikeziptf/}
}