TTFSFormer: A TTFS-Based Lossless Conversion of Spiking Transformer

Abstract

ANN-to-SNN conversion has emerged as a key approach to train Spiking Neural Networks (SNNs), particularly for Transformer architectures, as it maps pre-trained ANN parameters to SNN equivalents without retraining, thereby preserving ANN accuracy while eliminating training costs. Among the coding methods used in ANN-to-SNN conversion, time-to-first-spike (TTFS) coding, which allows each neuron to fire at most one spike, offers significantly lower energy consumption. However, while previous TTFS-based SNNs have achieved performance comparable to convolutional ANNs, the attention mechanism and nonlinear layers in Transformer architectures remain a challenge for existing SNNs with TTFS coding. This paper proposes a new neuron structure for TTFS coding that expands its representational range and enhances its capability to process nonlinear functions, along with detailed designs of nonlinear neurons for different layers of the Transformer. Experimental results on different models demonstrate that the proposed method achieves high accuracy with significantly lower energy consumption. To the best of our knowledge, this is the first work to focus on converting Transformers to SNNs with TTFS coding.
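To make the coding scheme concrete: in TTFS coding, each neuron emits at most one spike, and the activation value is carried by *when* that spike occurs (larger activations fire earlier). The sketch below is a minimal illustration of this idea under a simple linear time mapping, not the paper's actual neuron model; the function names, the coding window `t_max`, and the assumption of activations normalized to [0, 1] are all hypothetical.

```python
import numpy as np

def ttfs_encode(activations, t_max=100.0):
    """Map each normalized activation in [0, 1] to a single spike time.

    Larger activations spike earlier; an activation of 0 is represented
    here by a spike at t_max, the end of the coding window.
    (Illustrative linear mapping only -- not the paper's neuron model.)
    """
    a = np.clip(np.asarray(activations, dtype=float), 0.0, 1.0)
    return t_max * (1.0 - a)  # one spike time per neuron

def ttfs_decode(spike_times, t_max=100.0):
    """Recover the activation from the first-spike time."""
    return 1.0 - np.asarray(spike_times, dtype=float) / t_max

# Round trip: each neuron's value survives a single-spike encoding.
a = np.array([0.0, 0.25, 1.0])
t = ttfs_encode(a)          # [100., 75., 0.]
recovered = ttfs_decode(t)  # [0., 0.25, 1.]
```

Because each neuron contributes at most one spike per inference, the number of synaptic events (and hence energy) is far lower than in rate coding, where a value is conveyed by many spikes; this is the efficiency argument the abstract appeals to.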

Cite

Text

Zhao et al. "TTFSFormer: A TTFS-Based Lossless Conversion of Spiking Transformer." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Zhao et al. "TTFSFormer: A TTFS-Based Lossless Conversion of Spiking Transformer." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/zhao2025icml-ttfsformer/)

BibTeX

@inproceedings{zhao2025icml-ttfsformer,
  title     = {{TTFSFormer: A TTFS-Based Lossless Conversion of Spiking Transformer}},
  author    = {Zhao, Lusen and Huang, Zihan and Ding, Jianhao and Yu, Zhaofei},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {77558--77571},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/zhao2025icml-ttfsformer/}
}