Spiking Transformer with Spatial-Temporal Attention
Abstract
Spike-based Transformers present a compelling and energy-efficient alternative to traditional Artificial Neural Network (ANN)-based Transformers, achieving impressive results through sparse binary computations. However, existing spike-based Transformers predominantly focus on spatial attention while neglecting the crucial temporal dependencies inherent in spike-based processing, leading to suboptimal feature representation and limited performance. To address this limitation, we propose the Spiking Transformer with Spatial-Temporal Attention (STAtten), a simple and straightforward architecture that efficiently integrates both spatial and temporal information in the self-attention mechanism. STAtten introduces a block-wise computation strategy that processes information in spatial-temporal chunks, enabling comprehensive feature capture while maintaining the same computational complexity as previous spatial-only approaches. Our method can be seamlessly integrated into existing spike-based Transformers without architectural overhaul. Extensive experiments demonstrate that STAtten significantly improves the performance of existing spike-based Transformers across both static and neuromorphic datasets, including CIFAR10/100, ImageNet, CIFAR10-DVS, and N-Caltech101.
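The block-wise strategy described above can be illustrated with a short sketch: rather than attending over the N spatial tokens of each timestep separately, tokens within a chunk of timesteps attend to one another, capturing temporal dependencies inside each chunk. The paper's exact formulation is not reproduced here; the function name, the linear (score-matrix-free) attention ordering, and the scaling factor are illustrative assumptions.

```python
import numpy as np

def blockwise_st_attention(Q, K, V, chunk=2):
    """Sketch of spatial-temporal attention computed in time chunks.

    Q, K, V: spike tensors of shape (T, N, D) -- T timesteps,
    N spatial tokens, D feature dim. Tokens inside each window of
    `chunk` timesteps attend jointly, so temporal dependencies within
    a chunk are captured, unlike per-timestep (spatial-only) attention.
    """
    T, N, D = Q.shape
    assert T % chunk == 0, "T must be divisible by the chunk size"
    out = np.empty_like(V, dtype=np.float64)
    for s in range(0, T, chunk):
        # flatten the chunk's timesteps and tokens into one sequence
        q = Q[s:s + chunk].reshape(chunk * N, D)
        k = K[s:s + chunk].reshape(chunk * N, D)
        v = V[s:s + chunk].reshape(chunk * N, D)
        # linear-style attention: q @ (k^T v) avoids forming the
        # (chunk*N)^2 score matrix; the divisor bounds magnitudes
        out[s:s + chunk] = (q @ (k.T @ v) / (chunk * N)).reshape(chunk, N, D)
    return out
```

Because the (k^T v) product is shared by all queries in a chunk, the per-token cost matches the spatial-only linear-attention case, consistent with the complexity claim in the abstract.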
Cite
Text
Lee et al. "Spiking Transformer with Spatial-Temporal Attention." Conference on Computer Vision and Pattern Recognition, 2025. doi:10.1109/CVPR52734.2025.01302
Markdown
[Lee et al. "Spiking Transformer with Spatial-Temporal Attention." Conference on Computer Vision and Pattern Recognition, 2025.](https://mlanthology.org/cvpr/2025/lee2025cvpr-spiking/) doi:10.1109/CVPR52734.2025.01302
BibTeX
@inproceedings{lee2025cvpr-spiking,
title = {{Spiking Transformer with Spatial-Temporal Attention}},
author = {Lee, Donghyun and Li, Yuhang and Kim, Youngeun and Xiao, Shiting and Panda, Priyadarshini},
booktitle = {Conference on Computer Vision and Pattern Recognition},
year = {2025},
pages = {13948--13958},
doi = {10.1109/CVPR52734.2025.01302},
url = {https://mlanthology.org/cvpr/2025/lee2025cvpr-spiking/}
}