One-Step Spiking Transformer with a Linear Complexity
Abstract
Graph Transformers (GTs) have emerged as powerful tools for handling graph-structured data through global attention mechanisms. While GTs effectively capture long-range dependencies, their complex, non-differentiable operators make optimization difficult, since such operators cannot be handled directly by standard gradient-based optimizers such as Adam or AdamW. To address this issue, this work follows the line of Zeroth-Order Optimization (ZOO) techniques. However, integrating ZOO directly poses considerable challenges, because the loss landscape over the GT parameter space is sharp and its gradients are steep. Motivated by these observations, we propose a Sharpness-aware Zeroth-order Optimizer (SZO) that incorporates the Sharpness-Aware Minimization (SAM) technique to facilitate convergence within a flatter neighborhood, and leverages parallel computing for efficient gradient estimation. Theoretically, we provide a comprehensive analysis of the optimizer from both the convergence and generalization perspectives. Empirically, extensive experiments on various classical GTs across a wide range of benchmark datasets underscore the superior performance of SZO over state-of-the-art optimizers.
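The abstract combines two ingredients: random-direction zeroth-order gradient estimation and a SAM-style perturbation toward a worst-case neighbor before the descent step. The paper's actual SZO update rule is not given here, so the following is only a minimal sketch of that combination; the function names, hyperparameters (`mu`, `rho`, `n_samples`), and the two-point estimator are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def zo_grad(f, w, mu=1e-3, n_samples=8, rng=None):
    """Two-point zeroth-order gradient estimate of f at w.

    Averages n_samples random-direction finite differences; each
    sample is independent, so in principle they could be evaluated
    in parallel, as the abstract suggests.
    """
    rng = np.random.default_rng() if rng is None else rng
    g = np.zeros_like(w)
    for _ in range(n_samples):
        u = rng.standard_normal(w.size)
        g += (f(w + mu * u) - f(w - mu * u)) / (2 * mu) * u
    return g / n_samples

def szo_step(f, w, lr=0.1, rho=0.05, **zo_kwargs):
    """One SAM-style zeroth-order step (illustrative sketch):
    estimate the gradient, ascend to the edge of a rho-ball
    (the locally "sharpest" direction), then descend using the
    gradient re-estimated at that perturbed point."""
    g = zo_grad(f, w, **zo_kwargs)
    w_adv = w + rho * g / (np.linalg.norm(g) + 1e-12)
    g_adv = zo_grad(f, w_adv, **zo_kwargs)
    return w - lr * g_adv
```

On a simple quadratic loss, iterating `szo_step` drives the objective toward its minimum without ever computing an analytic gradient, which is the property that makes ZOO applicable to non-differentiable operators.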
Cite
Text
Song et al. "One-Step Spiking Transformer with a Linear Complexity." International Joint Conference on Artificial Intelligence, 2024. doi:10.24963/ijcai.2024/348
Markdown
[Song et al. "One-Step Spiking Transformer with a Linear Complexity." International Joint Conference on Artificial Intelligence, 2024.](https://mlanthology.org/ijcai/2024/song2024ijcai-one/) doi:10.24963/ijcai.2024/348
BibTeX
@inproceedings{song2024ijcai-one,
title = {{One-Step Spiking Transformer with a Linear Complexity}},
author = {Song, Xiaotian and Song, Andy and Xiao, Rong and Sun, Yanan},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2024},
pages = {3142--3150},
doi = {10.24963/ijcai.2024/348},
url = {https://mlanthology.org/ijcai/2024/song2024ijcai-one/}
}