QFormer: An Efficient Quaternion Transformer for Image Denoising

Cite

Text

Jiang et al. "QFormer: An Efficient Quaternion Transformer for Image Denoising." International Joint Conference on Artificial Intelligence, 2024. doi:10.24963/ijcai.2024/468

Markdown

[Jiang et al. "QFormer: An Efficient Quaternion Transformer for Image Denoising." International Joint Conference on Artificial Intelligence, 2024.](https://mlanthology.org/ijcai/2024/jiang2024ijcai-qformer/) doi:10.24963/ijcai.2024/468

BibTeX

@inproceedings{jiang2024ijcai-qformer,
  title     = {{QFormer: An Efficient Quaternion Transformer for Image Denoising}},
  author    = {Jiang, Bo and Lu, Yao and Lu, Guangming and Zhang, Bob},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2024},
  pages     = {4237--4245},
  doi       = {10.24963/ijcai.2024/468},
  url       = {https://mlanthology.org/ijcai/2024/jiang2024ijcai-qformer/}
}