Gradient Scaling on Deep Spiking Neural Networks with Spike-Dependent Local Information

Abstract

Deep spiking neural networks (SNNs) are promising because they combine the model capacity of deep architectures with the energy efficiency of spike-based operations. To train deep SNNs, spatio-temporal backpropagation (STBP) with surrogate gradients was recently proposed. Although deep SNNs have been trained successfully with STBP, it does not fully utilize spike information. In this work, we propose gradient scaling with local spike information, namely the relation between pre- and post-synaptic spikes. By taking the causality between spikes into account, we enhance the training performance of deep SNNs. In our experiments, adopting gradient scaling achieved higher accuracy with fewer spikes on image classification tasks such as CIFAR10 and CIFAR100.
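The abstract describes the mechanism only at a high level, so the following is a minimal PyTorch sketch of the general idea under explicit assumptions: spikes are generated with a rectangular surrogate gradient, and the backward gradient is multiplied by a locally computed scale factor. The class `ScaledSurrogateSpike`, the helper `lif_step`, and the scale formula (one plus the pre-synaptic spike rate) are hypothetical stand-ins for illustration; the paper's actual spike-causality scaling rule is not given in the abstract.

```python
import torch

class ScaledSurrogateSpike(torch.autograd.Function):
    """Heaviside spike whose surrogate gradient is multiplied by a
    locally computed scale (hypothetical stand-in for the paper's
    spike-dependent rule)."""

    @staticmethod
    def forward(ctx, membrane, threshold, scale):
        ctx.save_for_backward(membrane, scale)
        ctx.threshold = threshold
        return (membrane >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        membrane, scale = ctx.saved_tensors
        # Rectangular surrogate: gradient passes only near the threshold.
        surrogate = ((membrane - ctx.threshold).abs() < 0.5).float()
        # Scale the surrogate gradient with the local spike statistic.
        return grad_output * surrogate * scale, None, None

def lif_step(x, v, w, threshold=1.0, decay=0.5):
    """One step of a leaky integrate-and-fire layer with scaled gradients."""
    v = decay * v + x @ w.t()
    # Hypothetical local statistic: mean pre-synaptic spike rate per sample.
    # Detached so it only rescales gradients and never receives them.
    scale = 1.0 + x.mean(dim=1, keepdim=True).detach()
    s = ScaledSurrogateSpike.apply(v, threshold, scale)
    v = v * (1.0 - s)  # hard reset for neurons that spiked
    return s, v

# Usage: backpropagate through a few time steps on one random batch.
torch.manual_seed(0)
batch, n_in, n_out, steps = 8, 100, 10, 4
w = (0.1 * torch.randn(n_out, n_in)).requires_grad_()
v = torch.zeros(batch, n_out)
x = (torch.rand(batch, n_in) > 0.8).float()  # Bernoulli input spikes
rate = torch.zeros(batch, n_out)
for _ in range(steps):
    s, v = lif_step(x, v, w)
    rate = rate + s / steps
loss = rate.mean()
loss.backward()
print("mean |grad|:", w.grad.abs().mean().item())
```

Detaching the scale keeps the causality term out of the forward pass entirely; it reweights credit assignment only, which is the spirit of scaling surrogate gradients with spike-dependent local information.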

Cite

Text

Park et al. "Gradient Scaling on Deep Spiking Neural Networks with Spike-Dependent Local Information." ICML 2023 Workshops: LLW, 2023.

Markdown

[Park et al. "Gradient Scaling on Deep Spiking Neural Networks with Spike-Dependent Local Information." ICML 2023 Workshops: LLW, 2023.](https://mlanthology.org/icmlw/2023/park2023icmlw-gradient/)

BibTeX

@inproceedings{park2023icmlw-gradient,
  title     = {{Gradient Scaling on Deep Spiking Neural Networks with Spike-Dependent Local Information}},
  author    = {Park, Seongsik and Jo, Jeonghee and Park, Jongkil and Jeong, Yeonjoo and Kim, Jaewook and Lee, Suyoun and Kwak, Joon young and Kim, Inho and Park, Jong-keuk and Lee, Kyeong seok and Weon, Hwang gyu and Jang, Hyun Jae},
  booktitle = {ICML 2023 Workshops: LLW},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/park2023icmlw-gradient/}
}