Deep Activity Propagation via Weight Initialization in Spiking Neural Networks

Abstract

Spiking Neural Networks (SNNs) offer advantages such as sparsity and ultra-low power consumption, making them a promising alternative to conventional artificial neural networks (ANNs). However, training deep SNNs is challenging due to the quantization of membrane potentials into binary spikes, which can cause information loss and vanishing spikes in deeper layers. Traditional weight initialization methods from ANNs are often applied to SNNs without accounting for their distinct computational properties. In this work, we derive an optimal weight initialization method tailored for SNNs, explicitly taking the quantization operation into account. We demonstrate through theoretical analysis and simulations with up to 100 layers that our method enables the propagation of activity in deep SNNs without loss of spikes. Experiments on MNIST confirm that the proposed initialization scheme leads to higher accuracy, faster convergence, and robustness against variations in network and neuron hyperparameters.
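The abstract's core claim, that spikes vanish in deep SNNs when the weight scale is too small relative to the firing threshold, can be illustrated with a toy simulation. The sketch below is not the paper's derivation: it propagates binary spikes through a stack of single-step integrate-and-fire layers under a Gaussian weight initialization whose standard-deviation multiplier `scale` is a hypothetical free parameter, and records the per-layer spike rate so that vanishing versus sustained activity can be compared.

```python
import numpy as np

def simulate_snn(num_layers=50, width=256, threshold=1.0, scale=3.0, seed=0):
    """Propagate binary spikes through a deep feedforward stack of
    integrate-and-fire neurons (one time step, no leak) and return the
    fraction of neurons spiking in each layer.

    `scale` multiplies a 1/sqrt(fan_in) Gaussian weight init; the optimal
    value derived in the paper is NOT reproduced here -- values passed in
    are illustrative assumptions only.
    """
    rng = np.random.default_rng(seed)
    spikes = (rng.random(width) < 0.5).astype(float)  # random input spike pattern
    rates = []
    for _ in range(num_layers):
        W = rng.normal(0.0, scale / np.sqrt(width), size=(width, width))
        potential = W @ spikes                          # membrane potential
        spikes = (potential >= threshold).astype(float)  # binary quantization
        rates.append(spikes.mean())
    return rates
```

With a sufficiently large `scale` the spike rate settles at a stable nonzero level across all layers, whereas a small `scale` drives the membrane potentials far below threshold and activity collapses to zero within a few layers, which is the failure mode the proposed initialization is designed to avoid.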

Cite

Text

Micheli et al. "Deep Activity Propagation via Weight Initialization in Spiking Neural Networks." NeurIPS 2024 Workshops: MLNCP, 2024.

Markdown

[Micheli et al. "Deep Activity Propagation via Weight Initialization in Spiking Neural Networks." NeurIPS 2024 Workshops: MLNCP, 2024.](https://mlanthology.org/neuripsw/2024/micheli2024neuripsw-deep/)

BibTeX

@inproceedings{micheli2024neuripsw-deep,
  title     = {{Deep Activity Propagation via Weight Initialization in Spiking Neural Networks}},
  author    = {Micheli, Aurora and Booij, Olaf and van Gemert, Jan and Tomen, Nergis},
  booktitle = {NeurIPS 2024 Workshops: MLNCP},
  year      = {2024},
  url       = {https://mlanthology.org/neuripsw/2024/micheli2024neuripsw-deep/}
}