Training Spiking Neural Networks with Local Tandem Learning

Abstract

Spiking neural networks (SNNs) are shown to be more biologically plausible and energy efficient than their predecessors. However, there is a lack of an efficient and generalized training method for deep SNNs, especially for deployment on analog computing substrates. In this paper, we put forward a generalized learning rule, termed Local Tandem Learning (LTL). The LTL rule follows the teacher-student learning approach by mimicking the intermediate feature representations of a pre-trained artificial neural network (ANN). By decoupling the learning of network layers and leveraging highly informative supervisory signals, we demonstrate rapid network convergence within five training epochs on the CIFAR-10 dataset while maintaining low computational complexity. Our experimental results also show that the SNNs thus trained can achieve accuracies comparable to their teacher ANNs on the CIFAR-10, CIFAR-100, and Tiny ImageNet datasets. Moreover, the proposed LTL rule is hardware friendly. It can be easily implemented on-chip to perform fast parameter calibration and provide robustness against notorious device non-ideality issues. It therefore opens up a myriad of opportunities for training and deploying SNNs on ultra-low-power mixed-signal neuromorphic computing chips.
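
To make the teacher-student idea described above concrete, the following is a minimal PyTorch sketch of layer-wise "local tandem" style training: each spiking layer regresses the firing-rate features of the corresponding layer of a pre-trained ANN, and gradients never cross layer boundaries. The network sizes, surrogate gradient, time window, loss, and optimizer settings are illustrative assumptions, not details taken from the paper.

    # Minimal sketch of layer-wise local tandem-style training.
    # Assumptions (not from the paper): IF neurons with a rectangular surrogate
    # gradient, rate coding over T steps, MSE as the local loss, Adam optimizer.
    import torch
    import torch.nn as nn

    class SurrogateSpike(torch.autograd.Function):
        """Heaviside spike in the forward pass, rectangular surrogate gradient."""
        @staticmethod
        def forward(ctx, v):
            ctx.save_for_backward(v)
            return (v > 0).float()

        @staticmethod
        def backward(ctx, grad_out):
            (v,) = ctx.saved_tensors
            # Pass gradients only near the firing threshold.
            return grad_out * (v.abs() < 0.5).float()

    class IFLayer(nn.Module):
        """One fully connected layer of integrate-and-fire neurons."""
        def __init__(self, in_dim, out_dim, T=8):
            super().__init__()
            self.fc, self.T = nn.Linear(in_dim, out_dim), T

        def forward(self, x):
            v = torch.zeros(x.shape[0], self.fc.out_features, device=x.device)
            rate = 0.0
            for _ in range(self.T):          # constant input current over T steps
                v = v + self.fc(x)
                s = SurrogateSpike.apply(v - 1.0)
                v = v - s                    # soft reset by the threshold
                rate = rate + s / self.T
            return rate                      # firing rate as the layer's feature

    # Hypothetical teacher/student stacks with two hidden layers (784-256-128).
    teacher = nn.Sequential(nn.Linear(784, 256), nn.ReLU(),
                            nn.Linear(256, 128), nn.ReLU()).eval()
    student = nn.ModuleList([IFLayer(784, 256), IFLayer(256, 128)])
    opts = [torch.optim.Adam(layer.parameters(), lr=1e-3) for layer in student]

    def local_tandem_step(x):
        """One step: each SNN layer mimics its teacher layer's activations."""
        with torch.no_grad():                # teacher targets need no gradient
            t1 = torch.relu(teacher[0](x))
            t2 = torch.relu(teacher[2](t1))
        targets, inp = [t1, t2], x
        for layer, opt, tgt in zip(student, opts, targets):
            out = layer(inp)
            loss = nn.functional.mse_loss(out, tgt)
            opt.zero_grad(); loss.backward(); opt.step()
            inp = out.detach()               # decouple layers: no cross-layer gradient

The `detach()` call is what makes the rule local in this sketch: each layer receives its own supervisory target from the teacher and is updated independently, which is what allows the decoupled, rapidly converging training the abstract describes.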

Cite

Text

Yang et al. "Training Spiking Neural Networks with Local Tandem Learning." Neural Information Processing Systems, 2022.

Markdown

[Yang et al. "Training Spiking Neural Networks with Local Tandem Learning." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/yang2022neurips-training/)

BibTeX

@inproceedings{yang2022neurips-training,
  title     = {{Training Spiking Neural Networks with Local Tandem Learning}},
  author    = {Yang, Qu and Wu, Jibin and Zhang, Malu and Chua, Yansong and Wang, Xinchao and Li, Haizhou},
  booktitle = {Neural Information Processing Systems},
  year      = {2022},
  url       = {https://mlanthology.org/neurips/2022/yang2022neurips-training/}
}