TC-LIF: A Two-Compartment Spiking Neuron Model for Long-Term Sequential Modelling

Abstract

The identification of sensory cues associated with potential opportunities and dangers is frequently complicated by unrelated events that separate useful cues by long delays. As a result, it remains challenging for state-of-the-art spiking neural networks (SNNs) to establish long-term temporal dependencies between distant cues. To address this challenge, we propose a novel biologically inspired Two-Compartment Leaky Integrate-and-Fire spiking neuron model, dubbed TC-LIF. The proposed model incorporates carefully designed somatic and dendritic compartments that are tailored to facilitate learning long-term temporal dependencies. Furthermore, a theoretical analysis is provided to validate the effectiveness of TC-LIF in propagating error gradients over an extended temporal duration. Our experimental results, on a diverse range of temporal classification tasks, demonstrate the superior temporal classification capability, rapid training convergence, and high energy efficiency of the proposed TC-LIF model. This work therefore opens up a myriad of opportunities for solving challenging temporal processing tasks on emerging neuromorphic computing systems. Our code is publicly available at https://github.com/ZhangShimin1/TC-LIF.
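To give a feel for the two-compartment idea described in the abstract, the sketch below simulates a single neuron whose dendritic compartment integrates the input plus feedback from the soma, while the somatic compartment integrates the dendritic potential and spikes with a soft reset. The coupling coefficients `beta1`/`beta2` (with `beta1` negative so the somato-dendritic feedback is damping) and the exact update order are illustrative assumptions for this sketch, not the paper's exact TC-LIF equations; see the paper and linked repository for the actual model.

```python
def two_compartment_lif(inputs, beta1=-0.5, beta2=0.5, threshold=1.0):
    """Illustrative two-compartment LIF neuron (simplified, not the exact TC-LIF).

    Args:
        inputs: sequence of input currents, one per time step.
        beta1: somatic-to-dendritic coupling (assumed negative, i.e. damping feedback).
        beta2: dendritic-to-somatic coupling (assumed positive).
        threshold: firing threshold for the somatic compartment.
    Returns:
        list of output spikes (0.0 or 1.0), one per time step.
    """
    u_d = 0.0  # dendritic membrane potential
    u_s = 0.0  # somatic membrane potential
    spikes = []
    for i_t in inputs:
        u_d = u_d + beta1 * u_s + i_t   # dendrite: input current + somatic feedback
        u_s = u_s + beta2 * u_d         # soma: driven by the dendritic potential
        s_t = 1.0 if u_s >= threshold else 0.0
        u_s -= s_t * threshold          # soft reset: subtract threshold on spike
        spikes.append(s_t)
    return spikes


if __name__ == "__main__":
    # A constant supra-threshold drive eventually makes the soma fire.
    print(two_compartment_lif([0.6] * 10))
```

The slow accumulation through two coupled compartments, rather than a single leaky state, is what the abstract credits with carrying error gradients over long delays.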

Cite

Text

Zhang et al. "TC-LIF: A Two-Compartment Spiking Neuron Model for Long-Term Sequential Modelling." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/AAAI.V38I15.29625

Markdown

[Zhang et al. "TC-LIF: A Two-Compartment Spiking Neuron Model for Long-Term Sequential Modelling." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/zhang2024aaai-tc/) doi:10.1609/AAAI.V38I15.29625

BibTeX

@inproceedings{zhang2024aaai-tc,
  title     = {{TC-LIF: A Two-Compartment Spiking Neuron Model for Long-Term Sequential Modelling}},
  author    = {Zhang, Shimin and Yang, Qu and Ma, Chenxiang and Wu, Jibin and Li, Haizhou and Tan, Kay Chen},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2024},
  pages     = {16838--16847},
  doi       = {10.1609/AAAI.V38I15.29625},
  url       = {https://mlanthology.org/aaai/2024/zhang2024aaai-tc/}
}