SpikingBERT: Distilling BERT to Train Spiking Language Models Using Implicit Differentiation

Cite

Text

Bal and Sengupta. "SpikingBERT: Distilling BERT to Train Spiking Language Models Using Implicit Differentiation." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/AAAI.V38I10.28975

Markdown

[Bal and Sengupta. "SpikingBERT: Distilling BERT to Train Spiking Language Models Using Implicit Differentiation." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/bal2024aaai-spikingbert/) doi:10.1609/AAAI.V38I10.28975

BibTeX

@inproceedings{bal2024aaai-spikingbert,
  title     = {{SpikingBERT: Distilling BERT to Train Spiking Language Models Using Implicit Differentiation}},
  author    = {Bal, Malyaban and Sengupta, Abhronil},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2024},
  pages     = {10998--11006},
  doi       = {10.1609/AAAI.V38I10.28975},
  url       = {https://mlanthology.org/aaai/2024/bal2024aaai-spikingbert/}
}