SKDBERT: Compressing BERT via Stochastic Knowledge Distillation

Cite

Text

Ding et al. "SKDBERT: Compressing BERT via Stochastic Knowledge Distillation." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/AAAI.V37I6.25902

Markdown

[Ding et al. "SKDBERT: Compressing BERT via Stochastic Knowledge Distillation." AAAI Conference on Artificial Intelligence, 2023.](https://mlanthology.org/aaai/2023/ding2023aaai-skdbert/) doi:10.1609/AAAI.V37I6.25902

BibTeX

@inproceedings{ding2023aaai-skdbert,
  title     = {{SKDBERT: Compressing BERT via Stochastic Knowledge Distillation}},
  author    = {Ding, Zixiang and Jiang, Guoqing and Zhang, Shuai and Guo, Lin and Lin, Wei},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2023},
  pages     = {7414--7422},
  doi       = {10.1609/AAAI.V37I6.25902},
  url       = {https://mlanthology.org/aaai/2023/ding2023aaai-skdbert/}
}