Self-Distillation Regularized Connectionist Temporal Classification Loss for Text Recognition: A Simple yet Effective Approach

Cite

Text

Zhang et al. "Self-Distillation Regularized Connectionist Temporal Classification Loss for Text Recognition: A Simple yet Effective Approach." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/AAAI.V38I7.28575

Markdown

[Zhang et al. "Self-Distillation Regularized Connectionist Temporal Classification Loss for Text Recognition: A Simple yet Effective Approach." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/zhang2024aaai-self/) doi:10.1609/AAAI.V38I7.28575

BibTeX

@inproceedings{zhang2024aaai-self,
  title     = {{Self-Distillation Regularized Connectionist Temporal Classification Loss for Text Recognition: A Simple yet Effective Approach}},
  author    = {Zhang, Ziyin and Lu, Ning and Liao, Minghui and Huang, Yongshuai and Li, Cheng and Wang, Min and Peng, Wei},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2024},
  pages     = {7441--7449},
  doi       = {10.1609/AAAI.V38I7.28575},
  url       = {https://mlanthology.org/aaai/2024/zhang2024aaai-self/}
}