Towards Minimal Supervision BERT-Based Grammar Error Correction (Student Abstract)

Abstract

Current grammatical error correction (GEC) models typically treat the task as sequence generation, which requires large amounts of annotated data and limits their applicability in data-scarce settings. We incorporate contextual information from a pre-trained language model to make better use of limited annotations and to benefit multilingual scenarios. Results show the strong potential of Bidirectional Encoder Representations from Transformers (BERT) for the grammatical error correction task.
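The abstract only sketches the idea at a high level. Below is a minimal, hedged illustration of how a pre-trained masked language model such as BERT can propose corrections without task-specific training; it is not the authors' implementation, and the model name, masking loop, and logit-margin rule are assumptions made for illustration only.

```python
# Illustrative sketch (assumed, not the paper's method): mask each token in
# turn and let BERT's masked-LM head propose a replacement when its top
# prediction is much more likely than the original token.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-cased")
model = BertForMaskedLM.from_pretrained("bert-base-cased")
model.eval()

def propose_corrections(sentence: str, margin: float = 4.0) -> str:
    """Return a corrected sentence; `margin` (in logits) is an illustrative
    hyperparameter, not a value taken from the paper."""
    enc = tokenizer(sentence, return_tensors="pt")
    input_ids = enc["input_ids"][0]
    corrected = input_ids.clone()
    # Skip the special [CLS] (first) and [SEP] (last) positions.
    for pos in range(1, input_ids.size(0) - 1):
        masked = input_ids.clone()
        masked[pos] = tokenizer.mask_token_id
        with torch.no_grad():
            logits = model(input_ids=masked.unsqueeze(0),
                           attention_mask=enc["attention_mask"]).logits[0, pos]
        best_id = int(logits.argmax())
        # Replace only when BERT strongly prefers another token here.
        if logits[best_id] - logits[input_ids[pos]] > margin:
            corrected[pos] = best_id
    return tokenizer.decode(corrected[1:-1])

print(propose_corrections("He go to school every day ."))
```

Because the correction signal comes entirely from the pre-trained model, a sketch like this needs no annotated GEC data, which is the minimal-supervision setting the abstract describes.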

Cite

Text

Li et al. "Towards Minimal Supervision BERT-Based Grammar Error Correction (Student Abstract)." AAAI Conference on Artificial Intelligence, 2020. doi:10.1609/AAAI.V34I10.7202

Markdown

[Li et al. "Towards Minimal Supervision BERT-Based Grammar Error Correction (Student Abstract)." AAAI Conference on Artificial Intelligence, 2020.](https://mlanthology.org/aaai/2020/li2020aaai-minimal/) doi:10.1609/AAAI.V34I10.7202

BibTeX

@inproceedings{li2020aaai-minimal,
  title     = {{Towards Minimal Supervision BERT-Based Grammar Error Correction (Student Abstract)}},
  author    = {Li, Yiyuan and Anastasopoulos, Antonios and Black, Alan W.},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2020},
  pages     = {13859--13860},
  doi       = {10.1609/AAAI.V34I10.7202},
  url       = {https://mlanthology.org/aaai/2020/li2020aaai-minimal/}
}