Pre-Training with Scientific Text Improves Educational Question Generation (Student Abstract)
Abstract
With the boom of digital educational materials and scalable e-learning systems, the potential for realising AI-assisted personalised learning has skyrocketed. In this landscape, the automatic generation of educational questions will play a key role, enabling scalable self-assessment when a global population is manoeuvring their personalised learning journeys. We develop EduQG, a novel educational question generation model built by adapting a large language model. Our initial experiments demonstrate that EduQG can produce superior educational questions by pre-training on scientific text.
Cite
Text
Muse et al. "Pre-Training with Scientific Text Improves Educational Question Generation (Student Abstract)." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/AAAI.V37I13.27004
Markdown
[Muse et al. "Pre-Training with Scientific Text Improves Educational Question Generation (Student Abstract)." AAAI Conference on Artificial Intelligence, 2023.](https://mlanthology.org/aaai/2023/muse2023aaai-pre/) doi:10.1609/AAAI.V37I13.27004
BibTeX
@inproceedings{muse2023aaai-pre,
title = {{Pre-Training with Scientific Text Improves Educational Question Generation (Student Abstract)}},
author = {Muse, Hamze and Bulathwela, Sahan and Yilmaz, Emine},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2023},
pages = {16288--16289},
doi = {10.1609/AAAI.V37I13.27004},
url = {https://mlanthology.org/aaai/2023/muse2023aaai-pre/}
}