Topic-to-Essay Generation with Neural Networks
Abstract
We focus on essay generation, a challenging task that produces paragraph-level text covering multiple topics. Progress towards understanding different topics and expressing diversity in this task requires more powerful generators and richer training and evaluation resources. To address this, we develop a multi-topic aware long short-term memory (MTA-LSTM) network. In this model, we maintain a novel multi-topic coverage vector, which learns the weight of each topic and is sequentially updated during the decoding process. This vector is then fed to an attention model to guide the generator. Moreover, we automatically construct two paragraph-level Chinese essay corpora, comprising 305,000 essay paragraphs and 55,000 question-and-answer pairs. Empirical results show that our approach obtains a much better BLEU score than various baselines. Furthermore, human judgment shows that MTA-LSTM is able to generate essays that are not only coherent but also closely related to the input topics.
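The sketch below illustrates one plausible reading of the coverage-guided topic attention described in the abstract: a per-topic coverage vector biases attention over the topic embeddings and is decreased as topics are attended to, so already-covered topics fade out during decoding. This is not the authors' code; the module name, weight matrices, dimensions, and the exact coverage update rule are assumptions for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopicCoverageAttention(nn.Module):
    """Hypothetical single attention step with a multi-topic coverage vector."""

    def __init__(self, hidden_size: int, topic_size: int):
        super().__init__()
        self.W_h = nn.Linear(hidden_size, hidden_size, bias=False)  # projects decoder state
        self.W_t = nn.Linear(topic_size, hidden_size, bias=False)   # projects topic embeddings
        self.v = nn.Linear(hidden_size, 1, bias=False)              # scores each topic

    def forward(self, dec_state, topic_embs, coverage):
        # dec_state:  (batch, hidden)         current decoder hidden state
        # topic_embs: (batch, k, topic_size)  embeddings of the k input topics
        # coverage:   (batch, k)              remaining attention budget per topic
        scores = self.v(torch.tanh(
            self.W_h(dec_state).unsqueeze(1) + self.W_t(topic_embs)
        )).squeeze(-1)                                   # (batch, k)
        # Re-weight attention by coverage so heavily used topics receive less focus.
        attn = F.softmax(scores, dim=-1) * coverage
        attn = attn / attn.sum(dim=-1, keepdim=True).clamp_min(1e-8)
        context = torch.bmm(attn.unsqueeze(1), topic_embs).squeeze(1)  # (batch, topic_size)
        # Assumed update rule: subtract the attention spent on each topic this step.
        new_coverage = (coverage - attn).clamp_min(0.0)
        return context, attn, new_coverage

# Usage sketch: coverage starts at 1 for every topic and is threaded through decoding steps.
attn_layer = TopicCoverageAttention(hidden_size=512, topic_size=128)
dec_state = torch.zeros(2, 512)
topic_embs = torch.randn(2, 5, 128)
coverage = torch.ones(2, 5)
context, attn, coverage = attn_layer(dec_state, topic_embs, coverage)
```

In this reading, the context vector would be concatenated with the word embedding at each decoder step to guide generation toward uncovered topics.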
Cite
Text
Feng et al. "Topic-to-Essay Generation with Neural Networks." International Joint Conference on Artificial Intelligence, 2018. doi:10.24963/IJCAI.2018/567
Markdown
[Feng et al. "Topic-to-Essay Generation with Neural Networks." International Joint Conference on Artificial Intelligence, 2018.](https://mlanthology.org/ijcai/2018/feng2018ijcai-topic/) doi:10.24963/IJCAI.2018/567
BibTeX
@inproceedings{feng2018ijcai-topic,
title = {{Topic-to-Essay Generation with Neural Networks}},
author = {Feng, Xiaocheng and Liu, Ming and Liu, Jiahao and Qin, Bing and Sun, Yibo and Liu, Ting},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2018},
pages = {4078-4084},
doi = {10.24963/IJCAI.2018/567},
url = {https://mlanthology.org/ijcai/2018/feng2018ijcai-topic/}
}