Incorporating Structured Commonsense Knowledge in Story Completion

Abstract

The ability to select an appropriate story ending is the first step towards perfect narrative comprehension. Story ending prediction requires not only the explicit clues within the context, but also the implicit knowledge (such as commonsense) to construct a reasonable and consistent story. However, most previous approaches do not explicitly use background commonsense knowledge. We present a neural story ending selection model that integrates three types of information: narrative sequence, sentiment evolution and commonsense knowledge. Experiments show that our model outperforms state-of-the-art approaches on a public dataset, ROCStory Cloze Task (Mostafazadeh et al. 2017), and the performance gain from incorporating the commonsense knowledge is significant.
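
The abstract describes a model that scores candidate endings by combining narrative-sequence, sentiment-evolution, and commonsense-knowledge signals. The sketch below illustrates one way such a late-fusion scorer could be put together; the module names, feature dimensions, and fusion MLP are illustrative assumptions, not the architecture reported in the paper.

# Hypothetical sketch: combine three precomputed feature vectors (narrative,
# sentiment, knowledge) and score each candidate ending. Dimensions and the
# fusion MLP are assumptions for illustration only.
import torch
import torch.nn as nn

class EndingScorer(nn.Module):
    """Scores a candidate ending from three precomputed feature vectors."""
    def __init__(self, narrative_dim=300, sentiment_dim=32, knowledge_dim=100, hidden_dim=128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(narrative_dim + sentiment_dim + knowledge_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),  # one scalar score per candidate ending
        )

    def forward(self, narrative, sentiment, knowledge):
        # Concatenate the three information sources, then score the ending.
        features = torch.cat([narrative, sentiment, knowledge], dim=-1)
        return self.mlp(features).squeeze(-1)

# Usage: pick the higher-scoring of two candidate endings.
scorer = EndingScorer()
n, s, k = torch.randn(2, 300), torch.randn(2, 32), torch.randn(2, 100)
scores = scorer(n, s, k)            # shape: (2,)
predicted = scores.argmax().item()  # index of the chosen ending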

Cite

Text

Chen et al. "Incorporating Structured Commonsense Knowledge in Story Completion." AAAI Conference on Artificial Intelligence, 2019. doi:10.1609/AAAI.V33I01.33016244

Markdown

[Chen et al. "Incorporating Structured Commonsense Knowledge in Story Completion." AAAI Conference on Artificial Intelligence, 2019.](https://mlanthology.org/aaai/2019/chen2019aaai-incorporating/) doi:10.1609/AAAI.V33I01.33016244

BibTeX

@inproceedings{chen2019aaai-incorporating,
  title     = {{Incorporating Structured Commonsense Knowledge in Story Completion}},
  author    = {Chen, Jiaao and Chen, Jianshu and Yu, Zhou},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2019},
  pages     = {6244--6251},
  doi       = {10.1609/AAAI.V33I01.33016244},
  url       = {https://mlanthology.org/aaai/2019/chen2019aaai-incorporating/}
}