Improving Knowledge-Aware Dialogue Generation via Knowledge Base Question Answering
Abstract
Neural network models often struggle to incorporate commonsense knowledge into open-domain dialogue systems. In this paper, we propose a novel knowledge-aware dialogue generation model (called TransDG), which transfers question representation and knowledge matching abilities from the knowledge base question answering (KBQA) task to facilitate utterance understanding and factual knowledge selection for dialogue generation. In addition, we propose a response guiding attention and a multi-step decoding strategy to steer our model toward relevant features for response generation. Experiments on two benchmark datasets demonstrate that our model consistently outperforms the compared methods in generating informative and fluent dialogues. Our code is available at https://github.com/siat-nlp/TransDG.
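The abstract does not spell out the decoding procedure, so the following is a rough illustration only: a minimal PyTorch sketch of what a two-pass ("multi-step") decoder can look like, where a first-pass RNN drafts hidden states and a second pass attends over the draft to refine the output. All class names, layer choices, and dimensions here are hypothetical and are not the authors' TransDG implementation.

```python
import torch
import torch.nn as nn

class TwoStepDecoder(nn.Module):
    """Hypothetical sketch of multi-step decoding: a first-pass decoder
    drafts a response, and a second-pass decoder refines it while
    attending to the draft states. Illustrative only, not TransDG."""

    def __init__(self, vocab_size, hidden_size=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.first_pass = nn.GRU(hidden_size, hidden_size, batch_first=True)
        # The second pass consumes the token embedding concatenated with a
        # context vector attended over the first-pass (draft) states.
        self.second_pass = nn.GRU(2 * hidden_size, hidden_size, batch_first=True)
        self.attn = nn.MultiheadAttention(hidden_size, num_heads=4, batch_first=True)
        self.out = nn.Linear(hidden_size, vocab_size)

    def forward(self, tgt_tokens, enc_state):
        # enc_state: (1, batch, hidden) summary of the post (and any
        # selected knowledge) produced by an upstream encoder.
        emb = self.embed(tgt_tokens)                       # (B, T, H)
        draft_states, _ = self.first_pass(emb, enc_state)  # (B, T, H)
        # Refinement pass: each step attends over the draft states.
        ctx, _ = self.attn(draft_states, draft_states, draft_states)
        refined, _ = self.second_pass(torch.cat([emb, ctx], dim=-1), enc_state)
        return self.out(refined)                           # (B, T, V)

decoder = TwoStepDecoder(vocab_size=10000)
tokens = torch.randint(0, 10000, (2, 12))
enc = torch.zeros(1, 2, 256)
print(decoder(tokens, enc).shape)  # torch.Size([2, 12, 10000])
```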
Cite
Text
Wang et al. "Improving Knowledge-Aware Dialogue Generation via Knowledge Base Question Answering." AAAI Conference on Artificial Intelligence, 2020. doi:10.1609/AAAI.V34I05.6453
Markdown
[Wang et al. "Improving Knowledge-Aware Dialogue Generation via Knowledge Base Question Answering." AAAI Conference on Artificial Intelligence, 2020.](https://mlanthology.org/aaai/2020/wang2020aaai-improving/) doi:10.1609/AAAI.V34I05.6453
BibTeX
@inproceedings{wang2020aaai-improving,
title = {{Improving Knowledge-Aware Dialogue Generation via Knowledge Base Question Answering}},
author = {Wang, Jian and Liu, Junhao and Bi, Wei and Liu, Xiaojiang and He, Kejing and Xu, Ruifeng and Yang, Min},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2020},
pages = {9169--9176},
doi = {10.1609/AAAI.V34I05.6453},
url = {https://mlanthology.org/aaai/2020/wang2020aaai-improving/}
}