Knowledge-Aware Dialogue Generation with Hybrid Attention (Student Abstract)
Abstract
Using commonsense knowledge to assist dialogue generation is a significant advance for the dialogue generation task. However, fully exploiting commonsense information remains a challenge, and the entities generated in a response often fail to match the information in the post. In this paper, we propose a dialogue generation model that uses hybrid attention to generate more rational entities. Given a user post, the model encodes relevant knowledge graphs retrieved from a knowledge base with a graph attention mechanism. It then encodes the post and the graphs jointly with a co-attention mechanism, which effectively captures complex related data and yields a better mutual understanding of post and knowledge. Experimental results show that our model is more effective than the current state-of-the-art model (CCM).
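The abstract itself ships no code, but the co-attention step it describes — attending from post tokens to graph entities and back through a shared affinity matrix — can be sketched roughly. The NumPy snippet below is a minimal illustrative sketch: the bilinear affinity form, the variable names, and the shapes are assumptions for illustration, not the authors' actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def co_attention(post, graph, W):
    """Toy co-attention between post tokens and graph entities.

    post:  (n, d) encoded post token vectors
    graph: (m, d) encoded knowledge-graph entity vectors
    W:     (d, d) affinity weights (learned in a real model; random here)
    Returns graph-aware post vectors and post-aware graph vectors.
    """
    # Affinity matrix: similarity of every token to every entity.
    A = post @ W @ graph.T                   # (n, m)
    # Attend over entities for each token, and over tokens for each entity.
    post_ctx = softmax(A, axis=1) @ graph    # (n, d) graph-aware post
    graph_ctx = softmax(A, axis=0).T @ post  # (m, d) post-aware graph
    return post_ctx, graph_ctx

rng = np.random.default_rng(0)
n, m, d = 5, 4, 8
post = rng.standard_normal((n, d))
graph = rng.standard_normal((m, d))
W = rng.standard_normal((d, d))
p_ctx, g_ctx = co_attention(post, graph, W)
print(p_ctx.shape, g_ctx.shape)
```

Each direction of attention reuses the same affinity matrix `A`, which is what lets the two representations inform each other.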
Cite
Text
Zhao et al. "Knowledge-Aware Dialogue Generation with Hybrid Attention (Student Abstract)." AAAI Conference on Artificial Intelligence, 2021. doi:10.1609/AAAI.V35I18.17972
Markdown
[Zhao et al. "Knowledge-Aware Dialogue Generation with Hybrid Attention (Student Abstract)." AAAI Conference on Artificial Intelligence, 2021.](https://mlanthology.org/aaai/2021/zhao2021aaai-knowledge/) doi:10.1609/AAAI.V35I18.17972
BibTeX
@inproceedings{zhao2021aaai-knowledge,
title = {{Knowledge-Aware Dialogue Generation with Hybrid Attention (Student Abstract)}},
author = {Zhao, Yaru and Cheng, Bo and Zhang, Yingying},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2021},
pages = {15951-15952},
doi = {10.1609/AAAI.V35I18.17972},
url = {https://mlanthology.org/aaai/2021/zhao2021aaai-knowledge/}
}