Towards Complex Scenarios: Building End-to-End Task-Oriented Dialogue System Across Multiple Knowledge Bases
Abstract
With the success of the sequence-to-sequence model, end-to-end task-oriented dialogue systems (EToDs) have made remarkable progress. However, most existing EToDs are limited to the single-KB setting, where each dialogue can be supported by one KB, which still falls short of the requirements of more complex applications (the multi-KB setting). In this work, we first empirically show that existing single-KB EToDs fail in multi-KB settings, which require models to reason across multiple KBs. To address this issue, we take the first step toward the multi-KB scenario in EToDs and introduce a KB-over-KB Heterogeneous Graph Attention Network (KoK-HAN) that enables the model to reason over multiple KBs. Its core module is a triple-connection graph interaction layer that models interaction information at different levels of granularity across KBs (i.e., intra-KB, inter-KB, and dialogue-KB connections). Experimental results confirm the superiority of our model for reasoning over multiple KBs.
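The paper's implementation is not shown on this page, but the triple-connection idea can be sketched as one heterogeneous attention layer with a separate attention head per connection type (intra-KB, inter-KB, dialogue-KB), whose outputs are fused per node. The sketch below is a minimal, hypothetical illustration under these assumptions; the class name `TripleConnectionLayer`, the mask dictionary, and the concatenation-based fusion are our own illustrative choices, not the authors' architecture.

```python
# Illustrative sketch only: heterogeneous attention over three edge types.
# Names, shapes, and the fusion scheme are assumptions, not from the paper.
import torch
import torch.nn as nn

class TripleConnectionLayer(nn.Module):
    """One attention layer over intra-KB, inter-KB, and dialogue-KB edges."""
    def __init__(self, dim: int):
        super().__init__()
        # Separate query/key/value projections per connection type.
        self.proj = nn.ModuleDict({
            etype: nn.ModuleDict({k: nn.Linear(dim, dim) for k in ("q", "k", "v")})
            for etype in ("intra_kb", "inter_kb", "dialogue_kb")
        })
        self.out = nn.Linear(3 * dim, dim)  # fuse the three connection views

    def forward(self, h: torch.Tensor, masks: dict) -> torch.Tensor:
        # h: (num_nodes, dim) node states (KB entities + dialogue tokens).
        # masks[etype]: (num_nodes, num_nodes) bool adjacency, True = edge.
        per_type = []
        for etype, p in self.proj.items():
            q, k, v = p["q"](h), p["k"](h), p["v"](h)
            scores = q @ k.t() / h.size(-1) ** 0.5
            scores = scores.masked_fill(~masks[etype], float("-inf"))
            attn = torch.softmax(scores, dim=-1)
            attn = torch.nan_to_num(attn)  # rows with no edges of this type
            per_type.append(attn @ v)
        return self.out(torch.cat(per_type, dim=-1))

# Hypothetical usage: 10 nodes, fully connected under every edge type.
layer = TripleConnectionLayer(dim=64)
h = torch.randn(10, 64)
masks = {t: torch.ones(10, 10, dtype=torch.bool)
         for t in ("intra_kb", "inter_kb", "dialogue_kb")}
h = layer(h, masks)
```

In practice the three masks would differ: intra-KB edges link entities within one KB, inter-KB edges link related entities across KBs, and dialogue-KB edges link dialogue tokens to KB entities, which is what lets attention propagate evidence across multiple KBs.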
Cite

Text
Qin et al. "Towards Complex Scenarios: Building End-to-End Task-Oriented Dialogue System Across Multiple Knowledge Bases." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/AAAI.V37I11.26581

Markdown
[Qin et al. "Towards Complex Scenarios: Building End-to-End Task-Oriented Dialogue System Across Multiple Knowledge Bases." AAAI Conference on Artificial Intelligence, 2023.](https://mlanthology.org/aaai/2023/qin2023aaai-complex/) doi:10.1609/AAAI.V37I11.26581

BibTeX
@inproceedings{qin2023aaai-complex,
title = {{Towards Complex Scenarios: Building End-to-End Task-Oriented Dialogue System Across Multiple Knowledge Bases}},
author = {Qin, Libo and Li, Zhouyang and Yu, Qiying and Wang, Lehan and Che, Wanxiang},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2023},
  pages = {13483--13491},
doi = {10.1609/AAAI.V37I11.26581},
url = {https://mlanthology.org/aaai/2023/qin2023aaai-complex/}
}