Multi-Dimension Attention for Multi-Turn Dialog Generation (Student Abstract)

Abstract

We present a generative neural model for open, multi-turn dialog response generation. The model relies on a multi-dimension attention mechanism to capture the semantic interdependence between the generated words and the conversational history, thereby identifying the words and utterances that influence each generated response. We evaluate the model on the wide-scope DailyDialog corpus and compare it with two other generative neural architectures using several automatic metrics. The results show that the proposed model improves on the state of the art in generation accuracy, and that its multi-dimension attention enables finer-grained tracking of the influential words and utterances in the dialog history, making the generated responses more explainable.
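The abstract does not spell out the attention mechanism, and no code accompanies this listing. As a rough illustration only, the sketch below shows one common way such multi-dimension attention over a dialog history can be realized, with a word-level pass inside each utterance and an utterance-level pass over the resulting summaries; the function name, the use of dot-product scoring, and the NumPy setting are all assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_dimension_attention(history, decoder_state):
    """Illustrative two-level attention over a dialog history (assumed form).

    history: list of arrays, one per past utterance, each of shape (n_words_i, d)
             holding word embeddings for that turn.
    decoder_state: array of shape (d,), the current decoder hidden state.

    Returns a context vector of shape (d,), plus the word-level and
    utterance-level attention weights, which can be inspected to see which
    words and utterances influenced the token being generated.
    """
    utterance_summaries = []
    word_weights = []
    # Word-level attention: score every word of each utterance against the
    # decoder state and build one weighted summary vector per utterance.
    for utt in history:
        scores = utt @ decoder_state            # (n_words_i,)
        alpha = softmax(scores)                 # word attention weights
        word_weights.append(alpha)
        utterance_summaries.append(alpha @ utt) # (d,)
    U = np.stack(utterance_summaries)           # (n_utterances, d)
    # Utterance-level attention: weigh the utterance summaries against the
    # same decoder state to locate the influential turns in the history.
    beta = softmax(U @ decoder_state)           # utterance attention weights
    context = beta @ U                          # (d,)
    return context, word_weights, beta

# Toy usage: a 3-turn history with random embeddings of dimension 8.
rng = np.random.default_rng(0)
history = [rng.standard_normal((n, 8)) for n in (5, 7, 4)]
state = rng.standard_normal(8)
context, word_w, utt_w = multi_dimension_attention(history, state)
print(utt_w)  # contribution of each past utterance to this generation step
```

Inspecting `word_w` and `utt_w` at each decoding step is what would support the explainability claim: the weights point to the specific words and utterances that drove a given response.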

Cite

Text

Belainine et al. "Multi-Dimension Attention for Multi-Turn Dialog Generation (Student Abstract)." AAAI Conference on Artificial Intelligence, 2022. doi:10.1609/AAAI.V36I11.21591

Markdown

[Belainine et al. "Multi-Dimension Attention for Multi-Turn Dialog Generation (Student Abstract)." AAAI Conference on Artificial Intelligence, 2022.](https://mlanthology.org/aaai/2022/belainine2022aaai-multi/) doi:10.1609/AAAI.V36I11.21591

BibTeX

@inproceedings{belainine2022aaai-multi,
  title     = {{Multi-Dimension Attention for Multi-Turn Dialog Generation (Student Abstract)}},
  author    = {Belainine, Billal and Sadat, Fatiha and Boukadoum, Mounir},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2022},
  pages     = {12909--12910},
  doi       = {10.1609/AAAI.V36I11.21591},
  url       = {https://mlanthology.org/aaai/2022/belainine2022aaai-multi/}
}