Hierarchical Recurrent Attention Network for Response Generation

Abstract

We study multi-turn response generation in chatbots, where a response is generated according to a conversation context. Existing work has modeled the hierarchy of the context but does not pay enough attention to the fact that words and utterances in the context are differentially important. As a result, such models may lose important information in the context and generate irrelevant responses. We propose a hierarchical recurrent attention network (HRAN) to model both the hierarchy and the importance variance in a unified framework. In HRAN, a hierarchical attention mechanism attends to important parts within and among utterances with word-level attention and utterance-level attention, respectively.
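The two-level idea in the abstract can be sketched in plain NumPy: word-level attention summarizes each utterance into a vector, and utterance-level attention combines those summaries into a single context vector. This is a minimal sketch only; it uses unparameterized dot-product attention with a shared query, whereas the actual HRAN model conditions its attention on RNN hidden states and uses learned attention parameters. All names below (`attend`, `hierarchical_attention`) are illustrative, not from the paper.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, keys):
    # Dot-product attention: score each key against the query,
    # normalize, and return the weighted sum of keys plus the weights.
    weights = softmax(keys @ query)
    return weights @ keys, weights

def hierarchical_attention(context, query):
    # context: list of utterances, each an (n_words, d) array of word vectors.
    # query: (d,) vector standing in for the decoder state.
    # Word-level attention compresses each utterance to one (d,) vector;
    # utterance-level attention then weighs utterances against each other.
    utterance_vecs = np.stack([attend(query, utt)[0] for utt in context])
    context_vec, utt_weights = attend(query, utterance_vecs)
    return context_vec, utt_weights
```

In the full model the word-level and utterance-level weights are produced by learned scoring functions, so important words and important utterances receive higher weight instead of the fixed dot-product scores used here.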

Cite

Text

Xing et al. "Hierarchical Recurrent Attention Network for Response Generation." AAAI Conference on Artificial Intelligence, 2018. doi:10.1609/AAAI.V32I1.11965

Markdown

[Xing et al. "Hierarchical Recurrent Attention Network for Response Generation." AAAI Conference on Artificial Intelligence, 2018.](https://mlanthology.org/aaai/2018/xing2018aaai-hierarchical/) doi:10.1609/AAAI.V32I1.11965

BibTeX

@inproceedings{xing2018aaai-hierarchical,
  title     = {{Hierarchical Recurrent Attention Network for Response Generation}},
  author    = {Xing, Chen and Wu, Yu and Wu, Wei and Huang, Yalou and Zhou, Ming},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2018},
  pages     = {5610--5617},
  doi       = {10.1609/AAAI.V32I1.11965},
  url       = {https://mlanthology.org/aaai/2018/xing2018aaai-hierarchical/}
}