Query-Reduction Networks for Question Answering

Abstract

In this paper, we study the problem of question answering when reasoning over multiple facts is required. We propose the Query-Reduction Network (QRN), a variant of Recurrent Neural Network (RNN) that effectively handles both short-term (local) and long-term (global) sequential dependencies to reason over multiple facts. QRN treats the context sentences as a sequence of state-changing triggers, and reduces the original query to a more informed query as it observes each trigger (context sentence) through time. Our experiments show that QRN achieves state-of-the-art results on the bAbI QA and dialog tasks, and on a real goal-oriented dialog dataset. In addition, the QRN formulation allows parallelization over the RNN's time axis, saving an order of magnitude in time complexity for training and inference.
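The query-reduction recurrence described in the abstract can be sketched as a gated update whose gate and candidate depend only on the current sentence and the input query, not on the previous hidden state; that independence is what permits parallelization over the time axis. The sketch below is a minimal illustration under assumed gate/candidate forms and randomly initialized parameters (`Wz`, `Wh`, etc. are hypothetical names, not the paper's trained weights): a sequential loop, and an equivalent loop-free version that combines all steps with prefix products.

```python
import numpy as np

rng = np.random.default_rng(0)
d, T = 4, 5  # hidden size, number of context sentences (illustrative)

# Hypothetical parameters for the sketch, not the paper's trained model.
Wz = rng.standard_normal((d, d)) * 0.5
bz = rng.standard_normal(d) * 0.1
Wh = rng.standard_normal((d, 2 * d)) * 0.5
bh = rng.standard_normal(d) * 0.1

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def qrn_sequential(x, q):
    """Reduce the query step by step as each context sentence x[t] arrives."""
    h = np.zeros(d)
    outputs = []
    for t in range(T):
        z = sigmoid(Wz @ (x[t] * q) + bz)                        # update gate
        h_tilde = np.tanh(Wh @ np.concatenate([x[t], q]) + bh)   # candidate reduced query
        h = z * h_tilde + (1 - z) * h                            # gated recurrence
        outputs.append(h)
    return np.stack(outputs)

def qrn_parallel(x, q):
    """Same recurrence without a loop over time.

    Since z_t and h_tilde_t depend only on (x_t, q), they can be computed
    for all t at once; the recurrence h_t = z_t*h~_t + (1-z_t)*h_{t-1}
    unrolls to h_t = sum_i (prod_{j>i} (1-z_j)) * z_i * h~_i,
    which prefix products (cumprod) and a cumulative sum recover.
    """
    q_all = np.tile(q, (T, 1))
    z = sigmoid((x * q_all) @ Wz.T + bz)                         # all gates at once
    h_tilde = np.tanh(np.concatenate([x, q_all], axis=1) @ Wh.T + bh)
    P = np.cumprod(1 - z, axis=0)                                # prefix products of (1 - z)
    return P * np.cumsum(z * h_tilde / P, axis=0)

x = rng.standard_normal((T, d))   # context sentence vectors
q = rng.standard_normal(d)        # original query vector
assert np.allclose(qrn_sequential(x, q), qrn_parallel(x, q))
```

The `cumprod`/`cumsum` trick is one simple way to realize the time-axis parallelization the abstract mentions; on parallel hardware the same unrolled sum can be evaluated as a parallel scan.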

Cite

Text

Seo et al. "Query-Reduction Networks for Question Answering." International Conference on Learning Representations, 2017.

Markdown

[Seo et al. "Query-Reduction Networks for Question Answering." International Conference on Learning Representations, 2017.](https://mlanthology.org/iclr/2017/seo2017iclr-query/)

BibTeX

@inproceedings{seo2017iclr-query,
  title     = {{Query-Reduction Networks for Question Answering}},
  author    = {Seo, Min Joon and Min, Sewon and Farhadi, Ali and Hajishirzi, Hannaneh},
  booktitle = {International Conference on Learning Representations},
  year      = {2017},
  url       = {https://mlanthology.org/iclr/2017/seo2017iclr-query/}
}