A Question-Focused Multi-Factor Attention Network for Question Answering

Abstract

Neural network models recently proposed for question answering (QA) primarily focus on capturing the passage-question relation. However, they have minimal capability to link relevant facts distributed across multiple sentences, which is crucial for deeper understanding, such as performing multi-sentence reasoning and co-reference resolution. They also do not explicitly focus on the question and answer type, which often plays a critical role in QA. In this paper, we propose a novel end-to-end question-focused multi-factor attention network for answer extraction. Multi-factor attentive encoding using tensor-based transformation aggregates meaningful facts even when they are located in multiple sentences. To implicitly infer the answer type, we also propose a max-attentional question aggregation mechanism to encode a question vector based on the important words in a question. During prediction, we incorporate sequence-level encoding of the first wh-word and its immediately following word as an additional source of question type information. Our proposed model achieves significant improvements over the best prior state-of-the-art results on three large-scale challenging QA datasets, namely NewsQA, TriviaQA, and SearchQA.
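
To make the two attention mechanisms named in the abstract more concrete, here is a minimal NumPy sketch of their structure. This is not the authors' implementation: the variable names, dimensions, the dot-product similarity used for question aggregation, and the random toy inputs are illustrative assumptions, and the actual model operates on recurrent token encodings with learned parameters and further components not shown here.

import numpy as np

def multi_factor_attention(V, W):
    """Multi-factor attentive encoding (sketch).

    V : (T, h)    passage token vectors
    W : (m, h, h) one bilinear matrix per attention factor
    Returns an attention-aggregated passage representation of shape (T, h).
    """
    # Factor-specific bilinear scores: F[f, t, s] = V[t] @ W[f] @ V[s]
    F = np.einsum('th,fhk,sk->fts', V, W, V)
    # Max-pool over the m factors, then row-wise softmax
    S = F.max(axis=0)                          # (T, T)
    S = S - S.max(axis=1, keepdims=True)       # numerical stability
    A = np.exp(S) / np.exp(S).sum(axis=1, keepdims=True)
    return A @ V                               # (T, h) aggregated facts

def max_attentional_question_vector(P, Q):
    """Max-attentional question aggregation (sketch).

    P : (T, h) passage token vectors, Q : (U, h) question token vectors.
    Each question word is scored by its maximum similarity to any passage
    word; the normalized scores weight the question word vectors.
    """
    sim = Q @ P.T                              # (U, T) similarity matrix
    k = sim.max(axis=1)                        # (U,) max over passage positions
    a = np.exp(k - k.max())
    a = a / a.sum()                            # softmax over question words
    return a @ Q                               # (h,) question vector

# Toy usage with random vectors
rng = np.random.default_rng(0)
T, U, h, m = 6, 4, 8, 3
V = rng.standard_normal((T, h))
Q = rng.standard_normal((U, h))
W = rng.standard_normal((m, h, h))
print(multi_factor_attention(V, W).shape)           # (6, 8)
print(max_attentional_question_vector(V, Q).shape)  # (8,)

In the full model, the aggregated passage representation and the max-attentional question vector are combined with the question-type encoding (the first wh-word and its immediately following word) before predicting the answer span; the sketch isolates only the two aggregation steps.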

Cite

Text

Kundu and Ng. "A Question-Focused Multi-Factor Attention Network for Question Answering." AAAI Conference on Artificial Intelligence, 2018. doi:10.1609/AAAI.V32I1.12065

Markdown

[Kundu and Ng. "A Question-Focused Multi-Factor Attention Network for Question Answering." AAAI Conference on Artificial Intelligence, 2018.](https://mlanthology.org/aaai/2018/kundu2018aaai-question/) doi:10.1609/AAAI.V32I1.12065

BibTeX

@inproceedings{kundu2018aaai-question,
  title     = {{A Question-Focused Multi-Factor Attention Network for Question Answering}},
  author    = {Kundu, Souvik and Ng, Hwee Tou},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2018},
  pages     = {5828--5835},
  doi       = {10.1609/AAAI.V32I1.12065},
  url       = {https://mlanthology.org/aaai/2018/kundu2018aaai-question/}
}