Rception: Wide and Deep Interaction Networks for Machine Reading Comprehension (Student Abstract)
Abstract
Most models for machine reading comprehension (MRC) rely on recurrent neural networks (RNNs) and attention mechanisms, though convolutional neural networks (CNNs) are also used for time efficiency. However, little attention has been paid to combining CNNs and RNNs in MRC. For deeper understanding, humans sometimes need local information about short phrases and sometimes need global context over long passages. In this paper, we propose a novel architecture, Rception, to capture and leverage both local deep information and global wide context. It fuses different kinds of networks and hyper-parameters horizontally rather than simply stacking them layer by layer vertically. Experiments on the Stanford Question Answering Dataset (SQuAD) show that our proposed architecture achieves good performance.
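The abstract's core idea, running different network types side by side ("horizontally") and fusing their outputs, rather than stacking them in a vertical pipeline, can be illustrated with a deliberately minimal toy sketch. This is not the paper's actual Rception implementation; the window average stands in for a CNN's local-phrase features and the decayed running state stands in for an RNN's global context, and all function names and parameters here are illustrative assumptions.

```python
def local_branch(seq, width=3):
    # Stand-in for a CNN branch: average each position's small window
    # of neighbors, capturing local (short-phrase) information.
    half = width // 2
    out = []
    for i in range(len(seq)):
        window = seq[max(0, i - half): i + half + 1]
        out.append(sum(window) / len(window))
    return out

def global_branch(seq, decay=0.5):
    # Stand-in for an RNN branch: an exponentially decayed running
    # state that carries global (long-passage) context left to right.
    out, h = [], 0.0
    for x in seq:
        h = decay * h + (1 - decay) * x
        out.append(h)
    return out

def horizontal_fusion(seq):
    # "Wide" fusion: run both branches in parallel on the same input
    # and concatenate their per-position features, instead of feeding
    # one network's output into the other (vertical stacking).
    local = local_branch(seq)
    glob = global_branch(seq)
    return [(l, g) for l, g in zip(local, glob)]

# Example: each position gets both a local and a global feature.
fused = horizontal_fusion([1.0, 2.0, 3.0])
```

The design point is that both branches see the raw input, so neither view is bottlenecked through the other; a downstream layer can then weight local versus global evidence per position.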
Cite
Text
Zhang and Wang. "Rception: Wide and Deep Interaction Networks for Machine Reading Comprehension (Student Abstract)." AAAI Conference on Artificial Intelligence, 2020. doi:10.1609/AAAI.V34I10.7266
Markdown
[Zhang and Wang. "Rception: Wide and Deep Interaction Networks for Machine Reading Comprehension (Student Abstract)." AAAI Conference on Artificial Intelligence, 2020.](https://mlanthology.org/aaai/2020/zhang2020aaai-rception/) doi:10.1609/AAAI.V34I10.7266
BibTeX
@inproceedings{zhang2020aaai-rception,
title = {{Rception: Wide and Deep Interaction Networks for Machine Reading Comprehension (Student Abstract)}},
author = {Zhang, Xuanyu and Wang, Zhichun},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2020},
pages = {13987-13988},
doi = {10.1609/AAAI.V34I10.7266},
url = {https://mlanthology.org/aaai/2020/zhang2020aaai-rception/}
}