Contextualized Non-Local Neural Networks for Sequence Learning
Abstract
Recently, a large number of neural mechanisms and models have been proposed for sequence learning, among which self-attention, as exemplified by the Transformer model, and graph neural networks (GNNs) have attracted much attention. In this paper, we propose an approach that combines and draws on the complementary strengths of these two methods. Specifically, we propose contextualized non-local neural networks (CN3), which can both dynamically construct a task-specific structure of a sentence and leverage rich local dependencies within a particular neighbourhood. Experimental results on ten NLP tasks in text classification, semantic matching, and sequence labelling show that our proposed model outperforms competitive baselines and discovers task-specific dependency structures, thus providing better interpretability to users.
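For a concrete picture of the general idea the abstract describes, below is a minimal PyTorch sketch: a non-local branch that learns a soft, task-specific graph over the tokens of a sentence via attention, fused with a local branch that models dependencies within a fixed neighbourhood. All class and parameter names here are illustrative assumptions, not the authors' CN3 implementation.

```python
# A hedged sketch of combining non-local (attention-based) structure
# learning with local neighbourhood modelling; names are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class NonLocalWithLocalContext(nn.Module):
    def __init__(self, d_model: int, kernel_size: int = 3):
        super().__init__()
        self.query = nn.Linear(d_model, d_model)
        self.key = nn.Linear(d_model, d_model)
        self.value = nn.Linear(d_model, d_model)
        # A depthwise 1-D convolution captures local dependencies
        # within a fixed window around each token.
        self.local = nn.Conv1d(
            d_model, d_model, kernel_size,
            padding=kernel_size // 2, groups=d_model,
        )
        self.out = nn.Linear(2 * d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        q, k, v = self.query(x), self.key(x), self.value(x)
        scores = q @ k.transpose(1, 2) / (x.size(-1) ** 0.5)
        # Soft adjacency over the sentence: a dynamically constructed,
        # task-specific structure learned from the data.
        adj = F.softmax(scores, dim=-1)
        non_local = adj @ v
        # Local branch: convolve over the sequence dimension.
        local = self.local(x.transpose(1, 2)).transpose(1, 2)
        return self.out(torch.cat([non_local, local], dim=-1))


if __name__ == "__main__":
    layer = NonLocalWithLocalContext(d_model=64)
    tokens = torch.randn(2, 10, 64)  # 2 sentences, 10 tokens each
    print(layer(tokens).shape)  # torch.Size([2, 10, 64])
```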
Cite
Text
Liu et al. "Contextualized Non-Local Neural Networks for Sequence Learning." AAAI Conference on Artificial Intelligence, 2019. doi:10.1609/AAAI.V33I01.33016762
Markdown
[Liu et al. "Contextualized Non-Local Neural Networks for Sequence Learning." AAAI Conference on Artificial Intelligence, 2019.](https://mlanthology.org/aaai/2019/liu2019aaai-contextualized/) doi:10.1609/AAAI.V33I01.33016762
BibTeX
@inproceedings{liu2019aaai-contextualized,
title = {{Contextualized Non-Local Neural Networks for Sequence Learning}},
author = {Liu, Pengfei and Chang, Shuaichen and Huang, Xuanjing and Tang, Jian and Cheung, Jackie Chi Kit},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2019},
pages = {6762--6769},
doi = {10.1609/AAAI.V33I01.33016762},
url = {https://mlanthology.org/aaai/2019/liu2019aaai-contextualized/}
}