Effective Deep Memory Networks for Distant Supervised Relation Extraction
Abstract
Distant supervised relation extraction (RE) has been an effective way of finding novel relational facts from text without labeled training data. Typically, it can be formalized as a multi-instance multi-label problem. In this paper, we introduce a novel neural approach for distant supervised RE with a specific focus on attention mechanisms. Unlike feature-based logistic regression models and compositional neural models such as CNNs, our approach includes two major attention-based memory components, which are capable of explicitly capturing the importance of each context word for modeling the representation of the entity pair, as well as the intrinsic dependencies between relations. Such importance degrees and dependency relationships are calculated with multiple computational layers, each of which is a neural attention model over an external memory. Experiments on real-world datasets show that our approach performs significantly and consistently better than various baselines.
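As a rough illustration of the "multiple computational layers, each a neural attention model over an external memory" described above, the following is a minimal sketch of stacked attention hops. All names, dimensions, and the scoring function are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_hop(memory, query, W, b):
    """One hypothetical attention layer: score each context-word vector in
    the external memory against the current query (e.g. an entity-pair
    representation), then return the attention-weighted memory read."""
    # memory: (n_words, d); query: (d,); W: (2*d,); b: scalar
    scores = np.array([np.tanh(W @ np.concatenate([m, query]) + b)
                       for m in memory])
    alpha = softmax(scores)   # importance weight of each context word
    return alpha @ memory     # weighted sum of memory slots, shape (d,)

rng = np.random.default_rng(0)
d, n = 4, 6
memory = rng.normal(size=(n, d))   # context-word embeddings (the memory)
query = rng.normal(size=d)         # initial entity-pair representation
W = rng.normal(size=2 * d)         # illustrative scoring weights
b = 0.0

# Stack several hops: each layer re-reads the memory with the updated query.
for _ in range(3):
    query = attention_hop(memory, query, W, b) + query  # residual update
```

After the final hop, `query` would serve as the aggregated representation fed to a relation classifier; the paper's second memory component (modeling relation dependencies) would apply an analogous attention over relation embeddings.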
Cite
Text
Feng et al. "Effective Deep Memory Networks for Distant Supervised Relation Extraction." International Joint Conference on Artificial Intelligence, 2017. doi:10.24963/IJCAI.2017/559
Markdown
[Feng et al. "Effective Deep Memory Networks for Distant Supervised Relation Extraction." International Joint Conference on Artificial Intelligence, 2017.](https://mlanthology.org/ijcai/2017/feng2017ijcai-effective/) doi:10.24963/IJCAI.2017/559
BibTeX
@inproceedings{feng2017ijcai-effective,
title = {{Effective Deep Memory Networks for Distant Supervised Relation Extraction}},
author = {Feng, Xiaocheng and Guo, Jiang and Qin, Bing and Liu, Ting and Liu, Yongjie},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2017},
pages = {4002--4008},
doi = {10.24963/IJCAI.2017/559},
url = {https://mlanthology.org/ijcai/2017/feng2017ijcai-effective/}
}