Attention as Relation: Learning Supervised Multi-Head Self-Attention for Relation Extraction
Abstract
Joint entity and relation extraction is critical for many natural language processing (NLP) tasks and has attracted increasing research interest. However, it still faces the challenges of identifying overlapping relation triplets along with complete entity boundaries and detecting multiple relation types. In this paper, we propose an attention-based joint model, which mainly contains an entity extraction module and a relation detection module, to address these challenges. The key to our model is a supervised multi-head self-attention mechanism that serves as the relation detection module and learns the token-level correlation for each relation type separately. With this attention mechanism, our model can effectively identify overlapping relations and flexibly predict each relation type with its corresponding intensity. To verify the effectiveness of our model, we conduct comprehensive experiments on two benchmark datasets. The experimental results demonstrate that our model achieves state-of-the-art performance.
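The abstract describes the core idea at a high level: one attention head per relation type, with the attention score between two tokens read off as the intensity of that relation holding between them. The sketch below (PyTorch, not the authors' released code; all class, parameter, and hyperparameter names are illustrative assumptions) shows how such a supervised per-relation attention layer could be wired on top of a sentence encoder.

```python
# Minimal sketch of supervised multi-head self-attention for relation detection.
# Each "head" corresponds to one relation type; scores[b, r, i, j] is treated as
# the intensity of relation r between tokens i and j and is supervised directly
# (e.g., with a binary cross-entropy loss against gold token-pair links).
import torch
import torch.nn as nn


class RelationAttention(nn.Module):
    def __init__(self, hidden_size: int, num_relations: int, head_size: int = 64):
        super().__init__()
        self.num_relations = num_relations
        self.head_size = head_size
        # One query/key projection per relation type (packed into a single Linear).
        self.query = nn.Linear(hidden_size, num_relations * head_size)
        self.key = nn.Linear(hidden_size, num_relations * head_size)

    def forward(self, token_states: torch.Tensor) -> torch.Tensor:
        # token_states: (batch, seq_len, hidden_size) from a sentence encoder.
        b, n, _ = token_states.shape
        q = self.query(token_states).view(b, n, self.num_relations, self.head_size)
        k = self.key(token_states).view(b, n, self.num_relations, self.head_size)
        # Scaled dot-product attention scores, one matrix per relation type.
        scores = torch.einsum("bird,bjrd->brij", q, k) / self.head_size ** 0.5
        return torch.sigmoid(scores)


# Usage sketch: scores above a threshold yield (head token, tail token, relation)
# candidates, so overlapping and multi-type relations come out of separate heads.
model = RelationAttention(hidden_size=256, num_relations=5)
out = model(torch.randn(2, 10, 256))  # shape: (2, 5, 10, 10)
```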
Cite
Text
Liu et al. "Attention as Relation: Learning Supervised Multi-Head Self-Attention for Relation Extraction." International Joint Conference on Artificial Intelligence, 2020. doi:10.24963/IJCAI.2020/524
Markdown
[Liu et al. "Attention as Relation: Learning Supervised Multi-Head Self-Attention for Relation Extraction." International Joint Conference on Artificial Intelligence, 2020.](https://mlanthology.org/ijcai/2020/liu2020ijcai-attention/) doi:10.24963/IJCAI.2020/524
BibTeX
@inproceedings{liu2020ijcai-attention,
title = {{Attention as Relation: Learning Supervised Multi-Head Self-Attention for Relation Extraction}},
author = {Liu, Jie and Chen, Shaowei and Wang, Bingquan and Zhang, Jiaxin and Li, Na and Xu, Tong},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2020},
pages = {3787--3793},
doi = {10.24963/IJCAI.2020/524},
url = {https://mlanthology.org/ijcai/2020/liu2020ijcai-attention/}
}