Exploring Entity Interactions for Few-Shot Relation Learning (Student Abstract)
Abstract
Few-shot relation learning refers to inferring facts for relations from only a few observed triples. Existing metric-learning methods mostly neglect entity interactions within and between triples. In this paper, we explore this kind of fine-grained semantic information and propose our model, TransAM. Specifically, we serialize reference entities and query entities into a sequence and apply a transformer structure with local-global attention to capture intra- and inter-triple entity interactions. Experiments on two public datasets under the 1-shot setting demonstrate the effectiveness of TransAM.
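The local-global attention described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the masking scheme, embedding sizes, and the additive combination of the two attention outputs are illustrative assumptions. Local attention restricts each entity to positions in its own triple (intra-triple interactions), while global attention spans the whole serialized sequence (inter-triple interactions).

```python
import numpy as np

def attention(q, k, v, mask):
    # Scaled dot-product attention; masked-out positions get a large
    # negative score so they receive ~zero weight after softmax.
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    scores = np.where(mask, scores, -1e9)
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w = w / w.sum(axis=-1, keepdims=True)
    return w @ v

# Serialize the entities of K reference triples plus the query pair
# into one sequence: [h1, t1, ..., hK, tK, hq, tq].
K, d = 2, 8  # hypothetical sizes: 2 reference triples, dim-8 embeddings
rng = np.random.default_rng(0)
seq = rng.normal(size=(2 * (K + 1), d))  # toy entity embeddings

n = seq.shape[0]
# Local mask: each position attends only within its own triple.
triple_id = np.repeat(np.arange(K + 1), 2)
local_mask = triple_id[:, None] == triple_id[None, :]
# Global mask: every position attends to every other position.
global_mask = np.ones((n, n), dtype=bool)

local_out = attention(seq, seq, seq, local_mask)    # intra-triple
global_out = attention(seq, seq, seq, global_mask)  # inter-triple
out = local_out + global_out  # one assumed way to combine both signals
print(out.shape)  # (6, 8)
```

In the paper's setting, the resulting entity representations would then be scored against the query to decide whether it expresses the few-shot relation; that matching step is omitted here.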
Cite
Text
Liang et al. "Exploring Entity Interactions for Few-Shot Relation Learning (Student Abstract)." AAAI Conference on Artificial Intelligence, 2022. doi:10.1609/AAAI.V36I11.21638
Markdown
[Liang et al. "Exploring Entity Interactions for Few-Shot Relation Learning (Student Abstract)." AAAI Conference on Artificial Intelligence, 2022.](https://mlanthology.org/aaai/2022/liang2022aaai-exploring/) doi:10.1609/AAAI.V36I11.21638
BibTeX
@inproceedings{liang2022aaai-exploring,
title = {{Exploring Entity Interactions for Few-Shot Relation Learning (Student Abstract)}},
author = {Liang, Yi and Zhao, Shuai and Cheng, Bo and Yin, Yuwei and Yang, Hao},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2022},
pages = {13003-13004},
doi = {10.1609/AAAI.V36I11.21638},
url = {https://mlanthology.org/aaai/2022/liang2022aaai-exploring/}
}