EntQA: Entity Linking as Question Answering
Abstract
A conventional approach to entity linking is to first find mentions in a given document and then infer their underlying entities in the knowledge base. A well-known limitation of this approach is that it requires finding mentions without knowing their entities, which is unnatural and difficult. We present $\textbf{EntQA}$, which stands for $\mbox{\textbf{Ent}ity}$ linking as $\mbox{\textbf{Q}uestion}$ $\mbox{\textbf{A}nswering}$, a new model that does not suffer from this limitation. EntQA first proposes candidate entities with a fast retrieval module, and then scrutinizes the document to find mentions of each candidate with a powerful reader module. Our approach combines progress in entity linking with that in open-domain question answering and capitalizes on pretrained models for dense entity retrieval and reading comprehension. Unlike in previous works, we do not rely on a mention-candidates dictionary or large-scale weak supervision. EntQA achieves strong results on the GERBIL benchmarking platform.
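The retrieve-then-read pipeline described in the abstract can be sketched in a few lines of code. The following is a minimal illustration, not the authors' implementation: the dense encoder is stubbed with hash-seeded random unit vectors and the reader is reduced to string matching, standing in for the pretrained bi-encoder retriever and QA-style span predictor used in the paper. All function names and the toy data are hypothetical.

```python
import numpy as np

DIM = 64

def embed(text: str) -> np.ndarray:
    """Hypothetical stand-in for a dense Transformer encoder: a
    hash-seeded random unit vector, consistent within one run."""
    local = np.random.default_rng(abs(hash(text)) % (2**32))
    v = local.standard_normal(DIM)
    return v / np.linalg.norm(v)

def retrieve(passage: str, entities: list[str], k: int = 3) -> list[str]:
    """Stage 1: propose the top-k candidate entities for the passage
    by inner-product search over precomputed entity embeddings."""
    p = embed(passage)
    E = np.stack([embed(e) for e in entities])   # (num_entities, DIM)
    scores = E @ p
    return [entities[i] for i in np.argsort(-scores)[:k]]

def read(passage: str, entity: str) -> list[tuple[int, int]]:
    """Stage 2: treat the candidate entity as a 'question' and find the
    mention spans answering it. Exact string matching here stands in
    for the QA-style start/end span predictor; returning no spans
    amounts to rejecting the candidate."""
    query = entity.split(" (")[0].lower()
    low, spans, start = passage.lower(), [], 0
    while (i := low.find(query, start)) != -1:
        spans.append((i, i + len(query)))
        start = i + 1
    return spans

entities = ["Paris (city)", "Paris Hilton", "France", "Texas"]
passage = "Paris is the capital of France."
for ent in retrieve(passage, entities):
    for s, e in read(passage, ent):
        print(f"{ent!r} -> passage[{s}:{e}] = {passage[s:e]!r}")
```

Note the inversion relative to the conventional mention-first pipeline: entities are proposed before any mention is found, so the reader always knows which entity it is looking for.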
Cite
Text
Zhang et al. "EntQA: Entity Linking as Question Answering." International Conference on Learning Representations, 2022.Markdown
[Zhang et al. "EntQA: Entity Linking as Question Answering." International Conference on Learning Representations, 2022.](https://mlanthology.org/iclr/2022/zhang2022iclr-entqa/)BibTeX
@inproceedings{zhang2022iclr-entqa,
title = {{EntQA: Entity Linking as Question Answering}},
author = {Zhang, Wenzheng and Hua, Wenyue and Stratos, Karl},
booktitle = {International Conference on Learning Representations},
year = {2022},
url = {https://mlanthology.org/iclr/2022/zhang2022iclr-entqa/}
}