Answering Complex Queries in Knowledge Graphs with Bidirectional Sequence Encoders

Abstract

Representation learning for knowledge graphs (KGs) has focused on the problem of answering simple link prediction queries. In this work we address the more ambitious challenge of predicting the answers of conjunctive queries with multiple missing entities. We propose Bidirectional Query Embedding (BiQE), a method that embeds conjunctive queries with models based on bidirectional attention mechanisms. Contrary to prior work, bidirectional self-attention can capture interactions among all the elements of a query graph. We introduce two new challenging datasets for studying conjunctive query inference and conduct experiments on several benchmark datasets that demonstrate that BiQE significantly outperforms state-of-the-art baselines.

Cite

Text

Kotnis et al. "Answering Complex Queries in Knowledge Graphs with Bidirectional Sequence Encoders." AAAI Conference on Artificial Intelligence, 2021. doi:10.1609/AAAI.V35I6.16630

Markdown

[Kotnis et al. "Answering Complex Queries in Knowledge Graphs with Bidirectional Sequence Encoders." AAAI Conference on Artificial Intelligence, 2021.](https://mlanthology.org/aaai/2021/kotnis2021aaai-answering/) doi:10.1609/AAAI.V35I6.16630

BibTeX

@inproceedings{kotnis2021aaai-answering,
  title     = {{Answering Complex Queries in Knowledge Graphs with Bidirectional Sequence Encoders}},
  author    = {Kotnis, Bhushan and Lawrence, Carolin and Niepert, Mathias},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2021},
  pages     = {4968--4977},
  doi       = {10.1609/AAAI.V35I6.16630},
  url       = {https://mlanthology.org/aaai/2021/kotnis2021aaai-answering/}
}