Neural Methods for Logical Reasoning over Knowledge Graphs
Abstract
Reasoning is a fundamental problem for computers and has been deeply studied in Artificial Intelligence. In this paper, we specifically focus on answering multi-hop logical queries on Knowledge Graphs (KGs). This is a complicated task because, in real-world scenarios, the graphs tend to be large and incomplete. Most previous works have been unable to create models that accept full First-Order Logic (FOL) queries, which include negation, and have only been able to process a limited set of query structures. Additionally, most methods present logic operators that can only perform the logical operation they are made for. We introduce a set of models that use Neural Networks to create single-point vector embeddings to answer the queries. The versatility of neural networks allows the framework to handle FOL queries with Conjunction, Disjunction, and Negation operators. We demonstrate the performance of our models through extensive experimentation on well-known benchmarking datasets. Besides having more versatile operators, the models achieve a 10% relative increase over the best-performing state of the art and more than 30% over the original method based on single-point vector embeddings.
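To make the idea concrete, here is a minimal sketch of the kind of framework the abstract describes: a query is answered by composing neural logical operators over single-point vector embeddings, then scoring entities by distance to the resulting query embedding. All operator names, architectures (two-layer MLPs), and dimensions below are illustrative assumptions, not the paper's actual design.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8  # embedding dimension (assumption; chosen small for illustration)

def mlp(x, W1, b1, W2, b2):
    # Two-layer MLP with ReLU: the generic building block for each neural operator.
    h = np.maximum(0.0, x @ W1 + b1)
    return h @ W2 + b2

def make_mlp(in_dim, out_dim, hidden=16):
    # Randomly initialized parameters; in practice these would be trained.
    return (rng.normal(size=(in_dim, hidden)) * 0.1, np.zeros(hidden),
            rng.normal(size=(hidden, out_dim)) * 0.1, np.zeros(out_dim))

# Hypothetical neural logical operators over single-point embeddings:
proj_params = make_mlp(2 * DIM, DIM)  # relation projection: (query, relation) -> query
and_params  = make_mlp(2 * DIM, DIM)  # conjunction: merge two sub-query embeddings
neg_params  = make_mlp(DIM, DIM)      # negation: transform one sub-query embedding

def project(q, r):   # follow relation r from query embedding q
    return mlp(np.concatenate([q, r]), *proj_params)

def conjoin(q1, q2): # intersection of two sub-query answer sets
    return mlp(np.concatenate([q1, q2]), *and_params)

def negate(q):       # complement of a sub-query answer set
    return mlp(q, *neg_params)

def score(q, entity_emb):
    # Nearer entities in embedding space are ranked as likelier answers.
    return -np.linalg.norm(q - entity_emb)

# Multi-hop FOL query with negation: find V? such that
#   r2( AND( r1(a), NOT(r1(b)) ), V? )
a, b, r1, r2 = (rng.normal(size=DIM) for _ in range(4))
q = project(conjoin(project(a, r1), negate(project(b, r1))), r2)
print(q.shape)
```

Because each operator is just a neural network on embeddings, the same components compose freely into arbitrary FOL query structures, which is the versatility the abstract refers to.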
Cite
Text
Amayuelas et al. "Neural Methods for Logical Reasoning over Knowledge Graphs." International Conference on Learning Representations, 2022.

Markdown
[Amayuelas et al. "Neural Methods for Logical Reasoning over Knowledge Graphs." International Conference on Learning Representations, 2022.](https://mlanthology.org/iclr/2022/amayuelas2022iclr-neural/)

BibTeX
@inproceedings{amayuelas2022iclr-neural,
title = {{Neural Methods for Logical Reasoning over Knowledge Graphs}},
author = {Amayuelas, Alfonso and Zhang, Shuai and Rao, Xi Susie and Zhang, Ce},
booktitle = {International Conference on Learning Representations},
year = {2022},
url = {https://mlanthology.org/iclr/2022/amayuelas2022iclr-neural/}
}