5* Knowledge Graph Embeddings with Projective Transformations

Abstract

Performing link prediction using knowledge graph embedding models has become a popular approach for knowledge graph completion. Such models employ a transformation function that maps nodes via edges into a vector space in order to measure the likelihood of the links. While mapping the individual nodes, the structure of subgraphs is also transformed. Most of the embedding models designed in Euclidean geometry usually support a single transformation type, often translation or rotation, which is suitable for learning on graphs with small differences in neighboring subgraphs. However, multi-relational knowledge graphs often include multiple subgraph structures in a neighborhood (e.g. combinations of path and loop structures), which current embedding models do not capture well. To tackle this problem, we propose a novel KGE model, 5*E, in projective geometry, which supports multiple simultaneous transformations: specifically inversion, reflection, translation, rotation, and homothety. The model has several favorable theoretical properties and subsumes the existing approaches. It outperforms them on the most widely used link prediction benchmarks.
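To illustrate the core idea of relation-specific projective transformations, the following is a minimal sketch. It assumes complex-valued entity embeddings and applies a per-dimension Möbius (projective) transformation f(z) = (az + b)/(cz + d), whose special cases include translation, rotation, homothety, reflection, and inversion. The function names and the distance-based score below are illustrative assumptions, not the authors' reference implementation.

import numpy as np

def projective_transform(h, a, b, c, d):
    """Apply a per-dimension Moebius (projective) transformation to a
    complex entity embedding h: f(z) = (a*z + b) / (c*z + d).
    Particular choices of a, b, c, d recover translation, rotation,
    homothety, reflection, or inversion."""
    return (a * h + b) / (c * h + d)

def score(h, t, a, b, c, d):
    """Illustrative plausibility score: negative distance between the
    transformed head and the tail embedding (a hypothetical choice;
    the paper defines its own scoring function in projective space)."""
    return -np.linalg.norm(projective_transform(h, a, b, c, d) - t)

# Toy example with 4-dimensional complex embeddings.
rng = np.random.default_rng(0)
dim = 4
h = rng.normal(size=dim) + 1j * rng.normal(size=dim)  # head entity embedding
t = rng.normal(size=dim) + 1j * rng.normal(size=dim)  # tail entity embedding
a, b, c, d = (rng.normal(size=dim) + 1j * rng.normal(size=dim) for _ in range(4))
print(score(h, t, a, b, c, d))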

Cite

Text

Nayyeri et al. "5* Knowledge Graph Embeddings with Projective Transformations." AAAI Conference on Artificial Intelligence, 2021. doi:10.1609/AAAI.V35I10.17095

Markdown

[Nayyeri et al. "5* Knowledge Graph Embeddings with Projective Transformations." AAAI Conference on Artificial Intelligence, 2021.](https://mlanthology.org/aaai/2021/nayyeri2021aaai-knowledge/) doi:10.1609/AAAI.V35I10.17095

BibTeX

@inproceedings{nayyeri2021aaai-knowledge,
  title     = {{5* Knowledge Graph Embeddings with Projective Transformations}},
  author    = {Nayyeri, Mojtaba and Vahdati, Sahar and Aykul, Can and Lehmann, Jens},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2021},
  pages     = {9064--9072},
  doi       = {10.1609/AAAI.V35I10.17095},
  url       = {https://mlanthology.org/aaai/2021/nayyeri2021aaai-knowledge/}
}