Gaussian Embedding of Linked Documents from a Pretrained Semantic Space

Abstract

Gaussian Embedding of Linked Documents (GELD) is a new method that embeds linked documents (e.g., citation networks) into a pretrained semantic space (e.g., a set of word embeddings). We formulate the problem so that each document is modeled as a Gaussian distribution in the word vector space. We design a generative model that combines both words and links in a consistent way. Leveraging the variance of a document allows us to model the uncertainty related to word and link generation. In most cases, our method outperforms state-of-the-art methods when our document vectors are used as features for standard downstream tasks. In particular, GELD achieves better accuracy in classification and link prediction on Cora and Dblp. In addition, we qualitatively demonstrate several convenient properties of our method. We provide the implementation of GELD and the evaluation datasets to the community (https://github.com/AntoineGourru/DNEmbedding).
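To illustrate the core idea of representing a document as a Gaussian in a pretrained word vector space, here is a minimal sketch. It is not GELD's generative model: the toy embeddings are invented, and the mean/diagonal variance are simple empirical statistics of a document's word vectors rather than parameters learned jointly from words and links.

```python
import numpy as np

# Toy pretrained word embeddings (hypothetical 3-d semantic space).
word_vectors = {
    "graph":    np.array([0.9, 0.1, 0.0]),
    "network":  np.array([0.8, 0.2, 0.1]),
    "gaussian": np.array([0.1, 0.9, 0.3]),
    "citation": np.array([0.7, 0.1, 0.5]),
}

def document_gaussian(tokens):
    """Represent a document as a Gaussian in the word vector space:
    mean = average of its word vectors, diagonal variance = empirical
    spread of those vectors (a naive stand-in for a learned
    mean/variance)."""
    vecs = np.stack([word_vectors[t] for t in tokens if t in word_vectors])
    mu = vecs.mean(axis=0)
    sigma2 = vecs.var(axis=0) + 1e-6  # small floor keeps variance positive
    return mu, sigma2

mu, sigma2 = document_gaussian(["graph", "network", "citation"])
print(mu.round(2), sigma2.round(4))
```

A document whose words are spread out in the semantic space gets a larger variance, which is the kind of uncertainty signal the paper exploits for word and link generation.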

Cite

Text

Gourru et al. "Gaussian Embedding of Linked Documents from a Pretrained Semantic Space." International Joint Conference on Artificial Intelligence, 2020. doi:10.24963/IJCAI.2020/541

Markdown

[Gourru et al. "Gaussian Embedding of Linked Documents from a Pretrained Semantic Space." International Joint Conference on Artificial Intelligence, 2020.](https://mlanthology.org/ijcai/2020/gourru2020ijcai-gaussian/) doi:10.24963/IJCAI.2020/541

BibTeX

@inproceedings{gourru2020ijcai-gaussian,
  title     = {{Gaussian Embedding of Linked Documents from a Pretrained Semantic Space}},
  author    = {Gourru, Antoine and Velcin, Julien and Jacques, Julien},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2020},
  pages     = {3912--3918},
  doi       = {10.24963/IJCAI.2020/541},
  url       = {https://mlanthology.org/ijcai/2020/gourru2020ijcai-gaussian/}
}