Unsupervised Inductive Graph-Level Representation Learning via Graph-Graph Proximity

Abstract

We introduce a novel approach to graph-level representation learning: embed an entire graph into a vector space such that the embeddings of two graphs preserve their graph-graph proximity. Our approach, UGraphEmb, is a general framework that provides a novel means of performing graph-level embedding in a completely unsupervised and inductive manner. The learned neural network can be viewed as a function that takes any graph as input, whether seen or unseen during training, and transforms it into an embedding. A novel graph-level embedding generation mechanism, called Multi-Scale Node Attention (MSNA), is proposed. Experiments on five real graph datasets show that UGraphEmb achieves competitive accuracy on the tasks of graph classification, similarity ranking, and graph visualization.
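The sketch below illustrates, in PyTorch, the two ideas summarized in the abstract: an attention-based readout that pools node embeddings from every GNN layer into one graph-level embedding (an MSNA-style, multi-scale readout), and an unsupervised loss that drives the inner product of two graph embeddings toward a precomputed graph-graph proximity score (e.g., one derived from graph edit distance). The layer sizes, the simple mean-aggregation GNN, and all names are illustrative assumptions, not the authors' exact architecture.

```python
# Minimal, hypothetical sketch of a multi-scale attention readout and a
# graph-graph proximity loss. Not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleGNNLayer(nn.Module):
    """Mean-aggregation message-passing layer (stand-in for the paper's GNN)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # x: (N, in_dim) node features; adj: dense (N, N) adjacency with self-loops.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        return F.relu(self.lin(adj @ x / deg))


class AttentionReadout(nn.Module):
    """Pool node embeddings into a graph embedding with node-level attention."""
    def __init__(self, dim):
        super().__init__()
        self.context = nn.Linear(dim, dim, bias=False)

    def forward(self, x):
        # Global context from the mean node embedding; each node is weighted
        # by its gated similarity to that context.
        c = torch.sigmoid(self.context(x.mean(dim=0)))   # (dim,)
        gates = torch.sigmoid(x @ c)                     # (N,)
        return (gates.unsqueeze(-1) * x).sum(dim=0)      # (dim,)


class GraphEncoder(nn.Module):
    """Encode a graph by concatenating attention readouts from every layer (scale)."""
    def __init__(self, in_dim, hid_dim, num_layers=3):
        super().__init__()
        dims = [in_dim] + [hid_dim] * num_layers
        self.layers = nn.ModuleList(
            SimpleGNNLayer(dims[i], dims[i + 1]) for i in range(num_layers))
        self.readouts = nn.ModuleList(
            AttentionReadout(hid_dim) for _ in range(num_layers))

    def forward(self, x, adj):
        outs = []
        for layer, readout in zip(self.layers, self.readouts):
            x = layer(x, adj)
            outs.append(readout(x))          # one readout per scale
        return torch.cat(outs, dim=-1)       # multi-scale graph embedding


def proximity_loss(emb_i, emb_j, target_proximity):
    """MSE between the embeddings' inner product and a precomputed
    graph-graph proximity score (the unsupervised, inductive signal)."""
    pred = (emb_i * emb_j).sum(-1)
    return F.mse_loss(pred, target_proximity)
```

Because training only requires pairwise proximity scores, no graph labels are needed (unsupervised), and at inference time the encoder embeds graphs never seen during training (inductive).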

Cite

Text

Bai et al. "Unsupervised Inductive Graph-Level Representation Learning via Graph-Graph Proximity." International Joint Conference on Artificial Intelligence, 2019. doi:10.24963/IJCAI.2019/275

Markdown

[Bai et al. "Unsupervised Inductive Graph-Level Representation Learning via Graph-Graph Proximity." International Joint Conference on Artificial Intelligence, 2019.](https://mlanthology.org/ijcai/2019/bai2019ijcai-unsupervised/) doi:10.24963/IJCAI.2019/275

BibTeX

@inproceedings{bai2019ijcai-unsupervised,
  title     = {{Unsupervised Inductive Graph-Level Representation Learning via Graph-Graph Proximity}},
  author    = {Bai, Yunsheng and Ding, Hao and Qiao, Yang and Marinovic, Agustin and Gu, Ken and Chen, Ting and Sun, Yizhou and Wang, Wei},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2019},
  pages     = {1988--1994},
  doi       = {10.24963/IJCAI.2019/275},
  url       = {https://mlanthology.org/ijcai/2019/bai2019ijcai-unsupervised/}
}