Bootstrapped Representation Learning on Graphs

Abstract

Current state-of-the-art self-supervised learning methods for graph neural networks are based on contrastive learning. As such, they depend heavily on the construction of augmentations and negative examples. Since performance improves as the number of negative pairs grows, achieving peak performance incurs computation and memory costs that scale quadratically. Inspired by BYOL, a recently introduced self-supervised learning method that does not require negative pairs, we present Bootstrapped Graph Latents (BGRL), a self-supervised graph representation learning method that eliminates this potentially quadratic bottleneck. BGRL matches or outperforms the previous unsupervised state-of-the-art results on several established benchmarks. Moreover, it enables the effective use of graph attentional (GAT) encoders, allowing us to further improve the state of the art, in particular achieving 70.49% Micro-F1 on the PPI dataset under the linear evaluation protocol.
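The following is a minimal sketch of the BYOL-style bootstrapping recipe the abstract describes, written in plain PyTorch on a toy dense-adjacency graph: an online encoder plus predictor is trained to match a slowly moving target encoder's representation of a second augmented view, with no negative pairs. The toy GCN encoder, the augmentations, and all hyperparameters (e.g. `ema_decay`, drop probabilities) are illustrative assumptions, not the paper's exact configuration.

```python
# Illustrative sketch of a BYOL-style bootstrapped objective on graphs.
# The encoder, augmentations, and hyperparameters are assumptions for
# demonstration; they are not the paper's exact setup.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GCNLayer(nn.Module):
    """One dense graph-convolution layer with symmetric normalization."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        adj = adj + torch.eye(adj.size(0))      # add self-loops
        norm = adj.sum(dim=1).rsqrt().unsqueeze(1)
        adj = norm * adj * norm.t()             # D^-1/2 (A + I) D^-1/2
        return F.relu(self.lin(adj @ x))


def augment(x, adj, feat_drop=0.2, edge_drop=0.2):
    """Random feature masking and edge dropping (assumed augmentations)."""
    x = x * (torch.rand_like(x) > feat_drop).float()
    adj = adj * (torch.rand_like(adj) > edge_drop).float()
    return x, adj


def bootstrap_loss(online_pred, target_repr):
    """Cosine-similarity loss against detached (non-gradient) targets."""
    p = F.normalize(online_pred, dim=-1)
    z = F.normalize(target_repr.detach(), dim=-1)
    return (2 - 2 * (p * z).sum(dim=-1)).mean()


# Toy graph: 8 nodes, 16 features, random symmetric adjacency.
n, d, h = 8, 16, 32
x = torch.randn(n, d)
adj = (torch.rand(n, n) > 0.5).float()
adj = ((adj + adj.t()) > 0).float()

online = GCNLayer(d, h)
target = GCNLayer(d, h)
target.load_state_dict(online.state_dict())     # target starts as a copy
predictor = nn.Sequential(nn.Linear(h, h), nn.ReLU(), nn.Linear(h, h))
opt = torch.optim.Adam(
    list(online.parameters()) + list(predictor.parameters()), lr=1e-3)
ema_decay = 0.99

for step in range(100):
    (x1, a1), (x2, a2) = augment(x, adj), augment(x, adj)
    # Symmetrized objective: each view's online prediction is trained to
    # match the target encoder's representation of the other view.
    loss = bootstrap_loss(predictor(online(x1, a1)), target(x2, a2)) \
         + bootstrap_loss(predictor(online(x2, a2)), target(x1, a1))
    opt.zero_grad()
    loss.backward()
    opt.step()
    # Target parameters track the online parameters via an exponential
    # moving average; they receive no gradient updates of their own.
    with torch.no_grad():
        for pt, po in zip(target.parameters(), online.parameters()):
            pt.mul_(ema_decay).add_(po, alpha=1 - ema_decay)
```

Because only the online branch and predictor receive gradients, there is no need to contrast against negative examples, which is what removes the quadratic cost in the number of pairs.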

Cite

Text

Thakoor et al. "Bootstrapped Representation Learning on Graphs." ICLR 2021 Workshops: GTRL, 2021.

Markdown

[Thakoor et al. "Bootstrapped Representation Learning on Graphs." ICLR 2021 Workshops: GTRL, 2021.](https://mlanthology.org/iclrw/2021/thakoor2021iclrw-bootstrapped/)

BibTeX

@inproceedings{thakoor2021iclrw-bootstrapped,
  title     = {{Bootstrapped Representation Learning on Graphs}},
  author    = {Thakoor, Shantanu and Tallec, Corentin and Azar, Mohammad Gheshlaghi and Munos, Rémi and Veličković, Petar and Valko, Michal},
  booktitle = {ICLR 2021 Workshops: GTRL},
  year      = {2021},
  url       = {https://mlanthology.org/iclrw/2021/thakoor2021iclrw-bootstrapped/}
}