Scalable Graph Neural Networks via Bidirectional Propagation
Abstract
Graph Neural Networks (GNNs) are an emerging approach for learning on non-Euclidean data. Recently, there has been increased interest in designing GNNs that scale to large graphs. Most existing methods use "graph sampling" or "layer-wise sampling" techniques to reduce training time; however, these methods still suffer from degraded performance and scalability problems when applied to graphs with billions of edges. In this paper, we present GBP, a scalable GNN that utilizes a localized bidirectional propagation process from both the feature vectors and the training/testing nodes. Theoretical analysis shows that GBP is the first method that achieves sub-linear time complexity for both the precomputation and the training phases. An extensive empirical study demonstrates that GBP achieves state-of-the-art performance with significantly less training/testing time. Most notably, GBP is able to deliver superior performance on a graph with over 60 million nodes and 1.8 billion edges in less than 2,000 seconds on a single machine.
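The "localized bidirectional propagation" in the abstract approximates a generalized PageRank-style feature propagation of the form P = Σ_ℓ w_ℓ T^ℓ X from two sides at once: a deterministic reverse push from the feature vectors and random walks from the training/testing nodes. The sketch below is a minimal illustration of only the random-walk half, using the simple row-stochastic transition T = D⁻¹A (the paper parameterizes the normalization); the function name `mc_propagate` and its parameters are hypothetical, not the paper's API.

```python
import numpy as np
import scipy.sparse as sp

def mc_propagate(adj, X, query_nodes, weights, num_walks=1000, seed=0):
    """Monte Carlo estimate of sum_l weights[l] * (T^l X)[s] for each query
    node s, where T = D^{-1} A is the random-walk transition matrix.
    Each sampled walk contributes the feature vector of the node it visits
    at step l, scaled by weights[l]; averaging over walks gives an unbiased
    estimate of the propagated features at s."""
    rng = np.random.default_rng(seed)
    adj = sp.csr_matrix(adj)
    X = np.asarray(X, dtype=float)
    indptr, indices = adj.indptr, adj.indices
    L = len(weights) - 1
    out = np.zeros((len(query_nodes), X.shape[1]))
    for i, s in enumerate(query_nodes):
        acc = num_walks * weights[0] * X[s]  # step 0 is deterministic
        for _ in range(num_walks):
            v = s
            for l in range(1, L + 1):
                nbrs = indices[indptr[v]:indptr[v + 1]]
                if len(nbrs) == 0:  # dangling node: walk terminates early
                    break
                v = rng.choice(nbrs)
                acc = acc + weights[l] * X[v]
        out[i] = acc / num_walks
    return out

# Toy usage: a 4-node path graph with decaying weights w_l = 0.5**l.
A = sp.csr_matrix(np.array([[0, 1, 0, 0],
                            [1, 0, 1, 0],
                            [0, 1, 0, 1],
                            [0, 0, 1, 0]], dtype=float))
X = np.eye(4)
emb = mc_propagate(A, X, query_nodes=[0, 2], weights=[0.5 ** l for l in range(4)])
```

Because the walk length and walk count are independent of graph size, this estimate only touches a local neighborhood of each query node, which is the intuition behind GBP's sub-linear precomputation; the paper's bidirectional scheme additionally runs the deterministic push from the feature side to sharply reduce the number of walks needed for a given accuracy.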
Cite
Text
Chen et al. "Scalable Graph Neural Networks via Bidirectional Propagation." Neural Information Processing Systems, 2020.

Markdown

[Chen et al. "Scalable Graph Neural Networks via Bidirectional Propagation." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/chen2020neurips-scalable/)

BibTeX
@inproceedings{chen2020neurips-scalable,
  title     = {{Scalable Graph Neural Networks via Bidirectional Propagation}},
  author    = {Chen, Ming and Wei, Zhewei and Ding, Bolin and Li, Yaliang and Yuan, Ye and Du, Xiaoyong and Wen, Ji-Rong},
  booktitle = {Neural Information Processing Systems},
  year      = {2020},
  url       = {https://mlanthology.org/neurips/2020/chen2020neurips-scalable/}
}