LazyGNN: Large-Scale Graph Neural Networks via Lazy Propagation
Abstract
Recent works have demonstrated the benefits of capturing long-distance dependencies in graphs with deeper graph neural networks (GNNs). However, deeper GNNs suffer from a long-standing scalability challenge due to the neighborhood explosion problem on large-scale graphs. In this work, we propose to capture long-distance dependencies in graphs with shallower models instead of deeper ones, which leads to a much more efficient model, LazyGNN, for graph representation learning. Moreover, we demonstrate that LazyGNN is compatible with existing scalable approaches (such as sampling methods) for further acceleration through the development of a mini-batch LazyGNN. Comprehensive experiments demonstrate its superior prediction performance and scalability on large-scale benchmarks. The implementation of LazyGNN is available at https://github.com/RXPHD/Lazy_GNN.
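The abstract describes the core idea only at a high level: capture long-range dependencies with a shallow model by reusing (stale) propagated features across training epochs rather than running a deep propagation from scratch each time. The sketch below is a minimal, hedged PyTorch illustration of that idea, not the paper's actual implementation; the class name LazyPropagation and the parameters num_steps, alpha, and beta are hypothetical, and the propagation step is an APPNP-style personalized-PageRank update chosen for concreteness.

import torch


class LazyPropagation(torch.nn.Module):
    def __init__(self, num_steps: int = 2, alpha: float = 0.1, beta: float = 0.9):
        super().__init__()
        self.num_steps = num_steps   # only a few propagation steps per epoch (shallow)
        self.alpha = alpha           # residual weight on the input features
        self.beta = beta             # how much of the stale cache to reuse
        self.cache = None            # propagated features kept from the previous epoch

    def forward(self, x: torch.Tensor, adj_norm: torch.Tensor) -> torch.Tensor:
        # Warm-start from the cached result of the previous epoch instead of
        # from x alone, so long-range information accumulates across epochs
        # even though each epoch performs only a few propagation steps.
        if self.cache is None:
            h = x
        else:
            h = self.beta * self.cache.detach() + (1 - self.beta) * x
        for _ in range(self.num_steps):
            # APPNP-style step: smooth over neighbors, keep a residual of x.
            h = (1 - self.alpha) * adj_norm @ h + self.alpha * x
        self.cache = h.detach()      # store for the next epoch (the "lazy" part)
        return h


# Toy usage: a 4-node path graph with a symmetrically normalized adjacency.
edges = torch.tensor([[0, 1], [1, 2], [2, 3]])
adj = torch.zeros(4, 4)
adj[edges[:, 0], edges[:, 1]] = 1.0
adj = adj + adj.T + torch.eye(4)                      # undirected + self-loops
deg_inv_sqrt = adj.sum(1).rsqrt()
adj_norm = deg_inv_sqrt[:, None] * adj * deg_inv_sqrt[None, :]

prop = LazyPropagation()
x = torch.randn(4, 8)
for epoch in range(3):
    out = prop(x, adj_norm)          # each call refines the cached features

Under these assumptions, the effective receptive field grows across epochs (the cache accumulates multi-hop information) while the per-epoch cost stays that of a shallow model, which is what makes the approach compatible with mini-batch sampling for further acceleration.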
Cite
Text
Xue et al. "LazyGNN: Large-Scale Graph Neural Networks via Lazy Propagation." International Conference on Machine Learning, 2023.
Markdown
[Xue et al. "LazyGNN: Large-Scale Graph Neural Networks via Lazy Propagation." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/xue2023icml-lazygnn/)
BibTeX
@inproceedings{xue2023icml-lazygnn,
  title     = {{LazyGNN: Large-Scale Graph Neural Networks via Lazy Propagation}},
  author    = {Xue, Rui and Han, Haoyu and Torkamani, Mohamadali and Pei, Jian and Liu, Xiaorui},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {38926--38937},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/xue2023icml-lazygnn/}
}