MAG-GNN: Reinforcement Learning Boosted Graph Neural Network

Abstract

While Graph Neural Networks (GNNs) have recently become powerful tools in graph learning tasks, considerable effort has been spent on improving GNNs' structural encoding ability. A particular line of work proposed subgraph GNNs that use subgraph information to improve GNNs' expressivity and achieved great success. However, this effectiveness sacrifices the efficiency of GNNs by enumerating all possible subgraphs. In this paper, we analyze the necessity of complete subgraph enumeration and show that a model can achieve a comparable level of expressivity by considering a small subset of the subgraphs. We then formulate the identification of the optimal subset as a combinatorial optimization problem and propose Magnetic Graph Neural Network (MAG-GNN), a reinforcement learning (RL) boosted GNN, to solve the problem. Starting with a candidate subgraph set, MAG-GNN employs an RL agent to iteratively update the subgraphs to locate the most expressive set for prediction. This reduces the exponential complexity of subgraph enumeration to the constant complexity of a subgraph search algorithm while retaining good expressivity. We conduct extensive experiments on many datasets, showing that MAG-GNN achieves performance competitive with state-of-the-art methods and even outperforms many subgraph GNNs. We also demonstrate that MAG-GNN effectively reduces the running time of subgraph GNNs.
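To make the iterative procedure described above concrete, the following is a minimal Python sketch of the control flow: start from a small candidate set of node-marked subgraphs, let an agent repeatedly move the marks toward higher-scoring positions, then feed the final set to a GNN for prediction. The helpers `policy_score` and `gnn_predict` are hypothetical placeholders standing in for the learned RL policy and the GNN encoder; this is not the authors' implementation.

import random

def policy_score(graph, marked_nodes, candidate_node):
    # Placeholder for the learned RL policy's value of re-marking `candidate_node`.
    return random.random()

def gnn_predict(graph, subgraph_set):
    # Placeholder for the GNN that encodes the graph with the chosen node markings.
    return sum(len(s) for s in subgraph_set)

def mag_gnn_sketch(graph_nodes, graph=None, k_subgraphs=2, steps=3):
    # Start from a random candidate set of node-marked "subgraphs" (one mark each).
    subgraph_set = [set(random.sample(graph_nodes, 1)) for _ in range(k_subgraphs)]
    for _ in range(steps):
        # The agent updates each subgraph by moving its mark to the node
        # the policy scores highest, instead of enumerating all markings.
        new_set = []
        for marked in subgraph_set:
            old = next(iter(marked))
            best = max(graph_nodes, key=lambda v: policy_score(graph, marked, v))
            new_set.append((marked - {old}) | {best})
        subgraph_set = new_set
    return gnn_predict(graph, subgraph_set)

# Usage on a toy 6-node graph identified only by its node list.
print(mag_gnn_sketch(list(range(6))))

Because the loop runs a fixed number of steps over a fixed-size candidate set, its cost is constant in the number of possible subgraphs, which is the efficiency argument the abstract makes against full enumeration.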

Cite

Text

Kong et al. "MAG-GNN: Reinforcement Learning Boosted Graph Neural Network." Neural Information Processing Systems, 2023.

Markdown

[Kong et al. "MAG-GNN: Reinforcement Learning Boosted Graph Neural Network." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/kong2023neurips-maggnn/)

BibTeX

@inproceedings{kong2023neurips-maggnn,
  title     = {{MAG-GNN: Reinforcement Learning Boosted Graph Neural Network}},
  author    = {Kong, Lecheng and Feng, Jiarui and Liu, Hao and Tao, Dacheng and Chen, Yixin and Zhang, Muhan},
  booktitle = {Neural Information Processing Systems},
  year      = {2023},
  url       = {https://mlanthology.org/neurips/2023/kong2023neurips-maggnn/}
}