Are Graph Neural Networks Optimal Approximation Algorithms?

Abstract

In this work we design graph neural network architectures that capture optimal approximation algorithms for a large class of combinatorial optimization problems, using powerful algorithmic tools from semidefinite programming (SDP). Concretely, we prove that polynomial-sized message-passing algorithms can represent the most powerful polynomial-time algorithms for Max Constraint Satisfaction Problems, assuming the Unique Games Conjecture. We leverage this result to construct efficient graph neural network architectures, OptGNN, that obtain high-quality approximate solutions on landmark combinatorial optimization problems such as Max-Cut, Min-Vertex-Cover, and Max-3-SAT. Our approach achieves strong empirical results across a wide range of real-world and synthetic datasets against solvers and neural baselines. Finally, we take advantage of OptGNN's ability to capture convex relaxations to design an algorithm for producing bounds on the optimal solution from the learned embeddings of OptGNN.
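To illustrate the kind of SDP machinery the abstract refers to (this sketch is not from the paper itself): for Max-Cut, the classic semidefinite relaxation assigns each vertex a unit vector and maximizes the total edge disagreement; running gradient ascent on this low-rank vector formulation amounts to repeatedly aggregating neighbor vectors, which is structurally similar to a message-passing update. All names and hyperparameters below are illustrative choices, not the paper's.

```python
import numpy as np

def maxcut_vector_relaxation(edges, n, dim=8, steps=200, lr=0.1, seed=0):
    """Gradient ascent on the low-rank (Burer-Monteiro-style) vector form of
    the Max-Cut SDP: maximize sum over edges of (1 - v_i . v_j) / 2 subject
    to ||v_i|| = 1. Each step sums neighbor vectors, resembling one round of
    message passing, then re-projects onto the unit sphere."""
    rng = np.random.default_rng(seed)
    V = rng.normal(size=(n, dim))
    V /= np.linalg.norm(V, axis=1, keepdims=True)
    for _ in range(steps):
        grad = np.zeros_like(V)
        for i, j in edges:
            # d/dv_i of (1 - v_i . v_j)/2 is -v_j / 2 (and symmetrically for j)
            grad[i] -= V[j] / 2
            grad[j] -= V[i] / 2
        V += lr * grad
        V /= np.linalg.norm(V, axis=1, keepdims=True)  # retract to the sphere
    return V

def round_cut(V, seed=0):
    """Goemans-Williamson hyperplane rounding: sign of projection onto a
    random direction converts embeddings into a two-sided cut."""
    rng = np.random.default_rng(seed)
    r = rng.normal(size=V.shape[1])
    return (V @ r >= 0).astype(int)

def cut_value(edges, x):
    """Number of edges crossing the cut given side assignments x."""
    return sum(1 for i, j in edges if x[i] != x[j])
```

On a bipartite 4-cycle the relaxation places adjacent vectors antipodally, so hyperplane rounding recovers the maximum cut of 4; OptGNN's learned embeddings play an analogous role to the vectors `V` here, which is what enables the dual-bound computation mentioned at the end of the abstract.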

Cite

Text

Yau et al. "Are Graph Neural Networks Optimal Approximation Algorithms?" Neural Information Processing Systems, 2024. doi:10.52202/079017-2328

Markdown

[Yau et al. "Are Graph Neural Networks Optimal Approximation Algorithms?" Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/yau2024neurips-graph/) doi:10.52202/079017-2328

BibTeX

@inproceedings{yau2024neurips-graph,
  title     = {{Are Graph Neural Networks Optimal Approximation Algorithms?}},
  author    = {Yau, Morris and Karalias, Nikolaos and Lu, Eric and Xu, Jessica and Jegelka, Stefanie},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-2328},
  url       = {https://mlanthology.org/neurips/2024/yau2024neurips-graph/}
}