On Representing Linear Programs by Graph Neural Networks

Abstract

Learning to optimize is a rapidly growing area that aims to solve optimization problems or improve existing optimization algorithms using machine learning (ML). In particular, the graph neural network (GNN) is considered a suitable ML model for optimization problems whose variables and constraints are permutation-invariant, for example, the linear program (LP). While the literature has reported encouraging numerical results, this paper establishes the theoretical foundation of applying GNNs to solving LPs. Given any size limit on LPs, we construct a GNN that maps different LPs to different outputs. We show that properly built GNNs can reliably predict feasibility, boundedness, and an optimal solution for each LP in a broad class. Our proofs are based upon the recently discovered connections between the Weisfeiler-Lehman isomorphism test and the GNN. To validate our results, we train a simple GNN and present its accuracy in mapping LPs to their feasibility and solutions.
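The standard way to make an LP permutation-invariant for a GNN is to encode it as a bipartite graph: one node per constraint, one per variable, with an edge weighted by the coefficient linking them. The sketch below illustrates that encoding and one sum-aggregation message pass; the function names and feature choices are illustrative assumptions, not the paper's exact construction.

```python
# Hypothetical sketch (not the authors' exact architecture): encode an LP
#   min c^T x  s.t.  A x <= b
# as a bipartite graph with constraint nodes (feature b_i), variable nodes
# (feature c_j), and an edge of weight A[i, j] wherever A[i, j] != 0.
import numpy as np

def lp_to_bipartite(A, b, c):
    """Return node features and weighted edges of the LP's bipartite graph."""
    m, n = A.shape
    cons_feats = b.reshape(m, 1)   # constraint-node features: right-hand side b_i
    var_feats = c.reshape(n, 1)    # variable-node features: objective cost c_j
    edges = [(i, j, A[i, j])
             for i in range(m) for j in range(n) if A[i, j] != 0]
    return cons_feats, var_feats, edges

def message_pass(cons_feats, var_feats, edges):
    """One permutation-invariant round: each variable node sums the
    messages w_ij * h_i from its incident constraint nodes."""
    msgs = np.zeros_like(var_feats)
    for i, j, w in edges:
        msgs[j] += w * cons_feats[i]
    return msgs

A = np.array([[1.0, 2.0], [0.0, 1.0]])
b = np.array([4.0, 1.0])
c = np.array([1.0, 1.0])
cf, vf, e = lp_to_bipartite(A, b, c)
print(message_pass(cf, vf, e))
```

Because the aggregation is a sum over neighbors, relabeling constraints or variables only permutes the output the same way, which is the invariance property the paper's separation results rely on.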

Cite

Text

Chen et al. "On Representing Linear Programs by Graph Neural Networks." International Conference on Learning Representations, 2023.

Markdown

[Chen et al. "On Representing Linear Programs by Graph Neural Networks." International Conference on Learning Representations, 2023.](https://mlanthology.org/iclr/2023/chen2023iclr-representing/)

BibTeX

@inproceedings{chen2023iclr-representing,
  title     = {{On Representing Linear Programs by Graph Neural Networks}},
  author    = {Chen, Ziang and Liu, Jialin and Wang, Xinshang and Yin, Wotao},
  booktitle = {International Conference on Learning Representations},
  year      = {2023},
  url       = {https://mlanthology.org/iclr/2023/chen2023iclr-representing/}
}