Improved Approximations for Hard Graph Problems Using Predictions
Abstract
We design improved approximation algorithms for NP-hard graph problems by incorporating predictions (e.g., learned from past data). Our prediction model builds upon and extends the $\varepsilon$-prediction framework by Cohen-Addad, d’Orsi, Gupta, Lee, and Panigrahi (NeurIPS 2024). We consider an edge-based version of this model, where each edge provides two bits of information, corresponding to predictions about whether each of its endpoints belongs to an optimal solution. Even with weak predictions where each bit is only $\varepsilon$-correlated with the true solution, this information allows us to break approximation barriers in the standard setting. We develop algorithms with improved approximation ratios for MaxCut, Vertex Cover, Set Cover, and Maximum Independent Set problems (among others). Across these problems, our algorithms share a unifying theme, where we separately satisfy constraints related to high-degree vertices (using predictions) and low-degree vertices (without using predictions) and carefully combine the answers.
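The high-degree/low-degree split described above can be illustrated with a toy sketch for Vertex Cover. The code below is an illustrative approximation of the idea, not the paper's algorithm: the prediction interface (two noisy bits per edge), the degree threshold, and all function names (e.g., `vertex_cover_with_predictions`) are assumptions introduced for illustration. High-degree vertices are decided by majority vote over their incident-edge predictions; the remaining edges are covered with the classical maximal-matching 2-approximation; the union is always a valid cover, and accurate predictions shrink the prediction-based part.

```python
import random
from collections import defaultdict

def vertex_cover_with_predictions(edges, edge_preds, degree_threshold):
    """Toy sketch (not the paper's algorithm): combine noisy per-edge
    predictions (for high-degree vertices) with a classical maximal-matching
    fallback (for the rest). edge_preds[(u, v)] = (bit_u, bit_v) are noisy
    guesses of whether u and v belong to some fixed reference cover."""
    # Tally the predicted "in-cover" votes each vertex receives from its edges.
    votes = defaultdict(list)
    degree = defaultdict(int)
    for (u, v) in edges:
        bu, bv = edge_preds[(u, v)]
        votes[u].append(bu)
        votes[v].append(bv)
        degree[u] += 1
        degree[v] += 1

    # High-degree vertices: many independent epsilon-correlated bits, so a
    # majority vote is a comparatively reliable signal.
    high_part = {
        v for v in votes
        if degree[v] >= degree_threshold and 2 * sum(votes[v]) > len(votes[v])
    }

    # Residual edges not covered by the prediction-based part.
    residual = [(u, v) for (u, v) in edges
                if u not in high_part and v not in high_part]

    # Classical fallback (no predictions): a maximal matching yields a
    # 2-approximate cover of the residual, dominated by low-degree vertices.
    low_part = set()
    for (u, v) in residual:
        if u not in low_part and v not in low_part:
            low_part.update((u, v))

    return high_part | low_part  # always a valid vertex cover


if __name__ == "__main__":
    random.seed(0)
    n, p, eps = 200, 0.05, 0.2
    edges = [(u, v) for u in range(n) for v in range(u + 1, n)
             if random.random() < p]

    # A fixed reference cover stands in for the unknown optimal solution;
    # here we cheaply build one greedily just to have something to predict.
    ref = set()
    for (u, v) in edges:
        if u not in ref and v not in ref:
            ref.update((u, v))

    # Each predicted bit agrees with the reference with probability 1/2 + eps.
    def noisy(bit):
        return bit if random.random() < 0.5 + eps else 1 - bit
    edge_preds = {(u, v): (noisy(int(u in ref)), noisy(int(v in ref)))
                  for (u, v) in edges}

    cover = vertex_cover_with_predictions(edges, edge_preds, degree_threshold=10)
    assert all(u in cover or v in cover for (u, v) in edges)
    print(f"cover size: {len(cover)} (n = {n}, |edges| = {len(edges)})")
```

The correctness of the combined output never depends on the predictions; they only influence how large the prediction-based part is, which mirrors the abstract's point that even weakly correlated bits can improve the approximation quality.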
Cite
Text
Aamand et al. "Improved Approximations for Hard Graph Problems Using Predictions." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Aamand et al. "Improved Approximations for Hard Graph Problems Using Predictions." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/aamand2025icml-improved/)

BibTeX
@inproceedings{aamand2025icml-improved,
title = {{Improved Approximations for Hard Graph Problems Using Predictions}},
author = {Aamand, Anders and Chen, Justin Y. and Gollapudi, Siddharth and Silwal, Sandeep and Wu, Hao},
booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
year = {2025},
pages = {73-101},
volume = {267},
url = {https://mlanthology.org/icml/2025/aamand2025icml-improved/}
}