Compact Optimality Verification for Optimization Proxies
Abstract
Recent years have witnessed increasing interest in optimization proxies, i.e., machine learning models that approximate the input-output mapping of parametric optimization problems and return near-optimal feasible solutions. Following recent work of Nellikkath and Chatzivasileiadis (2021), this paper reconsiders the optimality verification problem for optimization proxies, i.e., the determination of the worst-case optimality gap over the instance distribution. The paper proposes a compact formulation for optimality verification, together with a gradient-based primal heuristic, that brings significant computational benefits over the original formulation. The compact formulation is also more general and applies to non-convex optimization problems. The benefits of the compact formulation are demonstrated on large-scale DC Optimal Power Flow and knapsack problems.
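To illustrate the verification problem the abstract describes, the sketch below sets up a toy parametric problem with a known optimal value and a hypothetical proxy, then runs projected gradient ascent on the optimality gap over the instance parameter. This is an illustrative example only, not the paper's formulation: the toy problem, the `proxy` function, and all names are assumptions, and a primal heuristic of this kind yields a lower bound on the true worst-case gap rather than a certificate.

```python
import numpy as np

# Toy parametric problem (illustrative, not from the paper):
#   min_y (y - c)^2  for an instance parameter c in [-1, 1],
# whose true optimal value is 0 for every c.
def opt_value(c):
    return 0.0

# Hypothetical proxy returning a slightly suboptimal solution.
def proxy(c):
    return 0.9 * c

# Optimality gap of the proxy on instance c.
def gap(c):
    y = proxy(c)
    return (y - c) ** 2 - opt_value(c)

# Analytic gradient of the gap for this toy proxy.
def grad_gap(c):
    return 2.0 * (proxy(c) - c) * (0.9 - 1.0)

def worst_case_gap(c0=0.1, lr=100.0, iters=50):
    # Projected gradient ascent over the instance space: a primal
    # heuristic that produces a lower bound on the worst-case gap.
    c = c0
    for _ in range(iters):
        c = float(np.clip(c + lr * grad_gap(c), -1.0, 1.0))
    return gap(c)
```

Here the gap is 0.01 * c^2, so ascent pushes c to the boundary of [-1, 1] and `worst_case_gap()` returns 0.01; in the verification setting of the paper, such a heuristic would complement an exact formulation that certifies the bound.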
Cite

Text:
Chen et al. "Compact Optimality Verification for Optimization Proxies." International Conference on Machine Learning, 2024.

Markdown:
[Chen et al. "Compact Optimality Verification for Optimization Proxies." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/chen2024icml-compact/)

BibTeX:
@inproceedings{chen2024icml-compact,
  title     = {{Compact Optimality Verification for Optimization Proxies}},
  author    = {Chen, Wenbo and Zhao, Haoruo and Tanneau, Mathieu and Van Hentenryck, Pascal},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {7847--7863},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/chen2024icml-compact/}
}