Reducing Predict and Optimize to Convex Feasibility
Abstract
Numerous applications in operations research and computer science require a combination of prediction and optimization: use historical data to predict the parameters of an optimization problem, then solve the optimization problem to output a decision. Addressing these two challenges independently yields the \emph{predict-then-optimize} approach, which can result in a discrepancy between the prediction error, minimized during training, and the ultimate objective of minimizing the decision error. Consequently, recent work has focused on the \emph{predict and optimize} (PO) framework, which trains an end-to-end model from the data to the decisions. We focus on linear programs (LPs) within the PO framework, where the main challenge is handling the non-differentiability of LPs. For a linear prediction model, we present a novel reduction from PO to a convex feasibility problem. This reduction enables us to use alternating projections onto convex sets to solve the PO problem, resulting in a computationally efficient and theoretically principled algorithm. Finally, we validate the effectiveness of our approach on synthetic shortest-path and fractional-knapsack problems, demonstrating improved performance compared to prior work.
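The abstract solves the resulting convex feasibility problem with alternating projections onto convex sets. As a rough illustration of that primitive only (not the paper's PO reduction), here is a minimal sketch that alternately projects onto two convex sets with closed-form projections, a halfspace and a Euclidean ball; the specific sets and all parameter values are illustrative assumptions.

```python
import numpy as np

def project_halfspace(x, a, b):
    # Project x onto the halfspace {y : a @ y <= b}.
    viol = a @ x - b
    if viol <= 0:
        return x  # already feasible
    return x - viol * a / (a @ a)

def project_ball(x, center, radius):
    # Project x onto the Euclidean ball {y : ||y - center|| <= radius}.
    d = x - center
    n = np.linalg.norm(d)
    if n <= radius:
        return x  # already feasible
    return center + radius * d / n

def alternating_projections(x0, proj1, proj2, iters=200):
    # Von Neumann alternating projections: repeatedly project onto
    # each set in turn; converges to a point in the intersection
    # when the intersection is nonempty.
    x = x0
    for _ in range(iters):
        x = proj2(proj1(x))
    return x

# Illustrative sets: {x + y <= 1} and a ball around (2, 0) of radius 1.5;
# their intersection is nonempty, so the iterates converge into it.
a, b = np.array([1.0, 1.0]), 1.0
center, radius = np.array([2.0, 0.0]), 1.5
x = alternating_projections(
    np.array([5.0, 5.0]),
    lambda z: project_halfspace(z, a, b),
    lambda z: project_ball(z, center, radius),
)
```

After enough iterations, `x` satisfies both constraints up to numerical tolerance; the paper's algorithm applies this scheme to the convex sets arising from its PO-to-feasibility reduction rather than to toy sets like these.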
Cite
Text
Mishra and Vaswani. "Reducing Predict and Optimize to Convex Feasibility." NeurIPS 2023 Workshops: OPT, 2023.
Markdown
[Mishra and Vaswani. "Reducing Predict and Optimize to Convex Feasibility." NeurIPS 2023 Workshops: OPT, 2023.](https://mlanthology.org/neuripsw/2023/mishra2023neuripsw-reducing/)
BibTeX
@inproceedings{mishra2023neuripsw-reducing,
title = {{Reducing Predict and Optimize to Convex Feasibility}},
author = {Mishra, Saurabh Kumar and Vaswani, Sharan},
booktitle = {NeurIPS 2023 Workshops: OPT},
year = {2023},
url = {https://mlanthology.org/neuripsw/2023/mishra2023neuripsw-reducing/}
}