PyEPO: A PyTorch-Based End-to-End Predict-Then-Optimize Library with Linear Objective Function

Abstract

In many practical settings, some parameters of an optimization problem may be a priori unknown but can be estimated from historical data. Recently, the end-to-end predict-then-optimize paradigm has emerged as an attractive alternative to the two-stage approach of first fitting a predictive model for the unknown parameters and then optimizing separately. In this work, we present the PyEPO package, a PyTorch-based end-to-end predict-then-optimize library in Python for linear and integer programming. It provides two base algorithms: the first is based on the convex surrogate loss function from the seminal work of Elmachtoub & Grigas (2021), and the second is based on the differentiable black-box solver approach of Vlastelica et al. (2019). PyEPO offers a simple interface for defining new optimization problems, implementing state-of-the-art predict-then-optimize training algorithms, using custom neural network architectures, and comparing end-to-end approaches with the two-stage approach.
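To make the first of the two base algorithms concrete, the following is a minimal conceptual sketch of the SPO+ surrogate loss of Elmachtoub & Grigas as a custom PyTorch autograd function. It is not the PyEPO API itself: the toy "optimization problem" (minimizing a linear cost over the vertices of the simplex, so the solver is an argmin over coordinates) and all names here are illustrative assumptions.

```python
import torch

# Toy linear problem: min_w c^T w over the vertices {e_1, ..., e_k}
# of the simplex, so the "solver" is just an argmin over coordinates.
# In PyEPO, this oracle would instead be a real LP/MIP solver.
def solve(cost: torch.Tensor) -> torch.Tensor:
    """Oracle: return the vertex e_i minimizing cost^T w."""
    w = torch.zeros_like(cost)
    w[torch.argmin(cost)] = 1.0
    return w

class SPOPlusLoss(torch.autograd.Function):
    """SPO+ surrogate loss (Elmachtoub & Grigas):
    l(c_hat, c) = -min_w (2 c_hat - c)^T w + 2 c_hat^T w*(c) - c^T w*(c),
    where w*(c) is the optimal decision under the true cost c."""

    @staticmethod
    def forward(ctx, c_hat, c):
        w_true = solve(c)             # w*(c): decision under the true cost
        w_spo = solve(2 * c_hat - c)  # minimizer of the surrogate term
        loss = (-(2 * c_hat - c) @ w_spo
                + 2 * c_hat @ w_true - c @ w_true)
        ctx.save_for_backward(w_true, w_spo)
        return loss

    @staticmethod
    def backward(ctx, grad_output):
        w_true, w_spo = ctx.saved_tensors
        # Subgradient w.r.t. the predicted costs: 2 (w*(c) - w*(2 c_hat - c)).
        return grad_output * 2 * (w_true - w_spo), None

# A prediction that reverses the true cost ranking incurs a positive loss,
# while a perfect prediction (c_hat == c) gives zero loss.
c = torch.tensor([1.0, 2.0, 3.0])
c_hat = torch.tensor([3.0, 2.0, 1.0], requires_grad=True)
loss = SPOPlusLoss.apply(c_hat, c)
loss.backward()
print(float(loss))   # positive: the prediction induces a suboptimal decision
print(c_hat.grad)    # subgradient used for end-to-end training
```

Because the loss is convex in the predicted costs and its subgradient only requires two calls to the solver, it can be dropped into a standard PyTorch training loop in place of a regression loss such as MSE.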

Cite

Text

Tang and Khalil. "PyEPO: A PyTorch-Based End-to-End Predict-Then-Optimize Library with Linear Objective Function." NeurIPS 2022 Workshops: OPT, 2022.

Markdown

[Tang and Khalil. "PyEPO: A PyTorch-Based End-to-End Predict-Then-Optimize Library with Linear Objective Function." NeurIPS 2022 Workshops: OPT, 2022.](https://mlanthology.org/neuripsw/2022/tang2022neuripsw-pyepo/)

BibTeX

@inproceedings{tang2022neuripsw-pyepo,
  title     = {{PyEPO: A PyTorch-Based End-to-End Predict-Then-Optimize Library with Linear Objective Function}},
  author    = {Tang, Bo and Khalil, Elias Boutros},
  booktitle = {NeurIPS 2022 Workshops: OPT},
  year      = {2022},
  url       = {https://mlanthology.org/neuripsw/2022/tang2022neuripsw-pyepo/}
}