Fit the Right NP-Hard Problem: End-to-End Learning of Integer Programming Constraints
Abstract
Bridging logical and algorithmic reasoning with modern machine learning techniques is a fundamental challenge with potentially transformative impact. On the algorithmic side, many NP-hard problems can be expressed as integer programs, in which the constraints play the role of their "combinatorial specification". In this work, we aim to integrate integer programming solvers into neural network architectures by providing loss functions for *both* the objective and the constraints. The resulting end-to-end trainable architectures have the power of jointly extracting features from raw data and of solving a suitable (learned) combinatorial problem with state-of-the-art integer programming solvers. We experimentally validate our approach on artificial datasets created from random constraints, and on solving Knapsack instances from their description in natural language.
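To make the setting concrete, the combinatorial layer solves an integer program max c·y subject to A y ≤ b with binary y; in the paper's setup, the constraints (A, b) are learnable parameters trained end to end. The following is an illustrative sketch, not the authors' code: a brute-force solver for such a program, applied to a tiny Knapsack instance (all names and values here are hypothetical).

```python
# Illustrative sketch (not the authors' implementation): solve the
# integer program  max c·y  s.t.  A y <= b,  y in {0,1}^n  by brute
# force. In the paper's setting, A and b would be learned parameters
# of a neural architecture rather than fixed inputs.
from itertools import product


def solve_ilp(c, A, b):
    """Return the feasible binary vector maximizing sum(c[i]*y[i])."""
    n = len(c)
    best, best_val = None, float("-inf")
    for y in product((0, 1), repeat=n):
        # Check every constraint row A[j] . y <= b[j].
        feasible = all(
            sum(A[j][i] * y[i] for i in range(n)) <= b[j]
            for j in range(len(b))
        )
        if feasible:
            val = sum(c[i] * y[i] for i in range(n))
            if val > best_val:
                best, best_val = list(y), val
    return best, best_val


# Knapsack as a one-constraint integer program:
# values [3, 4, 5], weights [2, 3, 4], capacity 5.
y, val = solve_ilp(c=[3, 4, 5], A=[[2, 3, 4]], b=[5])
# -> y == [1, 1, 0], val == 7 (items 0 and 1 fill the capacity exactly)
```

Brute force is exponential in n and stands in here only for the state-of-the-art integer programming solvers the paper plugs in; the point is that the constraint matrix, not the solver, is what gets fit to data.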
Cite
Text
Paulus et al. "Fit the Right NP-Hard Problem: End-to-End Learning of Integer Programming Constraints." NeurIPS 2020 Workshops: LMCA, 2020.
Markdown
[Paulus et al. "Fit the Right NP-Hard Problem: End-to-End Learning of Integer Programming Constraints." NeurIPS 2020 Workshops: LMCA, 2020.](https://mlanthology.org/neuripsw/2020/paulus2020neuripsw-fit/)
BibTeX
@inproceedings{paulus2020neuripsw-fit,
title = {{Fit the Right NP-Hard Problem: End-to-End Learning of Integer Programming Constraints}},
author = {Paulus, Anselm and Rolinek, Michal and Musil, Vít and Amos, Brandon and Martius, Georg},
booktitle = {NeurIPS 2020 Workshops: LMCA},
year = {2020},
url = {https://mlanthology.org/neuripsw/2020/paulus2020neuripsw-fit/}
}