Lifted Inference for Convex Quadratic Programs

Abstract

Symmetry is the essential element of lifted inference, which has recently demonstrated the possibility of performing very efficient inference in highly connected but symmetric probabilistic models. This raises the question of whether this holds for optimization problems in general. Here we show that for a large class of optimization methods this is actually the case. Specifically, we introduce the concept of fractional symmetries of convex quadratic programs (QPs), which lie at the heart of many AI and machine learning approaches, and exploit it to lift, i.e., to compress, QPs. These lifted QPs can then be tackled with the usual optimization toolbox (off-the-shelf solvers, cutting plane algorithms, stochastic gradients, etc.). If the original QP exhibits symmetry, then the lifted one will generally be more compact, and hence more efficient to solve.
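The compression idea in the abstract can be illustrated on a toy unconstrained QP. The sketch below is not the paper's full method (which builds on fractional automorphisms of the QP's coefficient structure); it only shows the basic principle that when two variables are exchangeable, a convex QP admits an optimum where they coincide, so the problem can be lifted to one representative variable per symmetry class and solved in the smaller space. The matrices `Q`, `c`, and the lifting map `B` are made-up example data.

```python
# Minimal sketch of lifting a symmetric convex QP:
#   min_x 0.5 * x^T Q x + c^T x
# where swapping x1 and x2 leaves Q and c unchanged. By convexity,
# an optimum then exists with x1 = x2, so both variables can share a
# single representative before solving (the "lifted" QP).
import numpy as np

# Ground QP: invariant under exchanging the two variables (example data).
Q = np.array([[2.0, 1.0],
              [1.0, 2.0]])
c = np.array([-1.0, -1.0])

# Ground optimum from stationarity: Q x + c = 0.
x_ground = np.linalg.solve(Q, -c)

# Lifted QP: substitute x = B y, where B groups both variables into one
# class. The lifted problem  min_y 0.5 * y^T (B^T Q B) y + (B^T c)^T y
# has a single variable.
B = np.array([[1.0],
              [1.0]])
Q_lift = B.T @ Q @ B          # 1x1 compressed Hessian
c_lift = B.T @ c              # compressed linear term
y = np.linalg.solve(Q_lift, -c_lift)

# Mapping back recovers the ground optimum.
x_lifted = (B @ y).ravel()
print(x_ground, x_lifted)     # both ~ [0.3333, 0.3333]
```

Any off-the-shelf solver could replace the closed-form `linalg.solve` step here, which is the point made in the abstract: the lifted QP is an ordinary (smaller) QP.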

Cite

Text

Mladenov et al. "Lifted Inference for Convex Quadratic Programs." AAAI Conference on Artificial Intelligence, 2017. doi:10.1609/AAAI.V31I1.10841

Markdown

[Mladenov et al. "Lifted Inference for Convex Quadratic Programs." AAAI Conference on Artificial Intelligence, 2017.](https://mlanthology.org/aaai/2017/mladenov2017aaai-lifted/) doi:10.1609/AAAI.V31I1.10841

BibTeX

@inproceedings{mladenov2017aaai-lifted,
  title     = {{Lifted Inference for Convex Quadratic Programs}},
  author    = {Mladenov, Martin and Kleinhans, Leonard and Kersting, Kristian},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2017},
  pages     = {2350--2356},
  doi       = {10.1609/AAAI.V31I1.10841},
  url       = {https://mlanthology.org/aaai/2017/mladenov2017aaai-lifted/}
}