Learning Convex QP Relaxations for Structured Prediction
Abstract
We introduce a new large margin approach to discriminative training of intractable discrete graphical models. Our approach builds on a convex quadratic programming relaxation of the MAP inference problem. The model parameters are trained directly within this restricted class of energy functions so as to optimize the predictions on the training data. We address the issue of how to parameterize the resulting model and point out its relation to existing approaches. The primary motivation behind our use of the QP relaxation is its computational efficiency; yet, empirically, its predictive accuracy compares favorably to more expensive approaches. This makes it an appealing choice for many practical tasks.
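To make the central idea concrete, here is a minimal, hypothetical sketch (not the authors' exact parameterization or training procedure) of a convex QP relaxation of MAP inference: the discrete labeling is relaxed to the box [0,1]^n, the quadratic energy is kept convex (PSD), and the fractional minimizer is rounded back to a discrete prediction. The matrix Q, vector c, and the thresholding rule below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Toy convex QP relaxation of binary MAP inference (illustrative only).
# Energy: E(x) = 0.5 x^T Q x + c^T x with x in {0,1}^n relaxed to [0,1]^n.
# Q is constructed to be PSD so the relaxed problem is a convex QP.
rng = np.random.default_rng(0)
n = 6
A = rng.normal(size=(n, n))
Q = A @ A.T                      # PSD -> convex objective
c = rng.normal(size=n)

def energy(x):
    return 0.5 * x @ Q @ x + c @ x

def grad(x):
    return Q @ x + c

x0 = np.full(n, 0.5)             # start in the interior of the box
res = minimize(energy, x0, jac=grad, method="L-BFGS-B",
               bounds=[(0.0, 1.0)] * n)

x_relaxed = res.x                         # fractional QP solution
x_map = (x_relaxed > 0.5).astype(int)     # simple rounding to a labeling
print("relaxed:", np.round(x_relaxed, 3))
print("rounded MAP estimate:", x_map)
```

Because the relaxed problem is a convex QP with box constraints, it can be solved efficiently and reliably, which is the computational advantage the abstract refers to; the paper's contribution is to train the energy parameters directly against predictions produced by this relaxation.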
Cite
Text
Jancsary et al. "Learning Convex QP Relaxations for Structured Prediction." International Conference on Machine Learning, 2013.

Markdown

[Jancsary et al. "Learning Convex QP Relaxations for Structured Prediction." International Conference on Machine Learning, 2013.](https://mlanthology.org/icml/2013/jancsary2013icml-learning/)

BibTeX
@inproceedings{jancsary2013icml-learning,
  title = {{Learning Convex QP Relaxations for Structured Prediction}},
  author = {Jancsary, Jeremy and Nowozin, Sebastian and Rother, Carsten},
  booktitle = {International Conference on Machine Learning},
  year = {2013},
  pages = {915-923},
  volume = {28},
  url = {https://mlanthology.org/icml/2013/jancsary2013icml-learning/}
}