Minimizing Expected Losses in Perturbation Models with Multidimensional Parametric Min-Cuts
Abstract
We consider the problem of learning perturbation-based probabilistic models by computing and differentiating expected losses. This is a challenging computational problem that has traditionally been tackled using Monte Carlo-based methods. In this work, we show how a generalization of parametric min-cuts can be used to address the same problem, achieving high accuracy faster than a sampling-based baseline. Using our proposed Skeleton Method, we show that the perturbation model can be learned so as to directly minimize expected losses. Experimental results show that this approach offers promise as a new way of training structured prediction models under complex loss functions.
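To make the setting concrete, the following is a minimal, hypothetical sketch of the Monte Carlo baseline the abstract refers to: a perturb-and-MAP model where Gumbel noise is added to the potentials and the expected loss is estimated by averaging over samples. All names here are illustrative; the toy model has only unary potentials, so the MAP step is an elementwise argmax, whereas the paper's setting uses graph-cut MAP inference and replaces sampling with multidimensional parametric min-cuts.

```python
import math
import random

def sample_perturb_and_map(theta, rng):
    """Draw one structured sample: perturb each state's potential with
    Gumbel(0, 1) noise and take the MAP assignment. With only unary
    potentials, MAP decomposes per variable (toy stand-in for min-cut MAP)."""
    y = []
    for t in theta:
        g1 = -math.log(-math.log(rng.random()))  # Gumbel noise for state 1
        g0 = -math.log(-math.log(rng.random()))  # Gumbel noise for state 0
        y.append(1 if t + g1 > g0 else 0)
    return y

def monte_carlo_expected_loss(theta, y_true, loss, n_samples=1000, seed=0):
    """Monte Carlo estimate of E_y[loss(y, y_true)] under the
    perturbation model induced by the unary potentials theta."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        total += loss(sample_perturb_and_map(theta, rng), y_true)
    return total / n_samples

def hamming(y, y_true):
    """Normalized Hamming loss, one example of a decomposable loss."""
    return sum(a != b for a, b in zip(y, y_true)) / len(y)

# Strong positive potentials concentrate the model on the all-ones
# labeling, so the estimated expected Hamming loss should be small.
theta = [4.0, 4.0, 4.0, 4.0]
est = monte_carlo_expected_loss(theta, [1, 1, 1, 1], hamming)
```

Learning in this baseline would perturb `theta` and follow a (noisy) gradient of the estimate; the paper's contribution is computing such expectations exactly and faster via parametric min-cuts rather than sampling.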
Cite
Text
Kim et al. "Minimizing Expected Losses in Perturbation Models with Multidimensional Parametric Min-Cuts." Conference on Uncertainty in Artificial Intelligence, 2015.
Markdown
[Kim et al. "Minimizing Expected Losses in Perturbation Models with Multidimensional Parametric Min-Cuts." Conference on Uncertainty in Artificial Intelligence, 2015.](https://mlanthology.org/uai/2015/kim2015uai-minimizing/)
BibTeX
@inproceedings{kim2015uai-minimizing,
title = {{Minimizing Expected Losses in Perturbation Models with Multidimensional Parametric Min-Cuts}},
author = {Kim, Adrian and Jung, Kyomin and Lim, Yongsub and Tarlow, Daniel and Kohli, Pushmeet},
booktitle = {Conference on Uncertainty in Artificial Intelligence},
year = {2015},
pages = {435--443},
url = {https://mlanthology.org/uai/2015/kim2015uai-minimizing/}
}