Randomized Optimum Models for Structured Prediction

Abstract

One approach to modeling structured discrete data is to describe the probability of states via an energy function and Gibbs distribution. A recurring difficulty in these models is the computation of the partition function, which may require an intractable sum. However, in many such models, the mode can be found efficiently even when the partition function is unavailable. Recent work on Perturb-and-MAP (PM) models (Papandreou and Yuille, 2011) has exploited this discrepancy to approximate the Gibbs distribution for Markov random fields (MRFs). Here, we explore a broader class of models, called Randomized Optimum Models (RandOMs), which include PM as a special case. This new class of models encompasses not only MRFs, but also other models that have intractable partition functions yet permit efficient mode-finding, such as those based on bipartite matchings, shortest paths, or connected components in a graph. We develop likelihood-based learning algorithms for RandOMs; empirical results indicate that these can produce better models than PM.
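To make the perturb-then-optimize idea concrete, the following is a minimal Python sketch, not taken from the paper: the costs of a bipartite-matching model are perturbed with random noise and the perturbed problem is solved exactly, so each call to the efficient mode-finder yields one approximate sample. The Gumbel noise model and the helper name randomized_optimum_sample are illustrative assumptions, not the paper's prescription.

import numpy as np
from scipy.optimize import linear_sum_assignment

def randomized_optimum_sample(cost, noise_scale=1.0, rng=None):
    # Perturb the matching costs with i.i.d. Gumbel noise (an illustrative
    # choice of perturbation distribution), then solve the perturbed problem
    # exactly. Min-cost bipartite matching is a case where the mode is
    # computable in polynomial time even though the partition function is
    # intractable (it involves a matrix permanent).
    rng = np.random.default_rng() if rng is None else rng
    perturbed = cost - rng.gumbel(scale=noise_scale, size=cost.shape)
    rows, cols = linear_sum_assignment(perturbed)
    return cols  # cols[i] = column matched to row i in the sampled matching

# Draw approximate samples over matchings of a small random cost matrix.
rng = np.random.default_rng(0)
cost = rng.uniform(size=(4, 4))
samples = [tuple(randomized_optimum_sample(cost, rng=rng)) for _ in range(1000)]

Repeating the draw yields an empirical distribution over matchings without ever evaluating the intractable partition function; the paper's contribution is likelihood-based learning for this broader family of perturb-then-optimize models.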

Cite

Text

Tarlow et al. "Randomized Optimum Models for Structured Prediction." Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, 2012.

Markdown

[Tarlow et al. "Randomized Optimum Models for Structured Prediction." Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, 2012.](https://mlanthology.org/aistats/2012/tarlow2012aistats-randomized/)

BibTeX

@inproceedings{tarlow2012aistats-randomized,
  title     = {{Randomized Optimum Models for Structured Prediction}},
  author    = {Tarlow, Daniel and Adams, Ryan and Zemel, Richard},
  booktitle = {Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics},
  year      = {2012},
  pages     = {1221--1229},
  volume    = {22},
  url       = {https://mlanthology.org/aistats/2012/tarlow2012aistats-randomized/}
}