Majorization for CRFs and Latent Likelihoods
Abstract
The partition function plays a key role in probabilistic modeling, including conditional random fields, graphical models, and maximum likelihood estimation. To optimize partition functions, this article introduces a quadratic variational upper bound. This inequality facilitates majorization methods: the optimization of complicated functions through the iterative solution of simpler sub-problems. Such bounds remain efficient to compute even when the partition function involves a graphical model (with small tree-width) or arises in latent likelihood settings. For large-scale problems, low-rank versions of the bound are provided and outperform LBFGS as well as first-order methods. Several learning applications are shown to reduce to fast and convergent update rules. Experimental results show advantages over state-of-the-art optimization methods.
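To illustrate the core idea, a generic quadratic majorizer of the log-partition function around a current parameter estimate $\tilde\theta$ has the shape (the symbols $z$, $\mu$, and $\Sigma$ here are illustrative placeholders, not the paper's exact construction)

$$\log Z(\theta) \;\le\; \log z \;+\; (\theta - \tilde\theta)^\top \mu \;+\; \tfrac{1}{2}\,(\theta - \tilde\theta)^\top \Sigma\,(\theta - \tilde\theta),$$

typically built so that it touches $\log Z$ at $\theta = \tilde\theta$. Substituting such a surrogate for $\log Z(\theta)$ in a regularized conditional or latent likelihood turns each majorization step into a quadratic sub-problem with a closed-form, Newton-like update; the low-rank versions of the bound mentioned above keep the curvature term in a compact form so that these sub-problems remain cheap for large-scale problems.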
Cite
Text
Jebara and Choromanska. "Majorization for CRFs and Latent Likelihoods." Neural Information Processing Systems, 2012.
Markdown
[Jebara and Choromanska. "Majorization for CRFs and Latent Likelihoods." Neural Information Processing Systems, 2012.](https://mlanthology.org/neurips/2012/jebara2012neurips-majorization/)
BibTeX
@inproceedings{jebara2012neurips-majorization,
title = {{Majorization for CRFs and Latent Likelihoods}},
author = {Jebara, Tony and Choromanska, Anna},
booktitle = {Neural Information Processing Systems},
year = {2012},
pages = {557-565},
url = {https://mlanthology.org/neurips/2012/jebara2012neurips-majorization/}
}