Gibbs Max-Margin Topic Models with Fast Sampling Algorithms

Abstract

Existing max-margin supervised topic models rely on an iterative procedure to solve multiple latent SVM subproblems under additional mean-field assumptions on the desired posterior distributions. This paper presents Gibbs max-margin supervised topic models, which minimize an expected margin loss, an upper bound of the existing margin loss derived from an expected prediction rule. By introducing augmented variables, we develop simple and fast Gibbs sampling algorithms for both classification and regression, with no restrictive assumptions and no need to solve SVM subproblems. Empirical results demonstrate significant improvements in time efficiency. The classification performance is also significantly improved over competitors.
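
As a reading aid for the two claims in the abstract, here is a minimal sketch in standard max-margin topic model notation; the symbols (margin cost \ell, label y, classifier weights \eta, mean topic assignment \bar{z}, regularization constant c, augmented variable \lambda) are assumptions for illustration and are not defined on this page. Because the hinge function \max(0,\cdot) is convex, Jensen's inequality gives

\[
\max\!\big(0,\ \ell - y\,\mathbb{E}_q[\eta^\top \bar{z}]\big)
\;\le\;
\mathbb{E}_q\!\big[\max\!\big(0,\ \ell - y\,\eta^\top \bar{z}\big)\big],
\]

so minimizing the expected margin loss (right-hand side) also controls the margin loss of the expected prediction rule (left-hand side). The augmented variables plausibly enter through the scale-mixture identity for the hinge loss (in the style of Polson and Scott's data augmentation for SVMs),

\[
e^{-2c\max(0,\zeta)}
\;=\;
\int_0^\infty \frac{1}{\sqrt{2\pi\lambda}}
\exp\!\Big(-\frac{(\lambda + c\zeta)^2}{2\lambda}\Big)\, d\lambda,
\qquad
\zeta = \ell - y\,\eta^\top \bar{z},
\]

which makes the hinge term Gaussian in \eta conditioned on \lambda, so all conditionals can be sampled in closed form without solving SVM subproblems.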

Cite

Text

Zhu et al. "Gibbs Max-Margin Topic Models with Fast Sampling Algorithms." International Conference on Machine Learning, 2013.

Markdown

[Zhu et al. "Gibbs Max-Margin Topic Models with Fast Sampling Algorithms." International Conference on Machine Learning, 2013.](https://mlanthology.org/icml/2013/zhu2013icml-gibbs/)

BibTeX

@inproceedings{zhu2013icml-gibbs,
  title     = {{Gibbs Max-Margin Topic Models with Fast Sampling Algorithms}},
  author    = {Zhu, Jun and Chen, Ning and Perkins, Hugh and Zhang, Bo},
  booktitle = {International Conference on Machine Learning},
  year      = {2013},
  pages     = {124--132},
  volume    = {28},
  url       = {https://mlanthology.org/icml/2013/zhu2013icml-gibbs/}
}