Max-Margin Infinite Hidden Markov Models

Abstract

Infinite hidden Markov models (iHMMs) are nonparametric Bayesian extensions of hidden Markov models (HMMs) with an infinite number of states. Though flexible in describing sequential data, the generative formulation of iHMMs could limit their discriminative ability in sequential prediction tasks. Our paper introduces max-margin infinite HMMs (M2iHMMs), new infinite HMMs that leverage the max-margin principle for discriminative learning. Using the theory of Gibbs classifiers and data augmentation, we develop efficient beam sampling algorithms without making restrictive mean-field assumptions or truncated approximations. For single variate classification, M2iHMMs reduce to a new formulation of DP mixtures of max-margin machines. Empirical results on synthetic and real data sets show that our methods outperform competing approaches in both single variate classification and sequential prediction tasks.
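The beam sampling mentioned in the abstract builds on the slice-variable construction of Van Gael et al. (2008): auxiliary variables u_t truncate the infinite transition matrix so that an exact forward-filtering, backward-sampling sweep runs over only finitely many states, with no mean-field factorization or fixed truncation level. The sketch below is a minimal, hypothetical illustration of that inference idea over an already-instantiated state set; the names `beam_sweep`, `pi`, and `emit` are our own, it is not code from the paper, and it omits the max-margin (Gibbs-classifier) component and the stick-breaking step that instantiates new states.

```python
import numpy as np

rng = np.random.default_rng(0)

def beam_sweep(pi, emit, obs, states):
    """One beam-sampling sweep over hidden states of an (i)HMM.

    Hypothetical sketch of slice sampling (Van Gael et al., 2008), not the
    paper's implementation. A full iHMM sampler would also extend `pi` with
    new states from the stick-breaking prior whenever a slice demands it.

    pi     : (K, K) row-stochastic transition matrix over instantiated states
    emit   : (K, V) emission probabilities over a discrete alphabet
    obs    : int array of observations, length T
    states : previous state path, length T (needed to draw the slices)
    """
    T, K = len(obs), pi.shape[0]
    # Slice (beam) variables: u_t ~ Uniform(0, pi[s_{t-1}, s_t]); conditioned
    # on u_t, the transition factor reduces to the indicator pi > u_t.
    prev = np.concatenate(([states[0]], states[:-1]))
    u = rng.uniform(0, pi[prev, states])
    # Forward filtering over the finitely many transitions the slices allow.
    alpha = np.zeros((T, K))
    alpha[0] = emit[:, obs[0]]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        reachable = pi > u[t]                 # truncation induced by slice t
        alpha[t] = emit[:, obs[t]] * (alpha[t - 1] @ reachable)
        alpha[t] /= alpha[t].sum()
    # Backward sampling of a fresh state path.
    new = np.empty(T, dtype=int)
    new[-1] = rng.choice(K, p=alpha[-1])
    for t in range(T - 2, -1, -1):
        w = alpha[t] * (pi[:, new[t + 1]] > u[t + 1])
        new[t] = rng.choice(K, p=w / w.sum())
    return new

# Toy usage: three instantiated states, binary alphabet.
pi = np.array([[0.8, 0.1, 0.1],
               [0.1, 0.8, 0.1],
               [0.1, 0.1, 0.8]])
emit = np.array([[0.9, 0.1],
                 [0.5, 0.5],
                 [0.1, 0.9]])
obs = np.array([0, 0, 1, 1, 1, 0])
states = np.zeros(len(obs), dtype=int)        # arbitrary initial path
for _ in range(10):
    states = beam_sweep(pi, emit, obs, states)
print(states)
```

Because the slices make each sweep touch only transitions with pi > u_t, the per-sweep cost stays finite even when the prior allows infinitely many states, which is why the abstract can claim inference without truncated approximations.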

Cite

Text

Zhang et al. "Max-Margin Infinite Hidden Markov Models." International Conference on Machine Learning, 2014.

Markdown

[Zhang et al. "Max-Margin Infinite Hidden Markov Models." International Conference on Machine Learning, 2014.](https://mlanthology.org/icml/2014/zhang2014icml-maxmargin/)

BibTeX

@inproceedings{zhang2014icml-maxmargin,
  title     = {{Max-Margin Infinite Hidden Markov Models}},
  author    = {Zhang, Aonan and Zhu, Jun and Zhang, Bo},
  booktitle = {International Conference on Machine Learning},
  year      = {2014},
  pages     = {315--323},
  volume    = {32},
  url       = {https://mlanthology.org/icml/2014/zhang2014icml-maxmargin/}
}