Infinite Markov-Switching Maximum Entropy Discrimination Machines

Abstract

In this paper, we present a method that combines the merits of Bayesian nonparametrics, specifically stick-breaking priors, and large-margin kernel machines in the context of sequential data classification. The proposed model postulates a (theoretically) infinite set of interdependent large-margin classifiers as model components that robustly capture the local nonlinearity of complex data. The postulated large-margin classifiers are connected through a Markov-switching construction that captures complex temporal dynamics in the modeled datasets. Appropriate stick-breaking priors are imposed over the component-switching mechanism of our model, allowing for data-driven determination of the optimal number of component large-margin classifiers under a standard nonparametric Bayesian inference scheme. Efficient model training is performed under the maximum entropy discrimination (MED) framework, which integrates the large-margin principle with Bayesian posterior inference. We evaluate our method on several real-world datasets and compare it to state-of-the-art alternatives.
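To make the nonparametric prior concrete, here is a minimal sketch (not the authors' code) of truncated stick-breaking, the kind of prior the abstract describes imposing over the component-switching mechanism so that the number of active large-margin classifiers is determined from the data. The truncation level `K` and concentration parameter `alpha` are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of a truncated stick-breaking (GEM) prior, of the kind used
# to weight the components of a Markov-switching mechanism; K and alpha are
# illustrative choices, not values from the paper.
import numpy as np

def stick_breaking_weights(alpha, K, seed=None):
    """Draw K truncated stick-breaking weights: pi_k = v_k * prod_{l<k}(1 - v_l)."""
    rng = np.random.default_rng(seed)
    v = rng.beta(1.0, alpha, size=K)  # Beta(1, alpha) stick fractions
    v[-1] = 1.0                       # exact truncation: weights sum to 1
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    return v * remaining

# Each row of the switching (transition) mechanism can receive such a prior;
# components whose weights collapse toward zero are effectively pruned.
pi = stick_breaking_weights(alpha=1.0, K=20, seed=0)
print(pi.round(3), pi.sum())          # sums to 1.0 under exact truncation
```

Under such a prior, posterior inference concentrates mass on a finite subset of the (theoretically) infinite component classifiers, which is how the model achieves data-driven selection of their number.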

Cite

Text

Chatzis. "Infinite Markov-Switching Maximum Entropy Discrimination Machines." International Conference on Machine Learning, 2013.

Markdown

[Chatzis. "Infinite Markov-Switching Maximum Entropy Discrimination Machines." International Conference on Machine Learning, 2013.](https://mlanthology.org/icml/2013/chatzis2013icml-infinite/)

BibTeX

@inproceedings{chatzis2013icml-infinite,
  title     = {{Infinite Markov-Switching Maximum Entropy Discrimination Machines}},
  author    = {Chatzis, Sotirios},
  booktitle = {International Conference on Machine Learning},
  year      = {2013},
  pages     = {729--737},
  volume    = {28},
  url       = {https://mlanthology.org/icml/2013/chatzis2013icml-infinite/}
}