Mixed Memory Markov Models

Abstract

We consider how to parameterize Markov models with prohibitively large state spaces. This is done by representing the transition matrix as a convex combination, or mixture, of simpler dynamical models. The parameters in these models admit a simple probabilistic interpretation and can be fitted iteratively by an Expectation-Maximization (EM) procedure. We give examples where these models may be a faithful and/or useful representation of the underlying dynamics. We also derive a set of generalized Baum-Welch updates for hidden Markov models (HMMs) that make use of this parameterization. Because these models decompose the hidden state as the Cartesian product of two or more random variables, they are well suited to the modeling of coupled time series.
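The core idea of the abstract, writing a higher-order transition probability as a convex combination of simpler first-order transition matrices, can be sketched as follows. This is a minimal illustration with made-up parameters (the mixture weights `psi` and the per-lag matrices `A` are hypothetical, not fitted by the paper's EM procedure):

```python
import numpy as np

# Sketch of a mixed memory Markov model: the k-th order transition
# probability is a convex combination of k first-order matrices,
#   P(x_t | x_{t-1}, ..., x_{t-k}) = sum_mu psi[mu] * A[mu][x_{t-mu-1}, x_t]
# so the parameter count grows linearly, not exponentially, in k.

rng = np.random.default_rng(0)
n_states, order = 4, 2

psi = np.array([0.7, 0.3])                   # mixture weights (sum to 1)
A = rng.random((order, n_states, n_states))  # one matrix per memory lag
A /= A.sum(axis=2, keepdims=True)            # each row is a distribution

def next_state_dist(history):
    """Distribution over x_t given history = [x_{t-1}, ..., x_{t-k}]."""
    return sum(psi[m] * A[m][history[m]] for m in range(order))

def sample_sequence(length):
    """Draw `length` states after a random initial context of size `order`."""
    seq = list(rng.integers(0, n_states, size=order))
    for _ in range(length):
        hist = seq[-order:][::-1]            # most recent state first
        seq.append(int(rng.choice(n_states, p=next_state_dist(hist))))
    return seq
```

Because each row of every `A[mu]` is a distribution and the weights `psi` are convex, `next_state_dist` is automatically a valid distribution; fitting `psi` and `A` by EM (as in the paper) preserves this structure.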

Cite

Text

Saul and Jordan. "Mixed Memory Markov Models." Proceedings of the Sixth International Workshop on Artificial Intelligence and Statistics, 1997.

Markdown

[Saul and Jordan. "Mixed Memory Markov Models." Proceedings of the Sixth International Workshop on Artificial Intelligence and Statistics, 1997.](https://mlanthology.org/aistats/1997/saul1997aistats-mixed/)

BibTeX

@inproceedings{saul1997aistats-mixed,
  title     = {{Mixed Memory Markov Models}},
  author    = {Saul, Lawrence K. and Jordan, Michael I.},
  booktitle = {Proceedings of the Sixth International Workshop on Artificial Intelligence and Statistics},
  year      = {1997},
  pages     = {437-444},
  volume    = {R1},
  url       = {https://mlanthology.org/aistats/1997/saul1997aistats-mixed/}
}