Mixed Memory Markov Models: Decomposing Complex Stochastic Processes as Mixtures of Simpler Ones

Abstract

We study Markov models whose state spaces arise from the Cartesian product of two or more discrete random variables. We show how to parameterize the transition matrices of these models as a convex combination—or mixture—of simpler dynamical models. The parameters in these models admit a simple probabilistic interpretation and can be fitted iteratively by an Expectation-Maximization (EM) procedure. We derive a set of generalized Baum-Welch updates for factorial hidden Markov models that make use of this parameterization. We also describe a simple iterative procedure for approximately computing the statistics of the hidden states. Throughout, we give examples where mixed memory models provide a useful representation of complex stochastic processes.
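The parameterization described in the abstract can be illustrated with a small sketch. Below is a minimal NumPy example (all parameter values are hypothetical, drawn at random for illustration) of a factorial transition model over a product state space: each component chain's next state depends on the previous states of all chains through a convex combination, weighted by mixture parameters `psi`, of simple per-pair transition matrices `a`.

```python
from itertools import product

import numpy as np

rng = np.random.default_rng(0)
n, k = 4, 3  # states per component chain, number of chains

# a[i, j, s, s'] = P(chain i moves to s' | chain j was in s); rows sum to 1.
# (hypothetical random parameters, for illustration only)
a = rng.dirichlet(np.ones(n), size=(k, k, n))
# psi[i, j] = mixture weight of source chain j for target chain i; rows sum to 1.
psi = rng.dirichlet(np.ones(k), size=k)

def transition_prob(prev, nxt):
    """P(x_t = nxt | x_{t-1} = prev) under the mixed-memory factorization.

    prev and nxt are k-tuples of component states. Each component's
    conditional is a convex combination over source chains, so the full
    transition matrix is never formed explicitly (it would be n^k x n^k).
    """
    p = 1.0
    for i in range(k):
        p *= sum(psi[i, j] * a[i, j, prev[j], nxt[i]] for j in range(k))
    return p

# Sanity check: the conditional distribution over next states is normalized.
prev = (0, 1, 2)
total = sum(transition_prob(prev, nxt) for nxt in product(range(n), repeat=k))
```

Because each component's convex combination is itself a normalized distribution, `total` sums to 1 while the model uses only O(k^2 n^2) parameters instead of the n^k-by-n^k entries of the full transition matrix.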

Cite

Text

Saul and Jordan. "Mixed Memory Markov Models: Decomposing Complex Stochastic Processes as Mixtures of Simpler Ones." Machine Learning, 1999. doi:10.1023/A:1007649326333

Markdown

[Saul and Jordan. "Mixed Memory Markov Models: Decomposing Complex Stochastic Processes as Mixtures of Simpler Ones." Machine Learning, 1999.](https://mlanthology.org/mlj/1999/saul1999mlj-mixed/) doi:10.1023/A:1007649326333

BibTeX

@article{saul1999mlj-mixed,
  title     = {{Mixed Memory Markov Models: Decomposing Complex Stochastic Processes as Mixtures of Simpler Ones}},
  author    = {Saul, Lawrence K. and Jordan, Michael I.},
  journal   = {Machine Learning},
  year      = {1999},
  pages     = {75--87},
  doi       = {10.1023/A:1007649326333},
  volume    = {37},
  url       = {https://mlanthology.org/mlj/1999/saul1999mlj-mixed/}
}