Estimation of Markov Chain via Rank-Constrained Likelihood

Abstract

This paper studies the estimation of low-rank Markov chains from empirical trajectories. We propose a non-convex estimator based on rank-constrained likelihood maximization. Statistical upper bounds are provided for the Kullback-Leibler divergence and the $\ell_2$ risk between the estimator and the true transition matrix. The estimator reveals a compressed state space of the Markov chain. We also develop a novel DC (difference of convex functions) programming algorithm to tackle the rank-constrained non-smooth optimization problem. Convergence results are established. Experiments show that the proposed estimator achieves better empirical performance than other popular approaches.
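To make the setup concrete, below is a minimal illustrative sketch (not the paper's DC programming algorithm) of rank-constrained likelihood estimation for a Markov chain: it ascends the multinomial log-likelihood of observed transition counts and heuristically enforces the rank constraint by truncated SVD followed by row renormalization. The function name, step size, and iteration count are assumptions for illustration only.

```python
import numpy as np

def estimate_low_rank_markov(transitions, n_states, rank, n_iter=500, lr=0.1):
    """Illustrative sketch of rank-constrained likelihood estimation.

    NOT the DC programming algorithm from the paper; this is a simple
    projected-gradient heuristic: ascend the log-likelihood of the observed
    transition counts, then project onto (approximately) rank-`rank`
    row-stochastic matrices via truncated SVD and row renormalization.
    """
    # Empirical transition counts N[i, j] = #{t : x_t = i, x_{t+1} = j}.
    N = np.zeros((n_states, n_states))
    for (i, j) in transitions:
        N[i, j] += 1

    # Initialize at an additively smoothed empirical transition matrix.
    P = (N + 1.0) / (N + 1.0).sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        # Gradient of the log-likelihood sum_{ij} N_ij * log P_ij.
        grad = N / np.clip(P, 1e-12, None)
        P = P + lr * grad / max(1.0, N.sum())

        # Hard rank projection via truncated SVD.
        U, s, Vt = np.linalg.svd(P, full_matrices=False)
        P = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]

        # Map back toward row-stochastic matrices: clip and renormalize.
        # (Renormalization may slightly increase the rank, so the constraint
        # holds only approximately in this sketch.)
        P = np.clip(P, 1e-12, None)
        P = P / P.sum(axis=1, keepdims=True)

    return P
```

A typical call would pass a trajectory as consecutive state pairs, e.g. `estimate_low_rank_markov(list(zip(x[:-1], x[1:])), n_states=50, rank=3)`, where `x` is an observed state sequence encoded as integers.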

Cite

Text

Li et al. "Estimation of Markov Chain via Rank-Constrained Likelihood." International Conference on Machine Learning, 2018.

Markdown

[Li et al. "Estimation of Markov Chain via Rank-Constrained Likelihood." International Conference on Machine Learning, 2018.](https://mlanthology.org/icml/2018/li2018icml-estimation/)

BibTeX

@inproceedings{li2018icml-estimation,
  title     = {{Estimation of Markov Chain via Rank-Constrained Likelihood}},
  author    = {Li, Xudong and Wang, Mengdi and Zhang, Anru},
  booktitle = {International Conference on Machine Learning},
  year      = {2018},
  pages     = {3033--3042},
  volume    = {80},
  url       = {https://mlanthology.org/icml/2018/li2018icml-estimation/}
}