JUMP-Means: Small-Variance Asymptotics for Markov Jump Processes
Abstract
Markov jump processes (MJPs) are used to model a wide range of phenomena, from disease progression to RNA path folding. However, existing methods suffer from a number of shortcomings: degenerate trajectories in the case of maximum likelihood (ML) estimation of parametric models, and poor inferential performance in the case of nonparametric models. We take a small-variance asymptotics (SVA) approach to overcome these limitations. We derive the small-variance asymptotics for parametric and nonparametric MJPs, for both directly observed and hidden state models. In the parametric case we obtain a novel objective function that leads to non-degenerate trajectories. To derive the nonparametric version we introduce the gamma-gamma process, a novel extension of the gamma-exponential process. We propose algorithms for each of these formulations, which we call JUMP-means. Our experiments demonstrate that JUMP-means is competitive with or outperforms widely used MJP inference approaches in terms of both speed and reconstruction accuracy.
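For readers new to the model class, here is a minimal sketch of the MJP generative process and the SVA idea, in standard notation (the symbols below are illustrative and not taken from the paper). An MJP on a finite state space $\{1, \dots, K\}$ holds in its current state for an exponentially distributed time and then jumps according to transition probabilities:

\[
T \mid s = i \;\sim\; \mathrm{Exp}(\lambda_i), \qquad \Pr(\text{jump to } j \mid s = i) = \pi_{ij}, \quad j \neq i,
\]

where $\lambda_i > 0$ is the rate of leaving state $i$. In the hidden-state setting, observations are emitted noisily, e.g. $y \sim \mathcal{N}(\mu_s, \sigma^2)$; small-variance asymptotics studies MAP inference as $\sigma^2 \to 0$, under which the probabilistic objective collapses to a deterministic, k-means-like objective (hence the name JUMP-means).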
Cite

Text:

Huggins et al. "JUMP-Means: Small-Variance Asymptotics for Markov Jump Processes." International Conference on Machine Learning, 2015.

Markdown:

[Huggins et al. "JUMP-Means: Small-Variance Asymptotics for Markov Jump Processes." International Conference on Machine Learning, 2015.](https://mlanthology.org/icml/2015/huggins2015icml-jumpmeans/)

BibTeX:
@inproceedings{huggins2015icml-jumpmeans,
title = {{JUMP-Means: Small-Variance Asymptotics for Markov Jump Processes}},
author = {Huggins, Jonathan and Narasimhan, Karthik and Saeedi, Ardavan and Mansinghka, Vikash},
booktitle = {International Conference on Machine Learning},
year = {2015},
pages = {693--701},
volume = {37},
url = {https://mlanthology.org/icml/2015/huggins2015icml-jumpmeans/}
}