Linear-Time Inference in Hierarchical HMMs
Abstract
The hierarchical hidden Markov model (HHMM) is a generalization of the hidden Markov model (HMM) that models sequences with structure at many length/time scales [FST98]. Unfortunately, the original inference algorithm is rather complicated, and takes O(T³) time, where T is the length of the sequence, making it impractical for many domains. In this paper, we show how HHMMs are a special kind of dynamic Bayesian network (DBN), and thereby derive a much simpler inference algorithm, which requires only O(T) time. Furthermore, by drawing the connection between HHMMs and DBNs, we enable the application of many standard approximation techniques to further speed up inference.
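As a rough illustration of why inference can be linear in T (this is not the paper's algorithm, which works directly on the DBN representation of the HHMM): if the hierarchical state at each time step is collapsed into a single composite state, a standard scaled forward pass costs O(T·K²), where K is the number of composite states. The Python sketch below uses toy, hypothetical parameters (pi, A, B, obs) purely to show the per-step O(K²) update and the single O(T) sweep over the sequence.

import numpy as np

def forward_loglik(pi, A, B, obs):
    """Log-likelihood of an observation sequence under a (flattened) HMM.

    pi  : (K,)   initial state distribution
    A   : (K, K) transition matrix, A[i, j] = P(next = j | current = i)
    B   : (K, M) emission matrix,  B[k, o] = P(obs = o | state = k)
    obs : (T,)   integer observation sequence
    """
    alpha = pi * B[:, obs[0]]          # unnormalized filtered distribution at t = 0
    loglik = 0.0
    for t in range(1, len(obs)):       # one O(K^2) update per time step -> O(T K^2) total
        c = alpha.sum()                # scaling constant (avoids numerical underflow)
        loglik += np.log(c)
        alpha = (alpha / c) @ A * B[:, obs[t]]
    loglik += np.log(alpha.sum())
    return loglik

# Toy usage: 3 composite states, 2 observation symbols (all values hypothetical).
pi = np.array([0.6, 0.3, 0.1])
A = np.array([[0.8, 0.1, 0.1],
              [0.2, 0.7, 0.1],
              [0.3, 0.3, 0.4]])
B = np.array([[0.9, 0.1],
              [0.5, 0.5],
              [0.1, 0.9]])
obs = np.array([0, 0, 1, 0, 1, 1])
print(forward_loglik(pi, A, B, obs))

The point of the sketch is only the scaling behavior: the work grows linearly with the sequence length, in contrast to the cubic cost of the original HHMM inference procedure. The paper's contribution is obtaining this linear scaling without naively flattening the hierarchy, by exploiting the DBN structure.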
Cite
Text
Murphy and Paskin. "Linear-Time Inference in Hierarchical HMMs." Neural Information Processing Systems, 2001.

Markdown
[Murphy and Paskin. "Linear-Time Inference in Hierarchical HMMs." Neural Information Processing Systems, 2001.](https://mlanthology.org/neurips/2001/murphy2001neurips-lineartime/)

BibTeX
@inproceedings{murphy2001neurips-lineartime,
title = {{Linear-Time Inference in Hierarchical HMMs}},
author = {Murphy, Kevin P. and Paskin, Mark A.},
booktitle = {Neural Information Processing Systems},
year = {2001},
pages = {833-840},
url = {https://mlanthology.org/neurips/2001/murphy2001neurips-lineartime/}
}