Recurrent Hidden Semi-Markov Model

Abstract

Segmentation and labeling of high-dimensional time series data has wide applications in behavior understanding and medical diagnosis. Because label information is difficult to obtain for high-dimensional data, achieving this objective in an unsupervised way is highly desirable. The Hidden Semi-Markov Model (HSMM) is a classical tool for this problem. However, the existing HSMM and its variants make simple conditional independence assumptions about observations, so their ability to capture the nonlinear and complex dynamics within segments is limited. To address this limitation, we propose to incorporate a Recurrent Neural Network (RNN) to model the generative process in the HSMM, resulting in the Recurrent HSMM (R-HSMM). To accelerate inference while preserving accuracy, we design a structured encoding function that mimics exact inference. By generalizing the penalty method to distribution space, we are able to train the model and the encoding function simultaneously. Empirical results show that the proposed R-HSMM achieves state-of-the-art performance on both synthetic and real-world datasets.
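The abstract describes a generative model in which an HSMM governs hidden states and segment durations while an RNN produces the observations within each segment. The sketch below illustrates that generative process in a minimal form; it is not the authors' implementation, and all parameters (number of states, Poisson durations, the tiny vanilla RNN) are hypothetical toy choices made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup (not from the paper): K hidden states, each with
# its own duration distribution and a small vanilla RNN that generates the
# scalar observations within a segment.
K, H = 3, 4                              # number of states, RNN hidden size
trans = np.array([[0.0, 0.5, 0.5],
                  [0.5, 0.0, 0.5],
                  [0.5, 0.5, 0.0]])      # no self-transitions in an HSMM
dur_lambda = np.array([3.0, 5.0, 2.0])   # mean segment duration per state

# One RNN per state: h_t = tanh(W h_{t-1} + U x_{t-1} + b), x_t = v.h_t + noise
W = rng.normal(0, 0.5, (K, H, H))
U = rng.normal(0, 0.5, (K, H))
b = rng.normal(0, 0.1, (K, H))
v = rng.normal(0, 0.5, (K, H))

def sample_rhsmm(T):
    """Sample a length-T sequence of (observation, state-label) pairs."""
    xs, zs = [], []
    z = rng.integers(K)                      # initial hidden state
    while len(xs) < T:
        d = 1 + rng.poisson(dur_lambda[z])   # segment duration (>= 1)
        h, x = np.zeros(H), 0.0              # reset the RNN at segment start
        for _ in range(d):
            h = np.tanh(W[z] @ h + U[z] * x + b[z])
            x = v[z] @ h + 0.1 * rng.normal()
            xs.append(x)
            zs.append(z)
            if len(xs) == T:
                break
        z = rng.choice(K, p=trans[z])        # jump to a different state
    return np.array(xs), np.array(zs)

obs, labels = sample_rhsmm(50)
```

Unsupervised learning in the paper amounts to inverting this process: recovering the segment boundaries and labels `zs` from the observations `xs` alone, with the structured encoding function approximating that exact inference step.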

Cite

Text

Dai et al. "Recurrent Hidden Semi-Markov Model." International Conference on Learning Representations, 2017.

Markdown

[Dai et al. "Recurrent Hidden Semi-Markov Model." International Conference on Learning Representations, 2017.](https://mlanthology.org/iclr/2017/dai2017iclr-recurrent/)

BibTeX

@inproceedings{dai2017iclr-recurrent,
  title     = {{Recurrent Hidden Semi-Markov Model}},
  author    = {Dai, Hanjun and Dai, Bo and Zhang, Yan-Ming and Li, Shuang and Song, Le},
  booktitle = {International Conference on Learning Representations},
  year      = {2017},
  url       = {https://mlanthology.org/iclr/2017/dai2017iclr-recurrent/}
}