A Recurrent Markov State-Space Generative Model for Sequences
Abstract
While the Hidden Markov Model (HMM) is a versatile generative model of sequences capable of performing many exact inferences efficiently, it is not suited for capturing complex long-term structure in the data. Advanced state-space models based on Deep Neural Networks (DNN) overcome this limitation but cannot perform exact inferences. In this article, we present a new generative model for sequences that combines both aspects: the ability to perform exact inferences and the ability to model long-term structure. It does so by augmenting the HMM with a deterministic, continuous state variable modeled through a Recurrent Neural Network. We empirically study the performance of the model on (i) synthetic data, comparing it to the HMM, (ii) a supervised learning task in bioinformatics, where it outperforms two DNN-based regressors, and (iii) the generative modeling of music, where it outperforms many prominent DNN-based generative models.
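The key idea described in the abstract, conditioning an HMM on a deterministic RNN state while keeping exact inference tractable, can be illustrated with a minimal sketch. The sketch below is an assumption-laden toy (all sizes, parameter names, and the choice to condition only the emissions on the RNN state are illustrative, not the paper's exact parameterization): because the RNN state evolves deterministically from past observations, the forward algorithm still marginalizes the discrete state exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

K = 3  # number of discrete HMM states (illustrative choice)
H = 8  # RNN hidden size (illustrative choice)
V = 5  # observation vocabulary size (illustrative choice)

# Hypothetical, randomly initialized parameters for illustration.
W_h = rng.normal(scale=0.1, size=(H, H))   # RNN recurrence
W_x = rng.normal(scale=0.1, size=(H, V))   # RNN input weights
W_e = rng.normal(scale=0.1, size=(K, V, H))  # RNN state -> per-state emission logits
A = np.full((K, K), 1.0 / K)               # transition matrix
pi = np.full(K, 1.0 / K)                   # initial distribution


def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)


def log_likelihood(obs):
    """Exact log p(obs) via the forward algorithm.

    The continuous state h_t is a deterministic function of past
    observations, so summing out the discrete state stays tractable.
    """
    h = np.zeros(H)
    alpha = pi.copy()
    ll = 0.0
    for t, o in enumerate(obs):
        # Emission distributions conditioned on the deterministic RNN state.
        B = softmax(W_e @ h)  # shape (K, V)
        if t > 0:
            alpha = A.T @ alpha
        alpha = alpha * B[:, o]
        norm = alpha.sum()
        ll += np.log(norm)
        alpha /= norm  # rescale to avoid underflow
        # Deterministic RNN update on the observed symbol.
        h = np.tanh(W_h @ h + W_x @ np.eye(V)[o])
    return ll


seq = [0, 2, 1, 4, 3]
print(log_likelihood(seq))
```

Because each factor absorbed into the forward recursion is a proper probability, the returned log-likelihood is finite and negative; gradients of this quantity with respect to the RNN weights could then drive maximum-likelihood training.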
Cite
Text
Ramachandran et al. "A Recurrent Markov State-Space Generative Model for Sequences." Artificial Intelligence and Statistics, 2019.
Markdown
[Ramachandran et al. "A Recurrent Markov State-Space Generative Model for Sequences." Artificial Intelligence and Statistics, 2019.](https://mlanthology.org/aistats/2019/ramachandran2019aistats-recurrent/)
BibTeX
@inproceedings{ramachandran2019aistats-recurrent,
title = {{A Recurrent Markov State-Space Generative Model for Sequences}},
author = {Ramachandran, Anand and Lumetta, Steve and Klee, Eric and Chen, Deming},
booktitle = {Artificial Intelligence and Statistics},
year = {2019},
pages = {3070--3079},
volume = {89},
url = {https://mlanthology.org/aistats/2019/ramachandran2019aistats-recurrent/}
}