Learning Hidden Quantum Markov Models

Abstract

Hidden Quantum Markov Models (HQMMs) can be thought of as quantum probabilistic graphical models for sequential data. We extend previous work on HQMMs with three contributions: (1) we show how classical hidden Markov models (HMMs) can be simulated on a quantum circuit, (2) we reformulate HQMMs by relaxing the constraints for modeling HMMs on quantum circuits, and (3) we present a learning algorithm to estimate the parameters of an HQMM from data. While our algorithm requires further optimization to handle larger datasets, we evaluate it on several synthetic datasets. We show that on HQMM-generated data, our algorithm learns HQMMs with the same number of hidden states and predictive accuracy as the true HQMMs, while HMMs learned with the Baum-Welch algorithm require more states to match the predictive accuracy.
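As background for the classical models the paper builds on, here is a minimal sketch (not from the paper) of the forward algorithm for evaluating a sequence's likelihood under a classical HMM; the parameter values are hypothetical toy numbers chosen for illustration:

```python
# Illustrative sketch: the forward algorithm for a classical HMM, the model
# class that HQMMs generalize. All parameter values below are toy examples.

def hmm_forward(pi, T, O, observations):
    """Return the likelihood of an observation sequence under an HMM.

    pi: initial state distribution, length n
    T:  transition matrix, T[i][j] = P(next state j | current state i)
    O:  observation matrix, O[i][k] = P(emit symbol k | state i)
    """
    n = len(pi)
    # Initialize the belief state with the first observation.
    alpha = [pi[i] * O[i][observations[0]] for i in range(n)]
    # Propagate beliefs forward through the remaining observations.
    for obs in observations[1:]:
        alpha = [
            O[j][obs] * sum(alpha[i] * T[i][j] for i in range(n))
            for j in range(n)
        ]
    return sum(alpha)

# Toy 2-state HMM with 2 output symbols.
pi = [0.6, 0.4]
T = [[0.7, 0.3], [0.4, 0.6]]
O = [[0.9, 0.1], [0.2, 0.8]]
print(hmm_forward(pi, T, O, [0, 1, 0]))
```

An HQMM replaces this vector of state probabilities with a quantum density matrix and the stochastic transition/emission matrices with quantum operations, which is what allows the paper's models to match HMM accuracy with fewer hidden states.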

Cite

Text

Srinivasan et al. "Learning Hidden Quantum Markov Models." International Conference on Artificial Intelligence and Statistics, 2018.

Markdown

[Srinivasan et al. "Learning Hidden Quantum Markov Models." International Conference on Artificial Intelligence and Statistics, 2018.](https://mlanthology.org/aistats/2018/srinivasan2018aistats-learning/)

BibTeX

@inproceedings{srinivasan2018aistats-learning,
  title     = {{Learning Hidden Quantum Markov Models}},
  author    = {Srinivasan, Siddarth and Gordon, Geoffrey J. and Boots, Byron},
  booktitle = {International Conference on Artificial Intelligence and Statistics},
  year      = {2018},
  pages     = {1979--1987},
  url       = {https://mlanthology.org/aistats/2018/srinivasan2018aistats-learning/}
}