Rényi Divergence in Hidden Markov Models

Abstract

In this paper, we examine the existence of the Rényi divergence between two time-invariant hidden Markov models with arbitrary positive initial distributions. By making use of a Markov chain representation of the probability distribution of the hidden Markov model and the eigenvalue of the associated Markovian operator, we obtain, under some regularity conditions, convergence of the Rényi divergence. Using this device, we also characterize the Rényi divergence and obtain the Kullback–Leibler divergence as the limit of the Rényi divergence as $\alpha \rightarrow 1$. Several examples, including classical finite-state hidden Markov models, Markov switching models, and recurrent neural networks, are given for illustration. Moreover, we develop a non-Monte Carlo method that computes the Rényi divergence of two-state Markov switching models via the underlying invariant probability measure, which is characterized by a Fredholm integral equation.
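For orientation, the sketch below illustrates the basic quantity the paper studies, in the simplest setting of two finite discrete distributions (not the HMM case the paper treats): the Rényi divergence of order $\alpha$, and the fact noted in the abstract that it recovers the Kullback–Leibler divergence in the limit $\alpha \rightarrow 1$. The function name and example distributions are illustrative, not from the paper.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(p || q) between finite discrete
    distributions p and q (assumed strictly positive).

    For alpha != 1:  D_alpha = log( sum_i p_i^alpha * q_i^(1-alpha) ) / (alpha - 1)
    For alpha == 1:  the limit is the Kullback-Leibler divergence.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if np.isclose(alpha, 1.0):
        # alpha -> 1 limit: Kullback-Leibler divergence sum_i p_i log(p_i / q_i)
        return float(np.sum(p * np.log(p / q)))
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

p = np.array([0.7, 0.3])
q = np.array([0.4, 0.6])

kl = renyi_divergence(p, q, 1.0)          # KL divergence
near_one = renyi_divergence(p, q, 1.0001) # Rényi divergence just above alpha = 1
# near_one is close to kl, illustrating the alpha -> 1 limit
```

For hidden Markov models no such closed form exists in general, which is why the paper works with a Markov chain representation and, for two-state Markov switching models, a Fredholm integral equation for the invariant measure.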

Cite

Text

Fuh et al. "Rényi Divergence in Hidden Markov Models." Machine Learning, 2025. doi:10.1007/s10994-025-06872-4

Markdown

[Fuh et al. "Rényi Divergence in Hidden Markov Models." Machine Learning, 2025.](https://mlanthology.org/mlj/2025/fuh2025mlj-renyi/) doi:10.1007/s10994-025-06872-4

BibTeX

@article{fuh2025mlj-renyi,
  title     = {{Rényi Divergence in Hidden Markov Models}},
  author    = {Fuh, Cheng-Der and Fuh, Su-Chi and Liu, Yuan-Chen and Wang, Chuan-Ju},
  journal   = {Machine Learning},
  year      = {2025},
  pages     = {232},
  doi       = {10.1007/s10994-025-06872-4},
  volume    = {114},
  url       = {https://mlanthology.org/mlj/2025/fuh2025mlj-renyi/}
}