Latent Optimal Paths by Gumbel Propagation for Variational Bayesian Dynamic Programming
Abstract
We propose the stochastic optimal path, a probability-softened solution to the classical optimal path problem. This unified approach transforms a wide range of DP problems into directed acyclic graphs in which all paths follow a Gibbs distribution. We show the equivalence of the Gibbs distribution to a message-passing algorithm via the properties of the Gumbel distribution, and provide all the ingredients required for variational Bayesian inference of a latent path, namely Bayesian dynamic programming (BDP). We demonstrate the usage of BDP in the latent space of variational autoencoders (VAEs) and propose the BDP-VAE, which captures structured sparse optimal paths as latent variables. This enables end-to-end training for generative tasks in which models rely on unobserved structural information. Finally, we validate the behavior of our approach and showcase its applicability in two real-world applications: text-to-speech and singing voice synthesis. Our implementation code is available at https://github.com/XinleiNIU/LatentOptimalPathsBayesianDP.
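To make the core idea concrete, here is a minimal sketch (not the authors' implementation) of sampling a path in a DAG from a Gibbs distribution over all source-to-sink paths. A backward log-sum-exp pass computes the log-partition function at each node, and a forward pass then samples each transition with probability proportional to exp(edge weight + downstream log-partition); the graph, node names, and `temperature` parameter below are illustrative assumptions.

```python
import math
import random
from collections import defaultdict

def gibbs_path_sampler(edges, source, sink, temperature=1.0):
    """Sample a source-to-sink path in a DAG with probability proportional to
    exp(path score / temperature), i.e. a Gibbs distribution over paths.

    edges: dict mapping (u, v) -> edge weight; the graph must be acyclic and
    every node reachable from `source` is assumed to reach `sink`.
    Returns (path, log_partition_at_source).
    """
    succ = defaultdict(list)
    for (u, v), w in edges.items():
        succ[u].append((v, w / temperature))

    # Backward pass: logZ[u] = log sum over paths u -> sink of exp(score),
    # computed by message passing (log-sum-exp recursion) on the DAG.
    logZ = {}
    def backward(u):
        if u in logZ:
            return logZ[u]
        if u == sink:
            logZ[u] = 0.0
        else:
            terms = [w + backward(v) for v, w in succ[u]]
            m = max(terms)
            logZ[u] = m + math.log(sum(math.exp(t - m) for t in terms))
        return logZ[u]
    backward(source)

    # Forward pass: at each node, choose the next edge with probability
    # proportional to exp(w + logZ[next]); this yields an exact sample
    # from the Gibbs distribution over complete paths.
    path, u = [source], source
    while u != sink:
        opts = [(v, w + logZ[v]) for v, w in succ[u]]
        m = max(s for _, s in opts)
        probs = [math.exp(s - m) for _, s in opts]
        r = random.random() * sum(probs)
        for (v, _), p in zip(opts, probs):
            r -= p
            if r <= 0:
                break
        path.append(v)
        u = v
    return path, logZ[source]
```

On a diamond graph where both paths have equal total weight, each is sampled with probability 1/2, and the returned log-partition matches the brute-force log-sum of path scores. As the temperature goes to zero, the distribution concentrates on the classical optimal path.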
Cite
Text
Niu et al. "Latent Optimal Paths by Gumbel Propagation for Variational Bayesian Dynamic Programming." International Conference on Machine Learning, 2024.

Markdown

[Niu et al. "Latent Optimal Paths by Gumbel Propagation for Variational Bayesian Dynamic Programming." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/niu2024icml-latent/)

BibTeX
@inproceedings{niu2024icml-latent,
title = {{Latent Optimal Paths by Gumbel Propagation for Variational Bayesian Dynamic Programming}},
author = {Niu, Xinlei and Walder, Christian and Zhang, Jing and Martin, Charles Patrick},
booktitle = {International Conference on Machine Learning},
year = {2024},
pages = {38316--38343},
volume = {235},
url = {https://mlanthology.org/icml/2024/niu2024icml-latent/}
}