Guiding Time-Varying Generative Models with Natural Gradients on Exponential Family Manifold
Abstract
The optimisation of probabilistic models is a well-studied topic in statistics. However, its connection to the training of generative models remains largely under-explored. In this paper, we show that the evolution of a time-varying generative model can be projected onto an exponential family manifold, naturally creating a link between the parameters of the generative model and those of a probabilistic model. We then train the generative model by moving its projection along the manifold according to a natural gradient descent scheme. This approach also allows us to efficiently approximate the natural gradient of the KL divergence without relying on MCMC for intractable models. Furthermore, we propose particle versions of the algorithm, which feature closed-form update rules for any parametric model within the exponential family. We validate the effectiveness of the proposed algorithms through toy and real-world experiments. The code for the proposed algorithms is available at https://github.com/anewgithubname/iNGD.
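To make the natural-gradient update concrete, below is a minimal, self-contained sketch (not the authors' iNGD implementation; see the linked repository) that fits a one-dimensional Gaussian exponential family to data by natural gradient descent on the KL divergence. It relies on standard exponential-family facts: the gradient of KL(p_data || p_theta) with respect to the natural parameters is the gap in expected sufficient statistics, and the Fisher information is their covariance under the model. All variable names, the learning rate, and the sample-based estimates are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "data" the model should match; in the paper this role is played by
# the time-varying generative model's samples, which is out of scope here.
x_data = rng.normal(loc=2.0, scale=0.5, size=5000)

# 1-D Gaussian exponential family with sufficient statistics T(x) = (x, x^2)
# and natural parameters theta = (mu / s2, -1 / (2 s2)).
def suff_stats(x):
    return np.stack([x, x ** 2], axis=1)          # shape (n, 2)

def sample_model(theta, n):
    s2 = -1.0 / (2.0 * theta[1])                  # variance
    mu = theta[0] * s2                            # mean
    return rng.normal(mu, np.sqrt(s2), size=n)

mu_star = suff_stats(x_data).mean(axis=0)         # E_{p_data}[T(x)]

theta = np.array([0.0, -0.5])                     # start at N(0, 1)
lr = 0.1
for _ in range(300):
    xs = sample_model(theta, 5000)
    T = suff_stats(xs)
    mu_theta = T.mean(axis=0)                     # E_{p_theta}[T(x)]
    fisher = np.cov(T, rowvar=False)              # Fisher = Cov_{p_theta}[T(x)]
    # grad_theta KL(p_data || p_theta) = mu_theta - mu_star, so the
    # natural gradient is Fisher^{-1} (mu_theta - mu_star).
    nat_grad = np.linalg.solve(fisher + 1e-6 * np.eye(2), mu_theta - mu_star)
    theta = theta - lr * nat_grad
    theta[1] = min(theta[1], -1e-3)               # keep variance positive (sketch-only safeguard)

s2 = -1.0 / (2.0 * theta[1])
print("fitted mean and std:", theta[0] * s2, np.sqrt(s2))
```

This sketch only illustrates the exponential-family natural-gradient step itself; the paper's contribution of projecting an intractable, time-varying generative model onto the manifold and approximating the natural gradient without MCMC is not reproduced here.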
Cite
Text
Liu et al. "Guiding Time-Varying Generative Models with Natural Gradients on Exponential Family Manifold." Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence, 2025.

Markdown
[Liu et al. "Guiding Time-Varying Generative Models with Natural Gradients on Exponential Family Manifold." Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence, 2025.](https://mlanthology.org/uai/2025/liu2025uai-guiding/)

BibTeX
@inproceedings{liu2025uai-guiding,
title = {{Guiding Time-Varying Generative Models with Natural Gradients on Exponential Family Manifold}},
author = {Liu, Song and Wang, Leyang and Wang, Yakun},
booktitle = {Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence},
year = {2025},
pages = {2786--2803},
volume = {286},
url = {https://mlanthology.org/uai/2025/liu2025uai-guiding/}
}