The Quotient Bayesian Learning Rule
Abstract
This paper introduces the Quotient Bayesian Learning Rule, an extension of natural-gradient Bayesian updates to probability models that fall outside the exponential family. Building on the observation that many heavy-tailed and otherwise non-exponential distributions arise as marginals of minimal exponential families, we prove that such marginals inherit a unique Fisher–Rao information geometry via the quotient-manifold construction. Exploiting this geometry, we derive the Quotient Natural Gradient algorithm, which takes steepest-descent steps in the well-structured covering space, thereby guaranteeing parameterization-invariant optimization in the target space. Empirical results on the Student-$t$ distribution confirm that our method converges more rapidly and attains higher-quality solutions than previous variants of the Bayesian Learning Rule. These findings position quotient geometry as a unifying tool for efficient and principled inference across a broad class of latent-variable models.
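The quotient construction itself is developed in the paper; as a rough illustration of the covering-space ingredient only, below is a minimal sketch (assuming NumPy; the function names are ours for illustration, not the authors') of a natural-gradient step in a minimal exponential family. It uses the standard fact that the Fisher matrix of an exponential family is the Hessian of the log-partition, so a natural-gradient step on the natural parameters is, to first order, a plain gradient step on the mean parameters. The Quotient Natural Gradient applies steps of this kind in a covering family (e.g. a Gaussian with Gamma-mixed precision, whose marginal is Student-$t$) and pushes them down to the quotient; that projection is omitted here.

import numpy as np

# Illustrative sketch only (not the paper's implementation):
# natural-gradient maximum-likelihood fitting of a 1-D Gaussian, the
# simplest minimal exponential family. Since the Fisher matrix is the
# Hessian of the log-partition A(eta), a natural-gradient step on the
# natural parameters eta amounts to an ordinary gradient step on the
# mean parameters m = E[T(x)] = grad A(eta).

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=1000)

# Sufficient statistics T(x) = (x, x^2); empirical moments of the data.
T_bar = np.array([data.mean(), (data ** 2).mean()])

def mean_params(eta):
    # m = grad A(eta) for the Gaussian: (mu, var + mu^2).
    mu = -eta[0] / (2.0 * eta[1])
    var = -1.0 / (2.0 * eta[1])
    return np.array([mu, var + mu ** 2])

def natural_params(m):
    # Inverse map: mean parameters back to eta = (mu/var, -1/(2 var)).
    mu, var = m[0], m[1] - m[0] ** 2
    return np.array([mu / var, -1.0 / (2.0 * var)])

eta = np.array([0.0, -0.5])  # start at the standard normal
lr = 0.1
for _ in range(200):
    m = mean_params(eta)
    # Natural-gradient ascent on the log-likelihood: in mean coordinates
    # this is a plain step towards the empirical moments.
    eta = natural_params(m + lr * (T_bar - m))

mu_hat, var_hat = mean_params(eta)[0], -1.0 / (2.0 * eta[1])
print(f"fitted mu = {mu_hat:.3f}, sigma = {var_hat ** 0.5:.3f}")  # ~ (2.0, 1.5)

Working in mean coordinates avoids forming and inverting the Fisher matrix explicitly, which is part of the computational appeal of natural-gradient Bayesian learning rules; the paper's contribution is extending this machinery past the exponential family via the quotient geometry.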
Cite
Text
Lukashchuk et al. "The Quotient Bayesian Learning Rule." Advances in Neural Information Processing Systems, 2025.
Markdown
[Lukashchuk et al. "The Quotient Bayesian Learning Rule." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/lukashchuk2025neurips-quotient/)
BibTeX
@inproceedings{lukashchuk2025neurips-quotient,
  title = {{The Quotient Bayesian Learning Rule}},
  author = {Lukashchuk, Mykola and Trésor, Raphaël and Nuijten, Wouter W. L. and Senoz, Ismail and de Vries, Bert},
  booktitle = {Advances in Neural Information Processing Systems},
  year = {2025},
  url = {https://mlanthology.org/neurips/2025/lukashchuk2025neurips-quotient/}
}