Minimax Posterior Contraction Rates for Unconstrained Distribution Estimation on $[0,1]^d$ Under Wasserstein Distance
Abstract
We obtain asymptotically minimax-optimal posterior contraction rates for the estimation of probability distributions on $[0,1]^d$ under the Wasserstein-$p$ metrics using Bayesian histograms. To the best of our knowledge, our analysis is the first to provide minimax posterior contraction rates for every $p \geq 1$ and problem dimension $d \geq 1$. Our proof technique takes advantage of the conjugacy of the Bayesian histogram.
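The conjugacy mentioned in the abstract refers to the standard Dirichlet-multinomial structure of a Bayesian histogram: a Dirichlet prior on the bin probabilities yields a Dirichlet posterior in closed form once the data are binned. The sketch below illustrates this on $[0,1]$; the bin count `m` and prior weight `a` are illustrative choices, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def bayesian_histogram_posterior(x, m=10, a=1.0):
    """Dirichlet posterior parameters for histogram bin probabilities.

    Prior: (p_1, ..., p_m) ~ Dirichlet(a, ..., a).
    Likelihood: the vector of bin counts is multinomial given the bin
    probabilities, so by conjugacy the posterior is
    Dirichlet(a + n_1, ..., a + n_m) in closed form.
    """
    counts, _ = np.histogram(x, bins=m, range=(0.0, 1.0))
    return a + counts  # posterior Dirichlet parameters

# n i.i.d. draws from the (here: uniform) data-generating distribution
x = rng.uniform(size=1000)
alpha_post = bayesian_histogram_posterior(x)
post_mean = alpha_post / alpha_post.sum()    # posterior mean bin probabilities
samples = rng.dirichlet(alpha_post, size=5)  # draws from the posterior
```

Posterior contraction is then studied by measuring how quickly such posterior draws, viewed as piecewise-constant densities, concentrate around the truth in Wasserstein-$p$ distance as the sample size grows.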
Cite
Jacobs et al. "Minimax Posterior Contraction Rates for Unconstrained Distribution Estimation on $[0,1]^d$ Under Wasserstein Distance." Transactions on Machine Learning Research, 2025. https://mlanthology.org/tmlr/2025/jacobs2025tmlr-minimax/
@article{jacobs2025tmlr-minimax,
title = {{Minimax Posterior Contraction Rates for Unconstrained Distribution Estimation on $[0,1]^d$ Under Wasserstein Distance}},
author = {Jacobs, Peter Matthew and Patel, Lekha and Bhattacharya, Anirban and Pati, Debdeep},
journal = {Transactions on Machine Learning Research},
year = {2025},
url = {https://mlanthology.org/tmlr/2025/jacobs2025tmlr-minimax/}
}