Distribution Estimation Under the Infinity Norm
Abstract
We present novel bounds for estimating discrete probability distributions under the $\ell_\infty$ norm. These are nearly optimal in various precise senses, including a kind of instance-optimality. Our data-dependent convergence guarantees for the maximum likelihood estimator significantly improve upon the currently known results. We employ and extend a variety of techniques, including Chernoff-type inequalities and empirical Bernstein bounds. We illustrate our results in synthetic and real-world experiments. Finally, we apply our proposed framework to a basic selective inference problem, where we estimate the most frequent probabilities in a sample.
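As a hedged illustration of the setting (not code from the paper): for a discrete distribution, the maximum likelihood estimator is the vector of empirical frequencies, and its $\ell_\infty$ error is the largest absolute deviation over all symbols. The true distribution and sample size below are arbitrary choices for demonstration.

```python
import numpy as np

# Hypothetical example (not from the paper): measure the ell-infinity
# error of the MLE (empirical frequencies) for a discrete distribution.
rng = np.random.default_rng(0)

p = np.array([0.5, 0.3, 0.15, 0.05])   # assumed true distribution
n = 10_000                              # assumed sample size

counts = rng.multinomial(n, p)          # draw an i.i.d. sample of size n
p_hat = counts / n                      # MLE: empirical frequencies

linf_error = np.max(np.abs(p_hat - p))  # ell-infinity distance to truth
print(linf_error)
```

For a well-spread distribution like this one, the $\ell_\infty$ error of the MLE concentrates at roughly the $O(1/\sqrt{n})$ scale of the largest-variance coordinate.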
Cite
Text
Kontorovich and Painsky. "Distribution Estimation Under the Infinity Norm." Journal of Machine Learning Research, 2025.
Markdown
[Kontorovich and Painsky. "Distribution Estimation Under the Infinity Norm." Journal of Machine Learning Research, 2025.](https://mlanthology.org/jmlr/2025/kontorovich2025jmlr-distribution/)
BibTeX
@article{kontorovich2025jmlr-distribution,
title = {{Distribution Estimation Under the Infinity Norm}},
author = {Kontorovich, Aryeh and Painsky, Amichai},
journal = {Journal of Machine Learning Research},
year = {2025},
pages = {1--30},
volume = {26},
url = {https://mlanthology.org/jmlr/2025/kontorovich2025jmlr-distribution/}
}