An Elementary Concentration Bound for Gibbs Measures Arising in Statistical Learning Theory

Abstract

We present an elementary concentration bound for Gibbs measures whose log-likelihood is a function of the empirical risk. The bound controls the distance between samples from the (random) Gibbs measure and the minimizers of the population risk, and it generalizes a recent inequality of Ramsay et al. (2024). As a corollary, we obtain sample complexity bounds, together with bounds on the inverse temperature, guaranteeing that the samples lie within a prescribed error of the population value. The latter bound on the inverse temperature is essentially sharp. We illustrate our results on three canonical classes of examples: classification of two-component mixture models, robust regression, and spiked matrix and tensor models.
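For concreteness, a minimal sketch of the kind of object the abstract describes, under assumed notation (the loss $\ell$, empirical risk $\widehat{R}_n$, reference measure $\pi_0$, and inverse temperature $\beta$ are illustrative and not taken from the paper): given data $X_1,\dots,X_n$, such a Gibbs measure typically takes the form

\[
  \pi_{n,\beta}(\mathrm{d}\theta) \;\propto\; \exp\!\bigl(-\beta\, n\, \widehat{R}_n(\theta)\bigr)\, \pi_0(\mathrm{d}\theta),
  \qquad
  \widehat{R}_n(\theta) \;=\; \frac{1}{n}\sum_{i=1}^{n} \ell(\theta; X_i).
\]

In this picture, the concentration bound quantifies how far a draw $\theta \sim \pi_{n,\beta}$ typically lies from the set of minimizers of the population risk $R(\theta) = \mathbb{E}\,\ell(\theta; X)$, with the sample size $n$ and the inverse temperature $\beta$ controlling the error.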

Cite

Text

Ramsay et al. "An Elementary Concentration Bound for Gibbs Measures Arising in Statistical Learning Theory." Transactions on Machine Learning Research, 2025.

Markdown

[Ramsay et al. "An Elementary Concentration Bound for Gibbs Measures Arising in Statistical Learning Theory." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/ramsay2025tmlr-elementary/)

BibTeX

@article{ramsay2025tmlr-elementary,
  title     = {{An Elementary Concentration Bound for Gibbs Measures Arising in Statistical Learning Theory}},
  author    = {Ramsay, Kelly and Jagannath, Aukosh and Chenouri, Shojaeddin},
  journal   = {Transactions on Machine Learning Research},
  year      = {2025},
  url       = {https://mlanthology.org/tmlr/2025/ramsay2025tmlr-elementary/}
}