PAC-Bayes Generalisation Bounds for Heavy-Tailed Losses Through Supermartingales
Abstract
While PAC-Bayes is now an established learning framework for light-tailed losses (\emph{e.g.}, subgaussian or subexponential), its extension to the case of heavy-tailed losses remains largely uncharted and has attracted growing interest in recent years. We contribute PAC-Bayes generalisation bounds for heavy-tailed losses under the sole assumption of bounded variance of the loss function. Under that assumption, we extend previous results from \citet{kuzborskij2019efron}. Our key technical contribution is exploiting an extension of Markov's inequality for supermartingales. Our proof technique unifies and extends different PAC-Bayesian frameworks by providing bounds for unbounded martingales as well as bounds for batch and online learning with heavy-tailed losses.
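The "extension of Markov's inequality for supermartingales" mentioned above is commonly known as Ville's inequality. A standard statement (given here as background, not quoted from the paper) is:

Let $(Z_t)_{t \geq 0}$ be a non-negative supermartingale with respect to a filtration $(\mathcal{F}_t)_{t \geq 0}$. Then, for any $\lambda > 0$,
$$
\mathbb{P}\left(\exists t \geq 0 : Z_t \geq \lambda\right) \leq \frac{\mathbb{E}[Z_0]}{\lambda}.
$$

Unlike Markov's inequality, which controls a single random variable, Ville's inequality holds uniformly over all times $t$, which is what enables time-uniform (anytime-valid) PAC-Bayes bounds for batch and online learning.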
Cite

Text
Haddouche and Guedj. "PAC-Bayes Generalisation Bounds for Heavy-Tailed Losses Through Supermartingales." Transactions on Machine Learning Research, 2023.

Markdown
[Haddouche and Guedj. "PAC-Bayes Generalisation Bounds for Heavy-Tailed Losses Through Supermartingales." Transactions on Machine Learning Research, 2023.](https://mlanthology.org/tmlr/2023/haddouche2023tmlr-pacbayes/)

BibTeX
@article{haddouche2023tmlr-pacbayes,
title = {{PAC-Bayes Generalisation Bounds for Heavy-Tailed Losses Through Supermartingales}},
author = {Haddouche, Maxime and Guedj, Benjamin},
journal = {Transactions on Machine Learning Research},
year = {2023},
url = {https://mlanthology.org/tmlr/2023/haddouche2023tmlr-pacbayes/}
}