Controlling Multiple Errors Simultaneously with a PAC-Bayes Bound

Abstract

Current PAC-Bayes generalisation bounds are restricted to scalar metrics of performance, such as the loss or error rate. However, one ideally wants more information-rich certificates that control the entire distribution of possible outcomes, such as the distribution of the test loss in regression, or the probabilities of different mis-classifications. We provide the first PAC-Bayes bound capable of providing such rich information by bounding the Kullback-Leibler divergence between the empirical and true probabilities of a set of $M$ error types, which can be either discretised loss values for regression or the elements of the confusion matrix (or a partition thereof) for classification. We transform our bound into a differentiable training objective. Our bound is especially useful in cases where the severity of different mis-classifications may change over time; existing PAC-Bayes bounds can only control a particular, pre-decided weighting of the error types. In contrast, our bound implicitly controls all of the uncountably many weightings simultaneously.
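
To make the abstract's central object concrete, here is a minimal sketch (not the authors' code): given the empirical frequencies $\hat{p}$ of $M$ error types, a certificate of the form $\mathrm{kl}(\hat{p} \,\|\, p) \le B$ confines the true error-type distribution $p$ to a KL ball around $\hat{p}$, and the worst-case value of any weighted risk over that ball can be computed numerically. The bound value `B`, the empirical frequencies, and the weightings below are hypothetical placeholders; in the paper, the certified KL radius is derived from the PAC-Bayes ingredients (prior/posterior KL, sample size, confidence level).

```python
# A minimal sketch (not the authors' released code) of the objects the
# abstract refers to: the empirical distribution over M error types, the
# categorical KL divergence kl(p_hat || q), and the worst-case weighted risk
# over all "true" distributions q consistent with kl(p_hat || q) <= B.
import numpy as np
from scipy.optimize import minimize


def categorical_kl(p_hat: np.ndarray, q: np.ndarray) -> float:
    """KL divergence kl(p_hat || q) between categorical distributions."""
    mask = p_hat > 0  # terms with p_hat[m] == 0 contribute 0 by convention
    return float(np.sum(p_hat[mask] * np.log(p_hat[mask] / q[mask])))


def worst_case_weighted_risk(p_hat: np.ndarray,
                             weights: np.ndarray,
                             bound: float) -> float:
    """Maximise sum_m weights[m] * q[m] over the probability simplex,
    subject to kl(p_hat || q) <= bound. This is how a single KL
    certificate yields a guarantee for *any* weighting of error types."""
    M = len(p_hat)
    constraints = [
        {"type": "eq", "fun": lambda q: np.sum(q) - 1.0},
        {"type": "ineq", "fun": lambda q: bound - categorical_kl(p_hat, q)},
    ]
    result = minimize(
        lambda q: -np.dot(weights, q),  # maximise the weighted risk
        x0=p_hat,                       # feasible start: kl(p_hat||p_hat) = 0
        bounds=[(1e-9, 1.0)] * M,
        constraints=constraints,
        method="SLSQP",
    )
    return -result.fun


# Example: M = 3 error types (e.g. cells of a partitioned confusion matrix).
p_hat = np.array([0.90, 0.07, 0.03])  # empirical error-type frequencies
B = 0.05                              # hypothetical certified KL radius

# Two different severity weightings, both covered by the same certificate
# (weight 0 on the "correct" outcome, different penalties on the two errors).
print(worst_case_weighted_risk(p_hat, np.array([0.0, 1.0, 1.0]), B))
print(worst_case_weighted_risk(p_hat, np.array([0.0, 1.0, 5.0]), B))
```

The last two calls illustrate the abstract's closing claim: a single certificate on the full error-type distribution simultaneously yields a guarantee for every choice of severity weights, including weightings chosen only after training.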

Cite

Text

Adams et al. "Controlling Multiple Errors Simultaneously with a PAC-Bayes Bound." Neural Information Processing Systems, 2024. doi:10.52202/079017-0172

Markdown

[Adams et al. "Controlling Multiple Errors Simultaneously with a PAC-Bayes Bound." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/adams2024neurips-controlling/) doi:10.52202/079017-0172

BibTeX

@inproceedings{adams2024neurips-controlling,
  title     = {{Controlling Multiple Errors Simultaneously with a PAC-Bayes Bound}},
  author    = {Adams, Reuben and Shawe-Taylor, John and Guedj, Benjamin},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-0172},
  url       = {https://mlanthology.org/neurips/2024/adams2024neurips-controlling/}
}