SAM as an Optimal Relaxation of Bayes
Abstract
Sharpness-aware minimization (SAM) and related adversarial deep-learning methods can drastically improve generalization, but their underlying mechanisms are not yet fully understood. Here, we establish SAM as a relaxation of the Bayes objective where the expected negative-loss is replaced by the optimal convex lower bound, obtained by using the so-called Fenchel biconjugate. The connection enables a new Adam-like extension of SAM to automatically obtain reasonable uncertainty estimates, while sometimes also improving its accuracy. By connecting adversarial and Bayesian methods, our work opens a new path to robustness.
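To make the abstract's claim concrete, the sketch below writes out the two objectives it relates. The notation (\ell for the training loss, q for a posterior with mean w, p for a prior, \rho for the perturbation radius) is my own and not quoted from the paper; the precise form of the bound is given in the paper itself.

```latex
% Sketch only: notation (\ell, q, p, w, \rho) is assumed, not quoted from the paper.
\begin{align}
  % Variational (generalized) Bayes objective over a posterior q:
  &\min_{q} \; \mathbb{E}_{\theta \sim q}\!\left[\ell(\theta)\right] + D_{\mathrm{KL}}(q \,\|\, p) \\
  % SAM objective over a point estimate w with perturbation radius \rho:
  &\min_{w} \; \max_{\|\epsilon\|_2 \le \rho} \; \ell(w + \epsilon)
\end{align}
% The abstract's statement: replacing the expected negative loss
% \mathbb{E}_{q}[-\ell(\theta)] in the first objective by its optimal
% convex lower bound (its Fenchel biconjugate) yields the inner
% maximization of the second.
```

For context, here is a minimal sketch of the standard SAM update (Foret et al.) that the second objective leads to: a single normalized gradient-ascent step approximates the inner maximization, and the outer descent step uses the gradient taken at the perturbed weights. The PyTorch usage, function name, and default rho are my own assumptions, and this is the plain SAM baseline, not the Adam-like Bayesian extension described in the abstract.

```python
import torch

def sam_step(model, loss_fn, x, y, optimizer, rho=0.05):
    """One SAM update: ascend to an approximate worst-case perturbation
    within an L2 ball of radius rho, then descend with the gradient
    evaluated at the perturbed weights."""
    # First forward/backward pass: gradient at the current weights w.
    loss = loss_fn(model(x), y)
    loss.backward()

    # Perturb to w + eps, where eps = rho * grad / ||grad||.
    grads = [p.grad for p in model.parameters() if p.grad is not None]
    grad_norm = torch.norm(torch.stack([g.norm(p=2) for g in grads]), p=2)
    perturbed = []
    with torch.no_grad():
        for p in model.parameters():
            if p.grad is None:
                continue
            eps = rho * p.grad / (grad_norm + 1e-12)
            p.add_(eps)
            perturbed.append((p, eps))
    optimizer.zero_grad()

    # Second pass: gradient at the perturbed weights w + eps.
    loss_fn(model(x), y).backward()

    # Restore w, then apply the update using the perturbed gradient.
    with torch.no_grad():
        for p, eps in perturbed:
            p.sub_(eps)
    optimizer.step()
    optimizer.zero_grad()
    return loss.detach()
```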
Cite
Text
Möllenhoff and Khan. "SAM as an Optimal Relaxation of Bayes." International Conference on Learning Representations, 2023.

Markdown
[Möllenhoff and Khan. "SAM as an Optimal Relaxation of Bayes." International Conference on Learning Representations, 2023.](https://mlanthology.org/iclr/2023/mollenhoff2023iclr-sam/)

BibTeX
@inproceedings{mollenhoff2023iclr-sam,
  title     = {{SAM as an Optimal Relaxation of Bayes}},
  author    = {Möllenhoff, Thomas and Khan, Mohammad Emtiyaz},
  booktitle = {International Conference on Learning Representations},
  year      = {2023},
  url       = {https://mlanthology.org/iclr/2023/mollenhoff2023iclr-sam/}
}