Improving Generalization with Flat Hilbert Bayesian Inference

Abstract

We introduce Flat Hilbert Bayesian Inference (FHBI), an algorithm designed to enhance generalization in Bayesian inference. Our approach is an iterative two-step procedure consisting of an adversarial functional perturbation step and a functional descent step within a reproducing kernel Hilbert space. This methodology is supported by a theoretical analysis that extends previous findings on generalization ability from finite-dimensional Euclidean spaces to infinite-dimensional functional spaces. To evaluate the effectiveness of FHBI, we conduct comprehensive comparisons against nine baseline methods on the VTAB-1K benchmark, which encompasses 19 datasets spanning diverse domains and semantics. Empirical results demonstrate that FHBI consistently outperforms the baselines by notable margins, highlighting its practical efficacy.
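The perturb-then-descend structure described in the abstract can be illustrated with a rough sketch. The snippet below shows a SAM-style two-step update on a toy quadratic loss in ordinary parameter space; note that FHBI itself applies the perturbation functionally in an RKHS, and the hyperparameters `rho` (perturbation radius) and `lr` (step size) here are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hedged sketch of a two-step (sharpness-aware) update on a toy loss
# L(w) = 0.5 * ||w||^2. This is a finite-dimensional analogue only;
# FHBI performs the analogous steps on functions in an RKHS.

def loss_grad(w):
    # Gradient of the toy loss L(w) = 0.5 * ||w||^2.
    return w

def two_step_update(w, rho=0.05, lr=0.1):
    g = loss_grad(w)
    # Step 1: adversarial perturbation along the normalized gradient,
    # seeking the worst nearby point within radius rho.
    eps = rho * g / (np.linalg.norm(g) + 1e-12)
    # Step 2: descend using the gradient evaluated at the perturbed point.
    return w - lr * loss_grad(w + eps)

w = np.array([1.0, -2.0])
for _ in range(100):
    w = two_step_update(w)
# w is driven toward the minimum at the origin.
```

The key design point the abstract highlights is that both steps act on functional perturbations rather than weight-space ones, which is what the paper's RKHS analysis formalizes.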

Cite

Text

Truong et al. "Improving Generalization with Flat Hilbert Bayesian Inference." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Truong et al. "Improving Generalization with Flat Hilbert Bayesian Inference." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/truong2025icml-improving/)

BibTeX

@inproceedings{truong2025icml-improving,
  title     = {{Improving Generalization with Flat Hilbert Bayesian Inference}},
  author    = {Truong, Tuan and Tran, Quyen and Pham, Ngoc-Quan and Ho, Nhat and Phung, Dinh and Le, Trung},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {60218--60237},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/truong2025icml-improving/}
}