Sample-Conditioned Hypothesis Stability Sharpens Information-Theoretic Generalization Bounds
Abstract
We present new information-theoretic generalization guarantees through a novel construction of the "neighboring-hypothesis" matrix and a new family of stability notions termed sample-conditioned hypothesis (SCH) stability. Our approach yields sharper bounds that improve upon previous information-theoretic bounds in various learning scenarios. Notably, these bounds address the limitations of existing information-theoretic bounds in the context of stochastic convex optimization (SCO) problems, as identified in the recent work of Haghifam et al. (2023).
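For context, a minimal sketch of the classical mutual-information generalization bound (Xu and Raginsky, 2017) that results of this kind sharpen, stated under the standard assumption that the loss is σ-sub-Gaussian; the notation below is illustrative and not taken from the paper:

\[
\Bigl| \mathbb{E}\bigl[ L_\mu(W) - L_S(W) \bigr] \Bigr| \;\le\; \sqrt{\frac{2\sigma^2}{n}\, I(W; S)},
\]

where S denotes the training sample of size n, W the learned hypothesis, L_S the empirical risk, L_μ the population risk, and I(W; S) the mutual information between the hypothesis and the sample.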
Cite
Text
Wang and Mao. "Sample-Conditioned Hypothesis Stability Sharpens Information-Theoretic Generalization Bounds." Neural Information Processing Systems, 2023.

Markdown

[Wang and Mao. "Sample-Conditioned Hypothesis Stability Sharpens Information-Theoretic Generalization Bounds." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/wang2023neurips-sampleconditioned/)

BibTeX
@inproceedings{wang2023neurips-sampleconditioned,
  title     = {{Sample-Conditioned Hypothesis Stability Sharpens Information-Theoretic Generalization Bounds}},
  author    = {Wang, Ziqiao and Mao, Yongyi},
  booktitle = {Neural Information Processing Systems},
  year      = {2023},
  url       = {https://mlanthology.org/neurips/2023/wang2023neurips-sampleconditioned/}
}