Sample Complexity for Distributionally Robust Learning Under Chi-Square Divergence

Abstract

This paper investigates the sample complexity of learning a distributionally robust predictor under distribution shifts defined by the $\chi^2$-divergence, a divergence well known for its computational tractability and statistical properties. We show that any hypothesis class $\mathcal{H}$ with finite VC dimension is distributionally robustly learnable. Moreover, when the perturbation size is smaller than a constant, we show that finite VC dimension is also necessary for distributionally robust learning, by deriving a lower bound on the sample complexity in terms of the VC dimension.
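For readers unfamiliar with the setup, a minimal sketch of the standard $\chi^2$-divergence formulation of distributionally robust risk reads as follows (the paper's exact perturbation model may differ, and the symbols $P$, $Q$, $\rho$, and $\ell$ below are illustrative rather than taken from the abstract):

$$\mathcal{R}_\rho(h) \;=\; \sup_{Q \,:\, \chi^2(Q\|P)\le \rho} \mathbb{E}_{(x,y)\sim Q}\big[\ell(h(x),y)\big], \qquad \chi^2(Q\|P) \;=\; \int \Big(\tfrac{dQ}{dP}-1\Big)^2 \, dP,$$

where $P$ is the data-generating distribution, $\rho$ is the perturbation size, and $\ell$ is the loss. Distributionally robust learning then asks how many i.i.d. samples from $P$ suffice to find an $h \in \mathcal{H}$ whose robust risk is near-optimal.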

Cite

Text

Zhou and Liu. "Sample Complexity for Distributionally Robust Learning Under Chi-Square Divergence." Journal of Machine Learning Research, 2023.

Markdown

[Zhou and Liu. "Sample Complexity for Distributionally Robust Learning Under Chi-Square Divergence." Journal of Machine Learning Research, 2023.](https://mlanthology.org/jmlr/2023/zhou2023jmlr-sample/)

BibTeX

@article{zhou2023jmlr-sample,
  title     = {{Sample Complexity for Distributionally Robust Learning Under Chi-Square Divergence}},
  author    = {Zhou, Zhengyu and Liu, Weiwei},
  journal   = {Journal of Machine Learning Research},
  year      = {2023},
  pages     = {1--27},
  volume    = {24},
  url       = {https://mlanthology.org/jmlr/2023/zhou2023jmlr-sample/}
}