Open Problem: Information Complexity of VC Learning

Abstract

Uniform convergence approaches learning by studying the complexity of hypothesis classes. In particular, hypothesis classes with bounded Vapnik-Chervonenkis (VC) dimension exhibit strong uniform convergence, that is, the generalization error (the gap between empirical and true error) of every hypothesis in the class is small. On the other hand, a long line of work studies the information complexity of a learning algorithm, as it is connected to several desirable properties, including generalization. We ask whether every VC class admits a learner with low information complexity that achieves the generalization bounds guaranteed by uniform convergence. Specifically, since this is known to be impossible for proper and consistent learners when information complexity is measured by the mutual information between the input sample and the output hypothesis (Bassily et al., 2018), we are interested in learners with low information complexity measured in terms of the recently introduced notion of conditional mutual information, CMI (Steinke and Zakynthinou, 2020). Can we obtain tight bounds on the information complexity of a learning algorithm for a VC class (via CMI), thereby exactly recovering the known generalization bounds implied for this class by uniform convergence?
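
For context, the following is an informal sketch of the quantities involved, based on our reading of Steinke and Zakynthinou (2020); the notation (supersample \tilde{Z}, selector S, learner A, loss L) and the constants are paraphrased rather than quoted, so consult the paper for the precise statements.

% Setup: draw a supersample \tilde{Z} \sim \mathcal{D}^{n \times 2} of n pairs of i.i.d. examples
% and a uniform selector S \in \{0,1\}^n that picks one example from each pair; the learner A
% is trained on \tilde{Z}_S = (\tilde{Z}_{i,S_i})_{i=1}^n. The CMI of A is then
\mathrm{CMI}_{\mathcal{D}}(A) \;=\; I\big( A(\tilde{Z}_S) ;\, S \,\big|\, \tilde{Z} \big).
% CMI controls the expected generalization gap (population risk minus empirical risk):
\mathbb{E}\Big[ L_{\mathcal{D}}\big(A(\tilde{Z}_S)\big) - L_{\tilde{Z}_S}\big(A(\tilde{Z}_S)\big) \Big]
  \;\le\; \sqrt{ \tfrac{2\,\mathrm{CMI}_{\mathcal{D}}(A)}{n} }.
% For a class of VC dimension d, learners with CMI of order d \log n are known, giving a gap of
% order \sqrt{d \log n / n}, whereas uniform convergence gives order \sqrt{d / n}. The open
% problem asks whether CMI of order d is achievable, which would remove the extra \log n factor
% and exactly recover the uniform-convergence rate.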

Cite

Text

Steinke and Zakynthinou. "Open Problem: Information Complexity of VC Learning." Conference on Learning Theory, 2020.

Markdown

[Steinke and Zakynthinou. "Open Problem: Information Complexity of VC Learning." Conference on Learning Theory, 2020.](https://mlanthology.org/colt/2020/steinke2020colt-open/)

BibTeX

@inproceedings{steinke2020colt-open,
  title     = {{Open Problem: Information Complexity of VC Learning}},
  author    = {Steinke, Thomas and Zakynthinou, Lydia},
  booktitle = {Conference on Learning Theory},
  year      = {2020},
  pages     = {3857--3863},
  volume    = {125},
  url       = {https://mlanthology.org/colt/2020/steinke2020colt-open/}
}