Unlabelled Sample Compression Schemes for Intersection-Closed Classes and Extremal Classes

Abstract

The sample compressibility of concept classes plays an important role in learning theory, as a sufficient condition for PAC learnability, and more recently as an avenue for robust generalisation in adaptive data analysis. Whether compression schemes of size $O(d)$ must necessarily exist for all classes of VC dimension $d$ is unknown, but conjectured to be true by Warmuth. Recently, Chalopin, Chepoi, Moran, and Warmuth (2018) gave a beautiful unlabelled sample compression scheme of size equal to the VC dimension for all maximum classes: classes that meet the Sauer-Shelah-Perles Lemma with equality. They also offered a counterexample to compression schemes based on a promising approach known as corner peeling. In this paper we simplify and extend their proof technique to deal with so-called extremal classes of VC dimension $d$ which contain maximum classes of VC dimension $d-1$. A criterion is given which would imply that all extremal classes admit unlabelled compression schemes of size $d$. We also prove that all intersection-closed classes with VC dimension $d$ admit unlabelled compression schemes of size at most $11d$.
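
A clarifying note on the terminology used above (this note is not part of the paper's abstract): an unlabelled sample compression scheme of size $k$ maps any sample labelled consistently with a concept in the class to an unlabelled subset of at most $k$ of its points, from which a fixed reconstruction function recovers a hypothesis labelling the entire original sample correctly. The Sauer-Shelah-Perles Lemma bounds the size of a concept class $C \subseteq \{0,1\}^n$ of VC dimension $d$ by

$$|C| \;\le\; \sum_{i=0}^{d} \binom{n}{i},$$

and the maximum classes referenced in the abstract are precisely those attaining this bound with equality.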

Cite

Text

Rubinstein and Rubinstein. "Unlabelled Sample Compression Schemes for Intersection-Closed Classes and Extremal Classes." Neural Information Processing Systems, 2022.

Markdown

[Rubinstein and Rubinstein. "Unlabelled Sample Compression Schemes for Intersection-Closed Classes and Extremal Classes." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/rubinstein2022neurips-unlabelled/)

BibTeX

@inproceedings{rubinstein2022neurips-unlabelled,
  title     = {{Unlabelled Sample Compression Schemes for Intersection-Closed Classes and Extremal Classes}},
  author    = {Rubinstein, Joachim and Rubinstein, Benjamin I.},
  booktitle = {Neural Information Processing Systems},
  year      = {2022},
  url       = {https://mlanthology.org/neurips/2022/rubinstein2022neurips-unlabelled/}
}