Characterizations of Learnability for Classes of {0, ..., n}-Valued Functions

Abstract

We investigate the PAC learnability of classes of {0, …, n}-valued functions. For n = 1, it is known that the finiteness of the Vapnik-Chervonenkis dimension is necessary and sufficient for learning. In this paper we present a general scheme for extending the VC-dimension to the case n > 1. Our scheme defines a wide variety of notions of dimension in which several variants of the VC-dimension, previously introduced in the context of learning, appear as special cases. Our main result is a simple condition characterizing the set of notions of dimension whose finiteness is necessary and sufficient for learning. This provides a variety of new tools for determining the learnability of a class of multi-valued functions. Our characterization is also shown to hold in the “robust” variant of the PAC model.
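To make the n = 1 baseline concrete: the VC-dimension of a class of {0, 1}-valued functions is the size of the largest set of domain points on which the class realizes every possible labeling ("shatters" the set). The sketch below computes this by brute force for a finite class over a finite domain; the function names and the threshold-class example are illustrative choices, not from the paper.

```python
from itertools import combinations

def vc_dimension(domain, functions):
    """Brute-force VC dimension of a finite class of {0,1}-valued
    functions, each given as a tuple of values over `domain`."""
    best = 0
    for d in range(1, len(domain) + 1):
        # The class shatters some d-point set iff, for some choice of
        # d indices, all 2^d labelings appear among the functions.
        if any(
            len({tuple(f[i] for i in subset) for f in functions}) == 2 ** d
            for subset in combinations(range(len(domain)), d)
        ):
            best = d
        else:
            break
    return best

# Example: threshold functions f_t(x) = 1 iff x >= t on {0, 1, 2, 3}.
# They shatter any single point but no pair (the labeling (1, 0) with
# a 1 on the smaller point is never realized), so the VC dimension is 1.
domain = [0, 1, 2, 3]
functions = [tuple(1 if x >= t else 0 for x in domain) for t in range(5)]
print(vc_dimension(domain, functions))  # → 1
```

The paper's contribution concerns which generalizations of this quantity to {0, …, n}-valued classes (n > 1) still characterize learnability by their finiteness.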

Cite

Text

Ben-David et al. "Characterizations of Learnability for Classes of {0, ..., n}-Valued Functions." Annual Conference on Computational Learning Theory, 1992. doi:10.1145/130385.130423

Markdown

[Ben-David et al. "Characterizations of Learnability for Classes of {0, ..., n}-Valued Functions." Annual Conference on Computational Learning Theory, 1992.](https://mlanthology.org/colt/1992/bendavid1992colt-characterizations/) doi:10.1145/130385.130423

BibTeX

@inproceedings{bendavid1992colt-characterizations,
  title     = {{Characterizations of Learnability for Classes of \{0, ..., n\}-Valued Functions}},
  author    = {Ben-David, Shai and Cesa-Bianchi, Nicolò and Long, Philip M.},
  booktitle = {Annual Conference on Computational Learning Theory},
  year      = {1992},
  pages     = {333--340},
  doi       = {10.1145/130385.130423},
  url       = {https://mlanthology.org/colt/1992/bendavid1992colt-characterizations/}
}