A General Lower Bound on the Number of Examples Needed for Learning
Abstract
We prove a lower bound of Ω((1/ε) ln(1/δ) + VCdim(C)/ε) on the number of random examples required for distribution-free learning of a concept class C, where VCdim(C) is the Vapnik-Chervonenkis dimension and ε and δ are the accuracy and confidence parameters. This improves the previous best lower bound of Ω((1/ε) ln(1/δ) + VCdim(C)) and comes close to the known general upper bound of O((1/ε) ln(1/δ) + (VCdim(C)/ε) ln(1/ε)) for consistent algorithms. We show that for many interesting concept classes, including kCNF and kDNF, our bound is actually tight to within a constant factor.
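A minimal sketch (not from the paper) that evaluates the shapes of the lower and upper bounds numerically for given ε, δ, and VC dimension. The function name `sample_complexity_bounds` is a hypothetical label, and the constant factors are taken to be 1, since the Ω and O notation in the abstract suppresses them; the outputs indicate growth rates, not exact example counts.

```python
import math

def sample_complexity_bounds(epsilon: float, delta: float, vc_dim: int):
    """Evaluate the two bound shapes from the abstract.

    Lower bound: Omega((1/eps) ln(1/delta) + VCdim/eps)
    Upper bound: O((1/eps) ln(1/delta) + (VCdim/eps) ln(1/eps))

    Constant factors hidden by Omega/O are assumed to be 1 here,
    so these values show growth rates only.
    """
    common = (1.0 / epsilon) * math.log(1.0 / delta)
    lower = common + vc_dim / epsilon
    upper = common + (vc_dim / epsilon) * math.log(1.0 / epsilon)
    return lower, upper

# Example: accuracy eps = 0.05, confidence delta = 0.01, VCdim = 10.
lo, hi = sample_complexity_bounds(0.05, 0.01, 10)
print(f"lower-bound shape: {lo:,.0f} examples")
print(f"upper-bound shape: {hi:,.0f} examples")
```

Note how the two shapes differ only in the ln(1/ε) factor on the VCdim term, which is the gap the paper narrows from below.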
Cite
Text
Ehrenfeucht et al. "A General Lower Bound on the Number of Examples Needed for Learning." Annual Conference on Computational Learning Theory, 1988. doi:10.1016/0890-5401(89)90002-3
Markdown
[Ehrenfeucht et al. "A General Lower Bound on the Number of Examples Needed for Learning." Annual Conference on Computational Learning Theory, 1988.](https://mlanthology.org/colt/1988/ehrenfeucht1988colt-general/) doi:10.1016/0890-5401(89)90002-3
BibTeX
@inproceedings{ehrenfeucht1988colt-general,
title = {{A General Lower Bound on the Number of Examples Needed for Learning}},
author = {Ehrenfeucht, Andrzej and Haussler, David and Kearns, Michael J. and Valiant, Leslie G.},
booktitle = {Annual Conference on Computational Learning Theory},
year = {1988},
pages = {139-154},
doi = {10.1016/0890-5401(89)90002-3},
url = {https://mlanthology.org/colt/1988/ehrenfeucht1988colt-general/}
}