Bounds for the Minimum Disagreement Problem with Applications to Learning Theory
Abstract
Many studies in the literature address minimum disagreement problems and their connection to agnostic learning and learning with malicious errors. We further study these problems and some extensions of them. The classes studied in the literature are monomials, monotone monomials, antimonotone monomials, decision lists, halfspaces, neural networks, and balls. For some of these classes we improve on the best previously known factors for approximating the minimum disagreement. We also find new bounds for exclusive-or, k-term DNF, k-DNF, and multivariate polynomials (XOR of monomials). We then apply these and other results from the literature to agnostic learning, and give positive and negative results for agnostic learning and PAC learning with malicious errors of the above classes.
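To make the central object concrete: the minimum disagreement problem asks, given a labeled sample and a concept class, for the hypothesis in the class that mislabels the fewest examples. The sketch below is an illustration only (not from the paper): a brute-force solver for the class of monotone monomials over n Boolean variables, which the paper studies among others. The sample data is a made-up example.

```python
from itertools import chain, combinations

def disagreements(hypothesis, sample):
    """Count examples in `sample` that `hypothesis` mislabels."""
    return sum(1 for x, y in sample if hypothesis(x) != y)

def min_disagreement_monotone_monomial(sample, n):
    """Brute-force minimum disagreement over monotone monomials on n
    Boolean variables (including the empty monomial, the constant 1).
    Runs in O(2^n * |sample|) time, so only feasible for tiny n; the
    paper's point is that approximating this minimum efficiently is hard."""
    best = len(sample) + 1
    best_vars = None
    # Enumerate every subset of variables as a candidate monomial.
    for vars_ in chain.from_iterable(
        combinations(range(n), k) for k in range(n + 1)
    ):
        h = lambda x, v=vars_: all(x[i] for i in v)
        d = disagreements(h, sample)
        if d < best:
            best, best_vars = d, vars_
    return best, best_vars

# Toy sample over 3 variables; the two conflicting copies of (1,1,0)
# force at least one disagreement for every hypothesis.
sample = [((1, 1, 0), 1), ((1, 0, 0), 0), ((0, 1, 1), 0),
          ((1, 1, 1), 1), ((1, 1, 0), 0)]
```

Here the monomial x0 ∧ x1 achieves the minimum of one disagreement; no hypothesis can do better because the sample is not consistent with any single labeling of (1, 1, 0).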
Cite
Text
Bshouty and Burroughs. "Bounds for the Minimum Disagreement Problem with Applications to Learning Theory." Annual Conference on Computational Learning Theory, 2002. doi:10.1007/3-540-45435-7_19
Markdown
[Bshouty and Burroughs. "Bounds for the Minimum Disagreement Problem with Applications to Learning Theory." Annual Conference on Computational Learning Theory, 2002.](https://mlanthology.org/colt/2002/bshouty2002colt-bounds/) doi:10.1007/3-540-45435-7_19
BibTeX
@inproceedings{bshouty2002colt-bounds,
title = {{Bounds for the Minimum Disagreement Problem with Applications to Learning Theory}},
author = {Bshouty, Nader H. and Burroughs, Lynn},
booktitle = {Annual Conference on Computational Learning Theory},
year = {2002},
pages = {271-286},
doi = {10.1007/3-540-45435-7_19},
url = {https://mlanthology.org/colt/2002/bshouty2002colt-bounds/}
}