Redundant Noisy Attributes, Attribute Errors, and Linear-Threshold Learning Using Winnow
Abstract
First, we study the performance of the linear-threshold algorithm, Winnow, in the presence of attribute errors in the data available for learning. We study how such errors affect bounds on the number of mistakes that Winnow will make. We allow the errors to be generated by an adversary. In the presence of such errors the bounds for Winnow retain the logarithmic dependence on the number of relevant attributes that they have in the noise-free case. There is an additional additive term proportional to a weighted sum of the number of errors occurring in each relevant attribute during a sequence of trials. We next examine probabilistic mistake bounds that can be obtained by making stronger assumptions about the instances seen by the learner. We are particularly interested in models of noisy redundant information. The situation that we consider is motivated by the case where there are two prototypical instances, each described by a number of attributes, one prototype with classification 1 and the other with classification 0. The instances seen by the learner are noisy versions of these prototypes. The noise affecting irrelevant attributes can be arbitrary (even generated by an adversary), but we assume that the relevant attributes are affected by random noise independently of one another. We obtain bounds (holding with high probability) that grow in proportion to the number of trials. In a typical case the constant of proportionality that we obtain decreases exponentially with the number of independent relevant attributes.
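For readers unfamiliar with the algorithm the abstract refers to, the following is a minimal sketch of a Winnow-style multiplicative update (the Winnow2 variant, which both promotes and demotes weights). The parameter choices α = 2 and θ = n/2, and the toy disjunction target, are illustrative assumptions, not taken from this paper, and the sketch does not model the attribute-error or redundant-prototype settings the paper analyzes.

```python
import random

def winnow2(examples, n, alpha=2.0, theta=None):
    """Winnow2 sketch: a linear-threshold learner with multiplicative
    weight updates, applied only when a prediction mistake is made."""
    if theta is None:
        theta = n / 2.0          # illustrative threshold choice
    w = [1.0] * n                # all weights start at 1
    mistakes = 0
    for x, y in examples:
        # Predict 1 iff the weighted sum of active attributes reaches theta.
        y_hat = 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0
        if y_hat != y:
            mistakes += 1
            if y == 1:           # promotion: boost weights of active attributes
                w = [wi * alpha if xi else wi for wi, xi in zip(w, x)]
            else:                # demotion: shrink weights of active attributes
                w = [wi / alpha if xi else wi for wi, xi in zip(w, x)]
    return mistakes, w

# Toy target: the monotone disjunction x[0] OR x[1] over n = 20 attributes,
# so only 2 of the 20 attributes are relevant.
random.seed(0)
n = 20
examples = []
for _ in range(500):
    x = [random.randint(0, 1) for _ in range(n)]
    examples.append((x, 1 if (x[0] or x[1]) else 0))

mistakes, w = winnow2(examples, n)
```

The point of the multiplicative update is the logarithmic dependence mentioned in the abstract: in the noise-free case the number of mistakes grows with the log of the total number of attributes, so the mistake count here stays far below the 500 trials.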
Cite
Text
Littlestone. "Redundant Noisy Attributes, Attribute Errors, and Linear-Threshold Learning Using Winnow." Annual Conference on Computational Learning Theory, 1991. doi:10.1016/B978-1-55860-213-7.50017-1
Markdown
[Littlestone. "Redundant Noisy Attributes, Attribute Errors, and Linear-Threshold Learning Using Winnow." Annual Conference on Computational Learning Theory, 1991.](https://mlanthology.org/colt/1991/littlestone1991colt-redundant/) doi:10.1016/B978-1-55860-213-7.50017-1
BibTeX
@inproceedings{littlestone1991colt-redundant,
title = {{Redundant Noisy Attributes, Attribute Errors, and Linear-Threshold Learning Using Winnow}},
author = {Littlestone, Nick},
booktitle = {Annual Conference on Computational Learning Theory},
year = {1991},
pages = {147-156},
doi = {10.1016/B978-1-55860-213-7.50017-1},
url = {https://mlanthology.org/colt/1991/littlestone1991colt-redundant/}
}