Vacillatory and BC Learning on Noisy Data

Abstract

In an earlier paper, Frank Stephan introduced a form of noisy data which nonetheless uniquely determines the true data: correct information occurs infinitely often, while incorrect information occurs only finitely often. The present paper considers the effects of this form of noise on vacillatory and behaviorally correct learning of grammars — both from positive data alone and from informant (positive and negative data). For learning from informant, the noise, in effect, destroys the negative data. Various noisy-data hierarchies are exhibited which, in some cases, are known to collapse when there is no noise. Noisy behaviorally correct learning is shown to obey a very strong “subset principle”. It is shown, in many cases, how much power is needed to overcome the effects of noise. For example, the best one can do to simulate, in the presence of noise, the noise-free, no-mind-change case requires infinitely many mind changes. One technical result is proved by a priority argument.
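The noise model above can be illustrated with a small sketch (not from the paper; `noisy_text`, its parameters, and the concrete values are all hypothetical): every element of the true language recurs forever in the presentation, while each incorrect datum appears only before some finite cutoff, so the true language is exactly the set of elements occurring infinitely often.

```python
from itertools import count, islice

def noisy_text(language, noise, noise_cutoff):
    """Yield a noisy presentation of `language` (a finite list standing in
    for a language's enumeration): each correct element recurs infinitely
    often, while noise elements appear only at steps < `noise_cutoff`,
    hence only finitely often.  Illustrative sketch, not the paper's model
    in full generality."""
    for t in count():
        if t < noise_cutoff and t % 2 == 1:
            # an incorrect datum -- occurs only finitely often
            yield noise[(t // 2) % len(noise)]
        else:
            # a correct datum -- every language element recurs forever
            yield language[t % len(language)]

# A learner inspecting any prefix past the (unknown) cutoff sees only
# correct data; here we check that on a finite prefix.
prefix = list(islice(noisy_text([0, 2, 4], [1, 3], 10), 100))
print(set(prefix))       # early prefix mixes in the noise elements
print(set(prefix[10:]))  # past the cutoff, only the true language remains
```

Of course, a real learner never knows the cutoff, which is precisely what makes identifying the infinitely-recurring elements nontrivial.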

Cite

Text

Case et al. "Vacillatory and BC Learning on Noisy Data." International Conference on Algorithmic Learning Theory, 1996. doi:10.1007/3-540-61863-5_53

Markdown

[Case et al. "Vacillatory and BC Learning on Noisy Data." International Conference on Algorithmic Learning Theory, 1996.](https://mlanthology.org/alt/1996/case1996alt-vacillatory/) doi:10.1007/3-540-61863-5_53

BibTeX

@inproceedings{case1996alt-vacillatory,
  title     = {{Vacillatory and BC Learning on Noisy Data}},
  author    = {Case, John and Jain, Sanjay and Stephan, Frank},
  booktitle = {International Conference on Algorithmic Learning Theory},
  year      = {1996},
  pages     = {285--298},
  doi       = {10.1007/3-540-61863-5_53},
  url       = {https://mlanthology.org/alt/1996/case1996alt-vacillatory/}
}