Inductive Learning from Good Examples
Abstract
We study what kinds of data can ease the computational complexity of learning Horn clause theories (in Gold's paradigm) and Boolean functions (in the PAC-learning paradigm). We give several definitions of good data (basic and generative representative sets) and develop data-driven algorithms that learn faster from good examples but degenerate to learning in the limit from the "worst" possible examples. We show that Horn clause theories, k-term DNF, and general DNF Boolean functions are polynomially learnable from generative representative presentations.

1 Introduction

In any inductive learning model, how the data of the target theory are supplied to the learning program is a crucial assumption. Identification in the limit [Gold, 1967] assumes that the series of examples is an admissible enumeration of all (positive and/or negative) examples of the target concept, and requires the learning algorithm to produce a correct hypothesis in some finite time. However, the computational time and th...
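Identification in the limit can be made concrete with the classic learning-by-enumeration scheme: fix an enumeration of all hypotheses in the class and, after each example, output the first hypothesis consistent with everything seen so far. The sketch below is an illustrative toy, not from the paper; it assumes the class of monotone conjunctions over three variables and a stream that eventually presents every labeled assignment.

```python
# A toy identification-in-the-limit learner (learning by enumeration).
# Assumptions (not from the paper): the class is all monotone
# conjunctions over n variables, each represented as the set of
# variable indices that must be 1; examples arrive as (assignment,
# label) pairs and eventually exhaust all assignments.

from itertools import chain, combinations, product

n = 3
# Enumerate every hypothesis in the class, smallest first.
hypotheses = [frozenset(s) for s in
              chain.from_iterable(combinations(range(n), k)
                                  for k in range(n + 1))]

def consistent(h, data):
    """A conjunction h fits (x, y) iff all(x[i] for i in h) == y."""
    return all(all(x[i] for i in h) == y for x, y in data)

target = frozenset({0, 2})  # hypothetical target concept: x0 AND x2
stream = [(x, all(x[i] for i in target)) for x in product((0, 1), repeat=n)]

seen = []
for example in stream:
    seen.append(example)
    # Output the first enumerated hypothesis consistent with the data
    # so far; once the data rule out every earlier hypothesis, the
    # guess stabilizes on the target and never changes again.
    guess = next(h for h in hypotheses if consistent(h, seen))
    print(sorted(guess), "after", len(seen), "examples")
```

Correctness in the limit comes cheap here; the point of the paper is that the *time* to converge depends heavily on which examples arrive, which motivates the study of good examples.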
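The abstract's central claim, that the right examples can collapse the complexity of DNF learning, is easiest to see on a toy case. The sketch below is not the paper's generative representative presentation: it assumes a monotone DNF target and a teacher who supplies one minimal positive example per term (the term's minimal satisfying assignment), from which each term can be read off in a single pass.

```python
# A toy illustration (not the paper's actual algorithm) of learning
# from "good" examples. Assumption: the target is a MONOTONE DNF, and
# the teacher gives one minimal positive example per term, so each
# term is exactly the set of variables its example sets to 1.

from itertools import product

def term_from_minimal_example(example):
    """Recover a monotone term as the set of indices set to 1."""
    return frozenset(i for i, bit in enumerate(example) if bit == 1)

def learn_monotone_dnf(good_examples):
    """One pass over the representative examples yields the hypothesis."""
    return {term_from_minimal_example(e) for e in good_examples}

def evaluate_dnf(terms, assignment):
    """A DNF is true iff some term has all its variables set to 1."""
    return any(all(assignment[i] for i in term) for term in terms)

# Hypothetical target: x0*x1 + x2 over 3 variables. The minimal
# satisfying assignments of its terms are (1,1,0) and (0,0,1).
good_examples = [(1, 1, 0), (0, 0, 1)]
hypothesis = learn_monotone_dnf(good_examples)

# Verify the hypothesis against the target on the full truth table.
target = lambda a: (a[0] and a[1]) or a[2]
assert all(evaluate_dnf(hypothesis, a) == bool(target(a))
           for a in product((0, 1), repeat=3))
print("hypothesis terms:", sorted(map(sorted, hypothesis)))
```

Under these assumptions the learner runs in time linear in the size of the presentation, while learning general DNF from arbitrary examples is believed to be hard; the paper's representative sets formalize what such a helpful presentation must contain.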
Cite

Ling, Charles X. "Inductive Learning from Good Examples." International Joint Conference on Artificial Intelligence, 1991, pp. 751-756. https://mlanthology.org/ijcai/1991/ling1991ijcai-inductive/

BibTeX
@inproceedings{ling1991ijcai-inductive,
title = {{Inductive Learning from Good Examples}},
author = {Ling, Charles X.},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {1991},
pages = {751-756},
url = {https://mlanthology.org/ijcai/1991/ling1991ijcai-inductive/}
}