Learning Recursive Relations with Randomly Selected Small Training Sets
Abstract
We evaluate CRUSTACEAN, an inductive logic programming algorithm that uses inverse implication to induce recursive clauses from examples. This approach is well suited for learning a class of self-recursive clauses, which commonly appear in logic programs, because it searches for common substructures among the examples. However, little evidence exists that inverse implication approaches perform well when given only randomly selected positive and negative examples. We show that CRUSTACEAN learns recursive relations with higher accuracies than GOLEM, yet with reasonable efficiency. We also demonstrate that increasing the number of randomly selected positive and negative examples increases its accuracy on randomly selected test examples, increases the frequency with which it outputs the target relation, and reduces the number of outputs it produces. We also prove a theorem that defines the class of logic programs for which our approach is complete.
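The abstract's "common substructures among the examples" idea can be illustrated with a toy sketch. This is not the paper's CRUSTACEAN algorithm; it is a hypothetical Python illustration of how a self-recursive clause such as `last([H|T], X) :- last(T, X)` can be suggested when reducing each positive example (here, dropping the list head) yields another positive example:

```python
def suggests_self_recursion(examples):
    """Toy check (not CRUSTACEAN): examples is a set of
    (input_tuple, answer) positive examples for a relation like last/2.
    Returns True if every reducible example's tail is itself an example,
    suggesting the self-recursive clause last([H|T], X) :- last(T, X)."""
    reducible = [(lst, ans) for lst, ans in examples if len(lst) > 1]
    # Each reducible example must share substructure with a smaller example.
    return bool(reducible) and all(
        (lst[1:], ans) in examples for lst, ans in reducible
    )

# Positive examples of last/2: the last element of each list.
examples = {
    (('a', 'b', 'c'), 'c'),
    (('b', 'c'), 'c'),
    (('c',), 'c'),
}
print(suggests_self_recursion(examples))  # True
```

The real algorithm works over first-order terms and inverse implication rather than this hand-coded list reduction, but the sketch shows the core intuition: recursive structure is detected by matching examples against substructures of other examples.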
Cite
Text
Aha et al. "Learning Recursive Relations with Randomly Selected Small Training Sets." International Conference on Machine Learning, 1994. doi:10.1016/B978-1-55860-335-6.50010-6
Markdown
[Aha et al. "Learning Recursive Relations with Randomly Selected Small Training Sets." International Conference on Machine Learning, 1994.](https://mlanthology.org/icml/1994/aha1994icml-learning/) doi:10.1016/B978-1-55860-335-6.50010-6
BibTeX
@inproceedings{aha1994icml-learning,
title = {{Learning Recursive Relations with Randomly Selected Small Training Sets}},
author = {Aha, David W. and Lapointe, Stephane and Ling, Charles X. and Matwin, Stan},
booktitle = {International Conference on Machine Learning},
year = {1994},
pages = {12-18},
doi = {10.1016/B978-1-55860-335-6.50010-6},
url = {https://mlanthology.org/icml/1994/aha1994icml-learning/}
}