Learning Non-Linearly Separable Boolean Functions with Linear Threshold Unit Trees and Madaline-Style Networks
Abstract
This paper investigates an algorithm for the construction of decision trees composed of linear threshold units, and also presents a novel algorithm for learning non-linearly separable Boolean functions using Madaline-style networks that are isomorphic to decision trees. The construction of such networks is discussed, and their learning performance is compared with standard Back-Propagation on a sample problem into which many irrelevant attributes are introduced. Littlestone's Winnow algorithm is also explored within this architecture as a means of learning in the presence of many irrelevant attributes. The learning ability of this Madaline-style architecture on non-optimal (larger than necessary) networks is also explored.
Introduction
We initially examine a non-incremental algorithm that learns binary classification tasks by producing decision trees of linear threshold units (LTU trees). This decision tree bears some similarity to the decision trees produced by ID3 (Quinlan 19...
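The abstract highlights Winnow's robustness to irrelevant attributes. As a rough illustration of why, here is a generic sketch of the standard Winnow2 update rule (multiplicative promotion and demotion on mistakes), not the paper's network-embedded variant, learning a two-literal disjunction over 8 attributes, 6 of them irrelevant; the function name and parameter choices are illustrative assumptions:

```python
from itertools import product

def winnow_train(examples, n, alpha=2.0, passes=50):
    """Winnow2 sketch: x is a 0/1 tuple of length n, label y in {0, 1}."""
    w = [1.0] * n            # all weights start at 1
    theta = float(n)         # standard threshold choice, theta = n
    for _ in range(passes):
        for x, y in examples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0
            if pred == y:
                continue
            if y == 1:       # mistake on a positive: promote active weights
                w = [wi * alpha if xi else wi for wi, xi in zip(w, x)]
            else:            # mistake on a negative: demote active weights
                w = [wi / alpha if xi else wi for wi, xi in zip(w, x)]
    return w, theta

# Target concept: x0 OR x1; attributes x2..x7 are irrelevant.
examples = [(x, int(x[0] or x[1])) for x in product((0, 1), repeat=8)]
w, theta = winnow_train(examples, n=8)
```

Because the updates are multiplicative, relevant weights climb geometrically toward the threshold while irrelevant weights stay small, which is why Winnow's mistake bound grows only logarithmically in the number of irrelevant attributes.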
Cite
Text
Sahami. "Learning Non-Linearly Separable Boolean Functions with Linear Threshold Unit Trees and Madaline-Style Networks." AAAI Conference on Artificial Intelligence, 1993.
Markdown
[Sahami. "Learning Non-Linearly Separable Boolean Functions with Linear Threshold Unit Trees and Madaline-Style Networks." AAAI Conference on Artificial Intelligence, 1993.](https://mlanthology.org/aaai/1993/sahami1993aaai-learning/)
BibTeX
@inproceedings{sahami1993aaai-learning,
title = {{Learning Non-Linearly Separable Boolean Functions with Linear Threshold Unit Trees and Madaline-Style Networks}},
author = {Sahami, Mehran},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {1993},
  pages = {335--341},
url = {https://mlanthology.org/aaai/1993/sahami1993aaai-learning/}
}