Oblivious Decision Trees, Graphs, and Top-Down Pruning
Abstract
We describe a supervised learning algorithm, EODG, that uses mutual information to build an oblivious decision tree. The tree is then converted to an Oblivious read-Once Decision Graph (OODG) by merging nodes at the same level of the tree. For domains that are appropriate for both decision trees and OODGs, performance is approximately the same as that of C4.5, but the number of nodes in the OODG is much smaller. The merging phase that converts the oblivious decision tree to an OODG provides a new way of dealing with the replication problem and a new pruning mechanism that works top down, starting from the root. The pruning mechanism is well suited for finding symmetries and aids in recovering from splits on irrelevant features that may happen during tree construction.

1 Introduction

Decision trees provide a hypothesis space for supervised machine learning algorithms that is well suited for many datasets encountered in the real world (Breiman, Friedman, Olshen & Stone 1984, Quinlan...
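The oblivious constraint means every node at a given level of the tree tests the same feature, so level construction reduces to ranking features against the label. The sketch below is ours, not the paper's code, and it simplifies EODG: it ranks features by marginal mutual information with the label, whereas the algorithm described in the paper scores each candidate split in the context of the tree built so far.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information I(X; Y) in bits between two discrete sequences."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        mi += p_joint * math.log2(p_joint / ((px[x] / n) * (py[y] / n)))
    return mi

def oblivious_feature_order(rows, labels, n_levels):
    """Greedily pick one feature per level. Because the tree is oblivious,
    a single feature serves all nodes of a level, so we only need an ordering."""
    remaining = list(range(len(rows[0])))
    order = []
    for _ in range(min(n_levels, len(remaining))):
        best = max(remaining,
                   key=lambda f: mutual_information([r[f] for r in rows], labels))
        order.append(best)
        remaining.remove(best)
    return order

# Toy data: feature 0 determines the label, feature 1 is noise,
# so feature 0 is chosen for the first level.
rows = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 0, 1, 1]
print(oblivious_feature_order(rows, labels, 2))  # [0, 1]
```

Selecting a single test per level is what later makes the merging phase cheap: nodes at the same level that induce the same subfunction can be collapsed into one OODG node.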
Cite
Text
Kohavi and Li. "Oblivious Decision Trees, Graphs, and Top-Down Pruning." International Joint Conference on Artificial Intelligence, 1995.

Markdown
[Kohavi and Li. "Oblivious Decision Trees, Graphs, and Top-Down Pruning." International Joint Conference on Artificial Intelligence, 1995.](https://mlanthology.org/ijcai/1995/kohavi1995ijcai-oblivious/)

BibTeX
@inproceedings{kohavi1995ijcai-oblivious,
title = {{Oblivious Decision Trees, Graphs, and Top-Down Pruning}},
author = {Kohavi, Ron and Li, Chia-Hsin},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {1995},
  pages = {1071--1079},
url = {https://mlanthology.org/ijcai/1995/kohavi1995ijcai-oblivious/}
}