OC1: A Randomized Induction of Oblique Decision Trees
Abstract
This paper introduces OC1, a new algorithm for generating multivariate decision trees. Multivariate trees classify examples by testing linear combinations of the features at each non-leaf node of the tree. Each test is equivalent to a hyperplane at an oblique orientation to the axes. Because of the computational intractability of finding an optimal orientation for these hyperplanes, heuristic methods must be used to produce good trees. This paper explores a new method that combines deterministic and randomized procedures to search for a good tree. Experiments on several different real-world data sets demonstrate that the method consistently finds much smaller trees than comparable methods using univariate tests. In addition, the accuracy of the trees found with our method matches or exceeds the best results of other machine learning methods.

1 Introduction

Decision trees (DTs) have been used quite extensively in the machine learning literature for a wide range of classification probl...
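The oblique test described in the abstract can be sketched in code. The example below is illustrative only, not the OC1 induction procedure; the class and field names (`ObliqueNode`, `w`, `b`) are hypothetical. It shows how each non-leaf node evaluates a linear combination of the features, i.e. a hyperplane w·x + b > 0 at an oblique orientation to the axes, with an axis-parallel (univariate) test as the special case where w has a single nonzero entry.

```python
import numpy as np

class ObliqueNode:
    """One non-leaf node of a multivariate decision tree.

    The node tests a linear combination of the features:
    w . x + b > 0, i.e. a hyperplane at an oblique orientation
    to the feature axes.
    """
    def __init__(self, w, b, left, right):
        self.w = np.asarray(w, dtype=float)
        self.b = float(b)
        # Children may be further ObliqueNodes or leaf class labels.
        self.left, self.right = left, right

    def classify(self, x):
        # Route the example down the tree by the sign of the test.
        child = self.left if self.w @ x + self.b > 0 else self.right
        return child.classify(x) if isinstance(child, ObliqueNode) else child

# A single oblique split, x1 + x2 > 1, separating two classes:
root = ObliqueNode(w=[1.0, 1.0], b=-1.0, left="A", right="B")
print(root.classify(np.array([0.9, 0.9])))  # -> A  (0.9 + 0.9 - 1 = 0.8 > 0)
print(root.classify(np.array([0.2, 0.3])))  # -> B  (0.2 + 0.3 - 1 = -0.5 <= 0)
```

Note that a univariate tree would need several axis-parallel splits to approximate this single oblique boundary, which is why oblique trees are often much smaller.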
Cite

Murthy, Sreerama K., Simon Kasif, Steven Salzberg, and Richard Beigel. "OC1: A Randomized Induction of Oblique Decision Trees." In AAAI Conference on Artificial Intelligence, 1993, pp. 322-327. https://mlanthology.org/aaai/1993/murthy1993aaai-oc/

BibTeX:
@inproceedings{murthy1993aaai-oc,
title = {{OC1: A Randomized Induction of Oblique Decision Trees}},
author = {Murthy, Sreerama K. and Kasif, Simon and Salzberg, Steven and Beigel, Richard},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {1993},
  pages = {322--327},
url = {https://mlanthology.org/aaai/1993/murthy1993aaai-oc/}
}