A System for Induction of Oblique Decision Trees
Abstract
This article describes a new system for induction of oblique decision trees. This system, OC1, combines deterministic hill-climbing with two forms of randomization to find a good oblique split (in the form of a hyperplane) at each node of a decision tree. Oblique decision tree methods are tuned especially for domains in which the attributes are numeric, although they can be adapted to symbolic or mixed symbolic/numeric attributes. We present extensive empirical studies, using both real and artificial data, that analyze OC1's ability to construct oblique trees that are smaller and more accurate than their axis-parallel counterparts. We also examine the benefits of randomization for the construction of oblique decision trees.
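
The abstract's core idea — searching for a single hyperplane split at each node by hill-climbing with injected randomness — can be illustrated with a small sketch. The code below is not the authors' OC1 implementation (OC1 is a C system that perturbs one hyperplane coefficient at a time and supports several impurity measures); it is a simplified Python illustration assuming a Gini impurity criterion, random restarts, and random coordinate perturbations, with hypothetical names such as oblique_split.

# Minimal sketch (not OC1 itself): hill-climb on one hyperplane coefficient at a
# time, using random restarts as a simple stand-in for OC1's randomization.
import numpy as np

def gini(labels):
    # Gini impurity of a label array; 0.0 for an empty partition.
    if len(labels) == 0:
        return 0.0
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def split_impurity(X, y, w):
    # Weighted impurity of the split induced by hyperplane w (last entry is the bias term).
    side = X @ w[:-1] + w[-1] > 0
    n = len(y)
    return (side.sum() * gini(y[side]) + (~side).sum() * gini(y[~side])) / n

def oblique_split(X, y, restarts=5, steps=200, seed=None):
    # Search for a low-impurity oblique split by perturbing one coefficient at a
    # time, starting from several random hyperplanes (hypothetical helper).
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    best_w, best_imp = None, np.inf
    for _ in range(restarts):
        w = rng.normal(size=d + 1)           # random initial hyperplane
        imp = split_impurity(X, y, w)
        for _ in range(steps):
            i = rng.integers(d + 1)          # pick one coefficient to perturb
            cand = w.copy()
            cand[i] += rng.normal(scale=0.5)
            cand_imp = split_impurity(X, y, cand)
            if cand_imp < imp:               # keep only improving moves
                w, imp = cand, cand_imp
        if imp < best_imp:
            best_w, best_imp = w, imp
    return best_w, best_imp

Calling oblique_split(X, y) on a numeric feature matrix X and label vector y returns the best hyperplane found and its impurity; a full tree builder would apply such a search recursively to each induced partition.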
Cite
Text
Murthy et al. "A System for Induction of Oblique Decision Trees." Journal of Artificial Intelligence Research, 1994. doi:10.1613/JAIR.63

Markdown
[Murthy et al. "A System for Induction of Oblique Decision Trees." Journal of Artificial Intelligence Research, 1994.](https://mlanthology.org/jair/1994/murthy1994jair-system/) doi:10.1613/JAIR.63

BibTeX
@article{murthy1994jair-system,
title = {{A System for Induction of Oblique Decision Trees}},
author = {Murthy, Sreerama K. and Kasif, Simon and Salzberg, Steven},
journal = {Journal of Artificial Intelligence Research},
year = {1994},
pages = {1-32},
doi = {10.1613/JAIR.63},
volume = {2},
url = {https://mlanthology.org/jair/1994/murthy1994jair-system/}
}