Multiclass Alternating Decision Trees

Abstract

The alternating decision tree (ADTree) is a successful classification technique that combines decision trees with the predictive accuracy of boosting into a set of interpretable classification rules. The original formulation of the tree induction algorithm restricted attention to binary classification problems. This paper empirically evaluates several wrapper methods for extending the algorithm to the multiclass case by splitting the problem into several two-class problems. Seeking a more natural solution, we then adapt the multiclass LogitBoost and AdaBoost.MH procedures to induce alternating decision trees directly. Experimental results confirm that these procedures are comparable in accuracy with wrapper methods based on the original ADTree formulation, while inducing much smaller trees.
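The simplest wrapper method evaluated here reduces a multiclass problem to several two-class problems: train one binary model per class (that class against the rest) and predict the label whose model scores highest. The following is a minimal sketch of that one-vs-rest scheme; the `CentroidStump` scorer is a hypothetical stand-in for a real two-class ADTree learner, which the paper's actual experiments use.

```python
class CentroidStump:
    """Toy binary scorer standing in for a two-class ADTree.

    Higher score means the instance lies closer to the positive
    class centroid than to the negative one.
    """

    def fit(self, X, y):
        pos = [x for x, t in zip(X, y) if t == 1]
        neg = [x for x, t in zip(X, y) if t == 0]
        self.pos_c = [sum(col) / len(pos) for col in zip(*pos)]
        self.neg_c = [sum(col) / len(neg) for col in zip(*neg)]
        return self

    def score(self, x):
        d_pos = sum((a - b) ** 2 for a, b in zip(x, self.pos_c))
        d_neg = sum((a - b) ** 2 for a, b in zip(x, self.neg_c))
        return d_neg - d_pos  # positive when x is nearer the positive centroid


def one_vs_rest_fit(X, y, make_learner):
    """Train one binary model per class label (class vs. rest)."""
    models = {}
    for c in sorted(set(y)):
        binary_y = [1 if t == c else 0 for t in y]
        models[c] = make_learner().fit(X, binary_y)
    return models


def one_vs_rest_predict(models, x):
    """Predict the class whose binary model assigns the highest score."""
    return max(models, key=lambda c: models[c].score(x))
```

The direct multiclass adaptations studied in the paper avoid this decomposition entirely: a single alternating tree carries one prediction value per class at each node, so no ensemble of per-class trees is needed.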

Cite

Text

Holmes et al. "Multiclass Alternating Decision Trees." European Conference on Machine Learning, 2002. doi:10.1007/3-540-36755-1_14

Markdown

[Holmes et al. "Multiclass Alternating Decision Trees." European Conference on Machine Learning, 2002.](https://mlanthology.org/ecmlpkdd/2002/holmes2002ecml-multiclass/) doi:10.1007/3-540-36755-1_14

BibTeX

@inproceedings{holmes2002ecml-multiclass,
  title     = {{Multiclass Alternating Decision Trees}},
  author    = {Holmes, Geoffrey and Pfahringer, Bernhard and Kirkby, Richard and Frank, Eibe and Hall, Mark},
  booktitle = {European Conference on Machine Learning},
  year      = {2002},
  pages     = {161--172},
  doi       = {10.1007/3-540-36755-1_14},
  url       = {https://mlanthology.org/ecmlpkdd/2002/holmes2002ecml-multiclass/}
}