MERCS: Multi-Directional Ensembles of Regression and Classification Trees
Abstract
Learning a function f(X) that predicts Y from X is the archetypal Machine Learning (ML) problem. Typically, both sets of attributes (i.e., X and Y) have to be known before a model can be trained. When this is not the case, or when functions f(X) that predict Y from X are needed for varying X and Y, this may introduce significant overhead (separate learning runs for each function). In this paper, we explore the possibility of omitting the specification of X and Y at training time altogether, by learning a multi-directional, or versatile model, which will allow prediction of any Y from any X. Specifically, we introduce a decision tree-based paradigm that generalizes the well-known Random Forests approach to allow for multi-directionality. The result of these efforts is a novel method called MERCS: Multi-directional Ensembles of Regression and Classification treeS. Experiments show the viability of the approach.
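To make the multi-directional idea concrete, here is a minimal Python sketch assuming scikit-learn decision trees and a toy discrete dataset of our own; the attribute indices, ensemble size, and predict helper are illustrative assumptions, not the authors' full MERCS algorithm. Each tree is trained in its own input-to-target direction; at query time, the trees whose target matches the requested attribute and whose inputs are among the known attributes vote on the prediction.

# Hypothetical sketch of a multi-directional tree ensemble (not the MERCS code).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Toy dataset: 200 rows, 4 discrete attributes with values in {0, 1, 2}.
data = rng.integers(0, 3, size=(200, 4))
n_attrs = data.shape[1]

# Train an ensemble in which each tree gets its own (inputs -> target) direction.
ensemble = []
for _ in range(20):
    target = int(rng.integers(n_attrs))                  # randomly chosen target attribute
    inputs = [a for a in range(n_attrs) if a != target]  # remaining attributes as inputs
    tree = DecisionTreeClassifier(max_depth=3)
    tree.fit(data[:, inputs], data[:, target])
    ensemble.append((inputs, target, tree))

def predict(known, target):
    """Predict `target` from the dict `known` (attribute index -> value) by majority
    vote over the trees that predict `target` and whose inputs are all known."""
    votes = []
    for inputs, t, tree in ensemble:
        if t == target and all(a in known for a in inputs):
            x = np.array([[known[a] for a in inputs]])
            votes.append(int(tree.predict(x)[0]))
    # Returns None if no tree in the ensemble applies to this query direction.
    return int(np.bincount(votes).argmax()) if votes else None

print(predict({0: 1, 1: 2, 3: 0}, target=2))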
Cite
Text
Van Wolputte et al. "MERCS: Multi-Directional Ensembles of Regression and Classification Trees." AAAI Conference on Artificial Intelligence, 2018. doi:10.1609/AAAI.V32I1.11735
Markdown
[Van Wolputte et al. "MERCS: Multi-Directional Ensembles of Regression and Classification Trees." AAAI Conference on Artificial Intelligence, 2018.](https://mlanthology.org/aaai/2018/wolputte2018aaai-mercs/) doi:10.1609/AAAI.V32I1.11735
BibTeX
@inproceedings{wolputte2018aaai-mercs,
title = {{MERCS: Multi-Directional Ensembles of Regression and Classification Trees}},
author = {Van Wolputte, Elia and Korneva, Evgeniya and Blockeel, Hendrik},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2018},
pages = {4276--4283},
doi = {10.1609/AAAI.V32I1.11735},
url = {https://mlanthology.org/aaai/2018/wolputte2018aaai-mercs/}
}