Globally Induced Forest: A Prepruning Compression Scheme

Abstract

Tree-based ensemble models are memory-intensive, an undesirable state of affairs given today's large datasets, memory-constrained environments, and fitting/prediction times. In this paper, we propose the Globally Induced Forest (GIF) to remedy this problem. GIF is a fast prepruning approach that builds lightweight ensembles by iteratively deepening the current forest. It mixes local and global optimization to produce accurate predictions under memory constraints in reasonable time. We show that the proposed method is more than competitive with standard tree-based ensembles under corresponding constraints, and can sometimes even surpass much larger models.
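The core idea sketched in the abstract (grow a forest one node at a time under a budget, fitting each new node's weight globally against the current ensemble) can be illustrated with a toy regression version. This is our own heavily simplified sketch, not the authors' implementation: it uses single-feature stump splits at fixed quantile thresholds, squared loss, and a residual-mean leaf weight, which is the globally optimal constant for squared loss.

```python
import numpy as np

def gif_sketch(X, y, node_budget=20, learning_rate=1.0):
    """Toy sketch of budgeted, globally induced forest growth (regression).

    Each iteration adds the one stump (feature, threshold) that most
    reduces the *global* training error of the current ensemble; leaf
    weights are residual means. Illustrative only -- all names and
    simplifications here are ours, not the paper's.
    """
    n, d = X.shape
    pred = np.zeros(n)                 # current ensemble prediction
    stumps = []                        # (feature, threshold, w_left, w_right)
    for _ in range(node_budget):
        resid = y - pred               # global residuals drive the choice
        best = None                    # (error, feature, thr, w_left, w_right)
        for j in range(d):
            for thr in np.quantile(X[:, j], [0.25, 0.5, 0.75]):
                left = X[:, j] <= thr
                if left.all() or (~left).all():
                    continue           # degenerate split, skip
                wl, wr = resid[left].mean(), resid[~left].mean()
                err = ((resid - np.where(left, wl, wr)) ** 2).sum()
                if best is None or err < best[0]:
                    best = (err, j, thr, wl, wr)
        if best is None:
            break                      # no valid split left
        _, j, thr, wl, wr = best
        stumps.append((j, thr, wl, wr))
        pred += learning_rate * np.where(X[:, j] <= thr, wl, wr)
    return stumps, pred
```

The budget caps the number of nodes (here, stumps), which is what bounds memory; the real method additionally deepens existing trees and handles classification, which this sketch omits.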

Cite

Text

Begon et al. "Globally Induced Forest: A Prepruning Compression Scheme." International Conference on Machine Learning, 2017.

Markdown

[Begon et al. "Globally Induced Forest: A Prepruning Compression Scheme." International Conference on Machine Learning, 2017.](https://mlanthology.org/icml/2017/begon2017icml-globally/)

BibTeX

@inproceedings{begon2017icml-globally,
  title     = {{Globally Induced Forest: A Prepruning Compression Scheme}},
  author    = {Begon, Jean-Michel and Joly, Arnaud and Geurts, Pierre},
  booktitle = {International Conference on Machine Learning},
  year      = {2017},
  pages     = {420--428},
  volume    = {70},
  url       = {https://mlanthology.org/icml/2017/begon2017icml-globally/}
}