ForestPrune: Compact Depth-Pruned Tree Ensembles

Abstract

Tree ensembles are powerful models that achieve excellent predictive performance, but they can grow to unwieldy sizes. These ensembles are often post-processed (pruned) to reduce their memory footprint and improve interpretability. We present ForestPrune, a novel optimization framework for post-processing tree ensembles by pruning depth layers from individual trees. Since the number of nodes in a decision tree grows exponentially with tree depth, pruning deep trees drastically compacts an ensemble. We develop a specialized optimization algorithm that efficiently obtains high-quality solutions to problems under the ForestPrune framework. The algorithm typically reaches good solutions in seconds on medium-size datasets and ensembles, with tens of thousands of rows and hundreds of trees, yielding significant speedups over existing approaches. Our experiments demonstrate that ForestPrune produces parsimonious models that outperform those extracted by existing post-processing algorithms.
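
To make the depth-pruning idea concrete, here is a minimal sketch (not the authors' implementation) that truncates every tree of a scikit-learn random forest to a fixed depth and counts the surviving nodes, illustrating how much of an ensemble's size sits in its deepest layers. The helper names (node_depths, nodes_if_truncated) and the uniform depth cutoff of 4 are illustrative assumptions; ForestPrune itself chooses a depth per tree by solving an optimization problem rather than applying one cutoff to all trees.

    # Sketch: depth-layer truncation of a sklearn forest (assumed setup,
    # not the ForestPrune algorithm itself).
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor

    def node_depths(tree):
        """Depth of every node in a fitted sklearn tree_ structure."""
        depths = np.zeros(tree.node_count, dtype=int)
        stack = [(0, 0)]  # (node_id, depth), starting at the root
        while stack:
            node, d = stack.pop()
            depths[node] = d
            left, right = tree.children_left[node], tree.children_right[node]
            if left != -1:  # internal node: visit both children
                stack.append((left, d + 1))
                stack.append((right, d + 1))
        return depths

    def nodes_if_truncated(tree, max_depth):
        """Nodes kept if all layers deeper than max_depth are pruned."""
        return int(np.sum(node_depths(tree) <= max_depth))

    X, y = make_regression(n_samples=2000, n_features=10, random_state=0)
    forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

    total = sum(est.tree_.node_count for est in forest.estimators_)
    pruned = sum(nodes_if_truncated(est.tree_, 4) for est in forest.estimators_)
    print(f"nodes before: {total}, after truncating every tree to depth 4: {pruned}")

Because node counts roughly double with each added layer, even a modest per-tree depth reduction removes most of the ensemble's nodes, which is the effect the framework exploits.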

Cite

Text

Liu and Mazumder. "ForestPrune: Compact Depth-Pruned Tree Ensembles." Artificial Intelligence and Statistics, 2023.

Markdown

[Liu and Mazumder. "ForestPrune: Compact Depth-Pruned Tree Ensembles." Artificial Intelligence and Statistics, 2023.](https://mlanthology.org/aistats/2023/liu2023aistats-forestprune/)

BibTeX

@inproceedings{liu2023aistats-forestprune,
  title     = {{ForestPrune: Compact Depth-Pruned Tree Ensembles}},
  author    = {Liu, Brian and Mazumder, Rahul},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2023},
  pages     = {9417--9428},
  volume    = {206},
  url       = {https://mlanthology.org/aistats/2023/liu2023aistats-forestprune/}
}