On the Convergence of CART Under Sufficient Impurity Decrease Condition

Abstract

The decision tree is a flexible machine-learning model that has found success in numerous applications. It is usually fitted in a recursively greedy manner using CART. In this paper, we study the convergence rate of CART under a regression setting. First, we prove an upper bound on the prediction error of CART under a sufficient impurity decrease (SID) condition (Chi et al., 2020); our result improves on the known result of Chi et al. (2020) under a similar assumption. We show via examples that this error bound cannot be improved by more than a constant or a logarithmic factor. Second, we introduce several easy-to-check sufficient conditions for the SID condition. In particular, we show that the SID condition can be satisfied by an additive model when the component functions satisfy a "locally reverse Poincaré inequality". We discuss a few familiar function classes in non-parametric estimation to demonstrate the usefulness of this notion.
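
For readers unfamiliar with the assumption, here is a rough sketch of the SID condition as formulated by Chi et al. (2020); the parameter name $\lambda$ and the exact normalization are paraphrased and should be checked against the original papers. Writing $V(t) = \mathbb{E}\big[(f^*(X) - \mathbb{E}[f^*(X) \mid X \in t])^2 \mid X \in t\big]$ for the within-node variance of the regression function $f^*$ on a node $t$, SID with parameter $\lambda \ge 1$ asks that, at every node, the best single axis-aligned split remove at least a $1/\lambda$ fraction of that variance:

\[
\sup_{j,\,c} \Big\{ V(t) - \mathbb{P}(X \in t_L \mid X \in t)\, V(t_L) - \mathbb{P}(X \in t_R \mid X \in t)\, V(t_R) \Big\} \;\ge\; \frac{V(t)}{\lambda},
\]

where $t_L = \{x \in t : x_j \le c\}$ and $t_R = \{x \in t : x_j > c\}$ are the children produced by splitting coordinate $j$ at threshold $c$.

The greedy, recursive CART fitting procedure mentioned in the abstract is available off the shelf. The minimal sketch below uses scikit-learn's DecisionTreeRegressor, whose min_impurity_decrease parameter is the empirical counterpart of the per-split impurity decrease that SID lower-bounds; the synthetic additive target is purely illustrative and not taken from the paper:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Synthetic additive-model data: y = f1(x1) + f2(x2) + noise (illustrative only).
n, d = 2000, 5
X = rng.uniform(0.0, 1.0, size=(n, d))
y = np.sin(2 * np.pi * X[:, 0]) + (X[:, 1] - 0.5) ** 2 + 0.1 * rng.standard_normal(n)

# CART regression tree, grown greedily: each node takes the split with the
# largest impurity decrease, and a split is kept only if its weighted
# impurity decrease exceeds the given threshold.
tree = DecisionTreeRegressor(min_impurity_decrease=1e-4, random_state=0)
tree.fit(X, y)

mse = np.mean((tree.predict(X) - y) ** 2)
print(f"in-sample MSE: {mse:.4f}, leaves: {tree.get_n_leaves()}")
```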

Cite

Text

Mazumder and Wang. "On the Convergence of CART Under Sufficient Impurity Decrease Condition." Neural Information Processing Systems, 2023.

Markdown

[Mazumder and Wang. "On the Convergence of CART Under Sufficient Impurity Decrease Condition." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/mazumder2023neurips-convergence/)

BibTeX

@inproceedings{mazumder2023neurips-convergence,
  title     = {{On the Convergence of CART Under Sufficient Impurity Decrease Condition}},
  author    = {Mazumder, Rahul and Wang, Haoyue},
  booktitle = {Neural Information Processing Systems},
  year      = {2023},
  url       = {https://mlanthology.org/neurips/2023/mazumder2023neurips-convergence/}
}