Optimal Sparse Recovery with Decision Stumps

Abstract

Decision trees are widely used for their low computational cost, good predictive performance, and ability to assess the importance of features. Though these methods are often used in practice for feature selection, their theoretical guarantees are not well understood. We here obtain a tight finite sample bound for the feature selection problem in linear regression using single-depth decision trees. We examine the statistical properties of these "decision stumps" for the recovery of the s active features from p total features, where s ≪ p.
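The screening procedure the abstract alludes to can be sketched as follows: fit a depth-1 regression tree (a stump) on each feature separately, score each feature by the impurity (variance) reduction of its best split, and keep the s highest-scoring features. This is an illustrative numpy-only sketch, not the authors' code; the function names and the synthetic data are my own.

```python
import numpy as np

def stump_score(x, y):
    """Best impurity reduction achievable by a single split on feature x.

    Illustrative scoring rule: variance reduction of the best depth-1
    regression-tree split, normalized by the sample size.
    """
    order = np.argsort(x)
    ys = y[order]
    n = len(ys)
    # Prefix sums let us evaluate every split threshold in O(n).
    csum = np.cumsum(ys)
    total = csum[-1]
    left_n = np.arange(1, n)      # left-child sizes for each candidate split
    right_n = n - left_n
    left_mean = csum[:-1] / left_n
    right_mean = (total - csum[:-1]) / right_n
    # Variance reduction = between-group sum of squares of the split.
    gain = left_n * (left_mean - y.mean())**2 + right_n * (right_mean - y.mean())**2
    return gain.max() / n

def select_features(X, y, s):
    """Rank features by stump score and return the indices of the top s."""
    scores = np.array([stump_score(X[:, j], y) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:s]

# Synthetic sparse linear model: y depends only on features 0 and 1 of 10.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 10))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.normal(size=2000)
print(sorted(select_features(X, y, 2)))  # recovers the active set [0, 1]
```

On this well-separated synthetic instance the two active features dominate the stump scores, so the top-2 selection recovers the true support; the paper's contribution is a tight finite-sample characterization of when such recovery is guaranteed.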

Cite

Text

Banihashem et al. "Optimal Sparse Recovery with Decision Stumps." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/AAAI.V37I6.25827

Markdown

[Banihashem et al. "Optimal Sparse Recovery with Decision Stumps." AAAI Conference on Artificial Intelligence, 2023.](https://mlanthology.org/aaai/2023/banihashem2023aaai-optimal/) doi:10.1609/AAAI.V37I6.25827

BibTeX

@inproceedings{banihashem2023aaai-optimal,
  title     = {{Optimal Sparse Recovery with Decision Stumps}},
  author    = {Banihashem, Kiarash and Hajiaghayi, Mohammad and Springer, Max},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2023},
  pages     = {6745--6752},
  doi       = {10.1609/AAAI.V37I6.25827},
  url       = {https://mlanthology.org/aaai/2023/banihashem2023aaai-optimal/}
}