Generalized Random Forests Using Fixed-Point Trees
Abstract
We propose a computationally efficient alternative to generalized random forests (GRFs) for estimating heterogeneous effects in high dimensions. While GRFs rely on a gradient-based splitting criterion that becomes computationally expensive and unstable in high dimensions, our method introduces a fixed-point approximation that eliminates the need for Jacobian estimation. This gradient-free approach preserves GRF’s theoretical guarantees of consistency and asymptotic normality while substantially improving computational efficiency. We demonstrate that our method achieves a multifold speedup over standard GRFs without compromising statistical accuracy. Experiments on both simulated and real-world data validate our approach. Our findings suggest that the proposed method is a scalable alternative for localized effect estimation in machine learning and causal inference applications.
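The following is a minimal sketch, in the notation of Athey, Tibshirani, and Wager (2019), of the contrast the abstract draws; the exact fixed-point pseudo-outcome used in the paper is not reproduced here and may differ. In standard GRF, candidate splits of a parent node $P$ are scored using gradient-based pseudo-outcomes

\[
\rho_i \;=\; -\,\xi^{\top} A_P^{-1}\, \psi_{\hat\theta_P,\hat\nu_P}(O_i),
\qquad
A_P \;=\; \frac{1}{\lvert \{i : X_i \in P\}\rvert} \sum_{i : X_i \in P} \nabla_{(\theta,\nu)}\, \psi_{\hat\theta_P,\hat\nu_P}(O_i),
\]

which requires estimating and inverting the Jacobian-type matrix $A_P$ at every parent node. A gradient-free fixed-point scheme of the kind described above would instead form pseudo-outcomes directly from the estimating function, for example $\rho_i^{\mathrm{FP}} = \xi^{\top}\, \psi_{\hat\theta_P,\hat\nu_P}(O_i)$, so that no Jacobian estimation or inversion is needed during splitting.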
Cite
Text
Fleischer et al. "Generalized Random Forests Using Fixed-Point Trees." Proceedings of the 42nd International Conference on Machine Learning, 2025.
Markdown
[Fleischer et al. "Generalized Random Forests Using Fixed-Point Trees." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/fleischer2025icml-generalized/)
BibTeX
@inproceedings{fleischer2025icml-generalized,
title = {{Generalized Random Forests Using Fixed-Point Trees}},
author = {Fleischer, David and Stephens, David A. and Yang, Archer Y.},
booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
year = {2025},
pages = {17262--17305},
volume = {267},
url = {https://mlanthology.org/icml/2025/fleischer2025icml-generalized/}
}