Oblique Bayesian Additive Regression Trees
Abstract
Current implementations of Bayesian Additive Regression Trees (BART) are based on axis-aligned decision rules that recursively partition the feature space using a single feature at a time. Several authors have demonstrated that oblique trees, whose decision rules are based on linear combinations of features, can sometimes yield better predictions than axis-aligned trees and exhibit excellent theoretical properties. We develop an oblique version of BART that leverages a data-adaptive decision rule prior that recursively partitions the feature space along random hyperplanes. Using several synthetic and real-world benchmark datasets, we systematically compare our oblique BART implementation to axis-aligned BART and other tree ensemble methods, finding that oblique BART is competitive with, and sometimes much better than, those methods.
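For intuition, the contrast between the two rule families can be sketched in a few lines of Python. This is a minimal illustration only, not the paper's implementation: the names `axis_aligned_split` and `oblique_split` are hypothetical, and the hyperplane here is drawn with Gaussian weights on a random feature subset purely for demonstration (the paper's data-adaptive prior may differ).

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(size=(100, 5))  # toy feature matrix: 100 points, 5 features

# Axis-aligned rule: branch on a single feature against a cutpoint.
def axis_aligned_split(X, j, c):
    return X[:, j] <= c  # True -> left child, False -> right child

# Oblique rule: branch on a linear combination of features,
# i.e. which side of a hyperplane each point falls on.
def oblique_split(X, w, c):
    return X @ w <= c

# Illustrative random-hyperplane draw: Gaussian weights on a random
# subset of features (zero elsewhere), with a data-adaptive cutpoint
# taken as a random quantile of the projected data.
active = rng.choice(X.shape[1], size=2, replace=False)
w = np.zeros(X.shape[1])
w[active] = rng.standard_normal(2)
c = np.quantile(X @ w, rng.uniform())

left = oblique_split(X, w, c)
print(f"{left.sum()} points go left, {(~left).sum()} go right")
```

When `w` has a single nonzero entry, the oblique rule reduces to an axis-aligned one, so the oblique family strictly generalizes the splits used by standard BART.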
Cite
Text
Nguyen et al. "Oblique Bayesian Additive Regression Trees." Transactions on Machine Learning Research, 2025.
Markdown
[Nguyen et al. "Oblique Bayesian Additive Regression Trees." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/nguyen2025tmlr-oblique/)
BibTeX
@article{nguyen2025tmlr-oblique,
  title   = {{Oblique Bayesian Additive Regression Trees}},
  author  = {Nguyen, Paul-Hieu V. and Yee, Ryan and Deshpande, Sameer},
  journal = {Transactions on Machine Learning Research},
  year    = {2025},
  url     = {https://mlanthology.org/tmlr/2025/nguyen2025tmlr-oblique/}
}