Uncertainty Estimation with Recursive Feature Machines
Abstract
In conventional regression analysis, predictions are typically represented as point estimates derived from covariates. Gaussian Processes (GPs) offer a kernel-based framework that both predicts and quantifies the associated uncertainties. However, kernel-based methods often underperform ensemble-based decision tree approaches in regression tasks involving tabular and categorical data. Recently, Recursive Feature Machines (RFMs) were proposed as a novel feature-learning kernel that strengthens the capabilities of kernel machines. In this study, we harness the power of these RFMs in a probabilistic GP-based approach to enhance uncertainty estimation through feature extraction within kernel methods. We employ this learned kernel for in-depth uncertainty analysis. On tabular datasets, our RFM-based method surpasses other leading uncertainty estimation techniques, including NGBoost and CatBoost-ensemble. Additionally, when assessing out-of-distribution performance, we find that our RFM-based approach surpasses boosting-based methods.
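As context for the abstract, the sketch below illustrates the general idea of combining a learned RFM kernel with GP predictive equations. This is not the authors' implementation: the Laplace bandwidth `L`, the ridge and noise levels, and the number of feature-learning iterations are illustrative choices, and the paper's actual integration of the learned kernel into the GP may differ.

```python
# Minimal sketch (assumptions noted above): learn a Mahalanobis feature matrix M
# with RFM-style AGOP updates, then reuse the learned kernel in GP regression.
import numpy as np

def mahalanobis_dists(X, Z, M):
    """Pairwise distances sqrt((x - z)^T M (x - z))."""
    XM = X @ M
    sq = (XM * X).sum(1)[:, None] + (Z @ M * Z).sum(1)[None, :] - 2 * XM @ Z.T
    return np.sqrt(np.clip(sq, 0.0, None))

def laplace_kernel(X, Z, M, L=10.0):
    return np.exp(-mahalanobis_dists(X, Z, M) / L)

def fit_rfm(X, y, iters=5, L=10.0, ridge=1e-3):
    """Learn M via the Average Gradient Outer Product (AGOP) of a kernel ridge fit."""
    n, d = X.shape
    M = np.eye(d)
    for _ in range(iters):
        K = laplace_kernel(X, X, M, L)
        alpha = np.linalg.solve(K + ridge * np.eye(n), y)    # kernel ridge coefficients
        D = np.maximum(mahalanobis_dists(X, X, M), 1e-8)     # avoid division by zero
        W = (K / D) * alpha[None, :]                         # per-pair gradient weights
        np.fill_diagonal(W, 0.0)                             # i = j term vanishes (x_i - x_i = 0)
        # gradient of the predictor at each training point:
        # grad f(x_i) = -(1/L) * sum_j W_ij * M (x_i - x_j)
        G = -((W.sum(1)[:, None] * X - W @ X) @ M) / L
        M = G.T @ G / n                                      # AGOP update
    return M

def gp_predict(Xtr, ytr, Xte, M, L=10.0, noise=1e-2):
    """Standard GP predictive mean and variance with the learned RFM kernel."""
    K = laplace_kernel(Xtr, Xtr, M, L) + noise * np.eye(len(Xtr))
    Ks = laplace_kernel(Xte, Xtr, M, L)
    mean = Ks @ np.linalg.solve(K, ytr)
    var = 1.0 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T)) + noise
    return mean, var
```

A typical usage would be `M = fit_rfm(Xtr, ytr)` followed by `mean, var = gp_predict(Xtr, ytr, Xte, M)`, so the feature learning and the uncertainty quantification remain separate, interchangeable steps.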
Cite
Text
Gedon et al. "Uncertainty Estimation with Recursive Feature Machines." Uncertainty in Artificial Intelligence, 2024.
Markdown
[Gedon et al. "Uncertainty Estimation with Recursive Feature Machines." Uncertainty in Artificial Intelligence, 2024.](https://mlanthology.org/uai/2024/gedon2024uai-uncertainty/)
BibTeX
@inproceedings{gedon2024uai-uncertainty,
title = {{Uncertainty Estimation with Recursive Feature Machines}},
author = {Gedon, Daniel and Abedsoltan, Amirhesam and Schön, Thomas B. and Belkin, Mikhail},
booktitle = {Uncertainty in Artificial Intelligence},
year = {2024},
pages = {1408--1437},
volume = {244},
url = {https://mlanthology.org/uai/2024/gedon2024uai-uncertainty/}
}