Fine-Tuning the Retrieval Mechanism for Tabular Deep Learning
Abstract
While interest in tabular deep learning has grown significantly, conventional tree-based models still outperform deep learning methods. To narrow this performance gap, we explore the retrieval mechanism, a methodology that allows neural networks to refer to other data points while making predictions. Our experiments reveal that retrieval-based training, especially when fine-tuning the pretrained TabPFN model, notably surpasses existing methods. Moreover, extensive pretraining plays a crucial role in enhancing the model's performance. These insights imply that combining the retrieval mechanism with pretraining and transfer learning schemes offers considerable potential for advancing tabular deep learning.
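The abstract describes the retrieval mechanism only at a high level (a network that refers to other data points when predicting). As a rough, hypothetical illustration of that idea, the sketch below lets a query row attend over a set of retrieved training rows and aggregate their labels by attention weight; all class, parameter, and variable names are assumptions for illustration and do not come from the paper or the TabPFN codebase.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RetrievalTabularClassifier(nn.Module):
    """Toy retrieval-style classifier: each query row attends over a set of
    context (training) rows and aggregates their labels by attention weight."""

    def __init__(self, num_features: int, num_classes: int, dim: int = 64):
        super().__init__()
        self.encode = nn.Sequential(
            nn.Linear(num_features, dim), nn.ReLU(), nn.Linear(dim, dim)
        )
        self.num_classes = num_classes

    def forward(self, x_query, x_context, y_context):
        # x_query: (B, F), x_context: (N, F), y_context: (N,) integer labels
        q = self.encode(x_query)    # (B, D) query embeddings
        k = self.encode(x_context)  # (N, D) context embeddings
        # Scaled dot-product attention from queries to context rows
        attn = torch.softmax(q @ k.T / k.shape[-1] ** 0.5, dim=-1)  # (B, N)
        # Soft-vote over the one-hot labels of the retrieved rows
        y_onehot = F.one_hot(y_context, self.num_classes).float()   # (N, C)
        return attn @ y_onehot  # (B, C) class probabilities

# Usage: predict labels for new rows by attending over the training set.
model = RetrievalTabularClassifier(num_features=10, num_classes=3)
x_train, y_train = torch.randn(100, 10), torch.randint(0, 3, (100,))
probs = model(torch.randn(5, 10), x_train, y_train)
```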
Cite
den Breejen et al. "Fine-Tuning the Retrieval Mechanism for Tabular Deep Learning." NeurIPS 2023 Workshops: TRL, 2023.

BibTeX
@inproceedings{denbreejen2023neuripsw-finetuning,
title = {{Fine-Tuning the Retrieval Mechanism for Tabular Deep Learning}},
author = {den Breejen, Felix and Bae, Sangmin and Cha, Stephen and Kim, Tae-Young and Koh, Seoung Hyun and Yun, Se-Young},
booktitle = {NeurIPS 2023 Workshops: TRL},
year = {2023},
url = {https://mlanthology.org/neuripsw/2023/denbreejen2023neuripsw-finetuning/}
}