IngesTables: Scalable and Efficient Training of LLM-Enabled Tabular Foundation Models
Abstract
There is a massive amount of tabular data that can be leveraged via "foundation models" to improve prediction performance on downstream tabular prediction tasks. However, building tabular foundation models faces numerous bottlenecks, including learning semantic relevance between tables and features, mismatched schemas, arbitrarily high cardinality for categorical values, and scalability to many tables, rows, and features. We propose IngesTables, a novel canonical framework for building tabular foundation models, designed to address the aforementioned challenges. IngesTables employs LLMs to encode representations of table/feature semantics and their relationships, which are then modeled via an attention-based tabular architecture. Unlike other LLM-based approaches, IngesTables is much cheaper to train and faster at inference, because of how the LLM-generated embeddings are defined and cached. We show that IngesTables achieves significant improvements over commonly-used models like XGBoost on clinical trial datasets in standard supervised learning settings, and is competitive with tabular prediction models specialized for clinical trial datasets, without incurring LLM-level cost and latency.
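To make the caching idea concrete, below is a minimal sketch (not the authors' implementation; the `llm_embed` stub, all names, and the attention layout are illustrative assumptions) of the general pattern the abstract describes: feature-name embeddings from a text encoder depend only on the schema strings, so they can be computed once, cached, and reused across rows and training steps, while a lightweight attention model runs over per-feature tokens.

```python
import torch
import torch.nn as nn

# Stand-in for an LLM text encoder (hypothetical). In practice this would
# call a real embedding model; the key property is that it is invoked at
# most once per unique string, with the result cached below.
def llm_embed(text: str, dim: int = 64) -> torch.Tensor:
    g = torch.Generator().manual_seed(hash(text) % (2**31))
    return torch.randn(dim, generator=g)

_cache: dict[str, torch.Tensor] = {}

def cached_embed(text: str) -> torch.Tensor:
    # Embeddings depend only on the feature name, not on row values,
    # so one lookup serves every row, epoch, and table that uses it.
    if text not in _cache:
        _cache[text] = llm_embed(text)
    return _cache[text]

class FeatureAttention(nn.Module):
    """Attention over per-feature tokens built from (name, value) pairs."""
    def __init__(self, dim: int = 64, heads: int = 4):
        super().__init__()
        self.value_proj = nn.Linear(1, dim)  # numeric value -> embedding
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.head = nn.Linear(dim, 1)

    def forward(self, names: list[str], values: torch.Tensor) -> torch.Tensor:
        # values: (batch, num_features); one token per feature, combining
        # the cached name embedding with a projection of the cell value.
        name_emb = torch.stack([cached_embed(n) for n in names])      # (F, D)
        tokens = name_emb.unsqueeze(0) + self.value_proj(values.unsqueeze(-1))
        out, _ = self.attn(tokens, tokens, tokens)                    # (B, F, D)
        return self.head(out.mean(dim=1)).squeeze(-1)                 # (B,)

model = FeatureAttention()
preds = model(["age", "dose_mg"], torch.randn(8, 2))
print(preds.shape)  # torch.Size([8])
```

Because the expensive LLM call never appears in the training loop, only the small attention model's cost scales with rows, which is one plausible reading of why the approach avoids LLM-level cost and latency.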
Cite
Text
Yak et al. "IngesTables: Scalable and Efficient Training of LLM-Enabled Tabular Foundation Models." NeurIPS 2023 Workshops: TRL, 2023.
Markdown
[Yak et al. "IngesTables: Scalable and Efficient Training of LLM-Enabled Tabular Foundation Models." NeurIPS 2023 Workshops: TRL, 2023.](https://mlanthology.org/neuripsw/2023/yak2023neuripsw-ingestables/)
BibTeX
@inproceedings{yak2023neuripsw-ingestables,
title = {{IngesTables: Scalable and Efficient Training of LLM-Enabled Tabular Foundation Models}},
author = {Yak, Scott and Dong, Yihe and Gonzalvo, Javier and Arik, Sercan},
booktitle = {NeurIPS 2023 Workshops: TRL},
year = {2023},
url = {https://mlanthology.org/neuripsw/2023/yak2023neuripsw-ingestables/}
}