TabFlex: Scaling Tabular Learning to Millions with Linear Attention

Abstract

Leveraging the in-context learning (ICL) capability of Large Language Models (LLMs) for tabular classification has gained significant attention for its training-free adaptability across diverse datasets. Recent advances such as TabPFN excel on small-scale tabular datasets but struggle to scale to large and complex ones. Our work enhances the efficiency and scalability of TabPFN for larger datasets by incorporating linear attention mechanisms as a scalable alternative to quadratic-complexity self-attention. Our model, TabFlex, efficiently handles tabular datasets with thousands of features and hundreds of classes, scaling seamlessly to millions of samples. For instance, TabFlex processes the poker-hand dataset, which contains over a million samples, in just 5 seconds. Our extensive evaluations demonstrate that TabFlex achieves over a 2$\times$ speedup compared to TabPFN and a 1.5$\times$ speedup over XGBoost, outperforming 25 tested baselines in efficiency across a diverse range of datasets. Furthermore, TabFlex remains highly effective on large-scale datasets, delivering strong performance at significantly reduced computational cost, especially when combined with data-efficient techniques such as dimensionality reduction and data sampling.
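
For readers unfamiliar with the mechanism the abstract refers to, below is a minimal PyTorch sketch of linear attention in the style of Katharopoulos et al. (2020). The feature map (`elu + 1`) and the exact formulation here are illustrative assumptions, not TabFlex's actual implementation; see the paper for the precise architecture.

```python
import torch
import torch.nn.functional as F

def linear_attention(q, k, v, eps=1e-6):
    """Linear attention: O(n) in sequence length, versus O(n^2) for softmax attention.

    Approximates softmax(QK^T)V with phi(Q) (phi(K)^T V), where phi is a positive
    feature map. Here phi(x) = elu(x) + 1, following Katharopoulos et al. (2020);
    TabFlex's actual choice may differ. q, k, v: (batch, seq_len, dim).
    """
    phi_q = F.elu(q) + 1  # positive feature map
    phi_k = F.elu(k) + 1
    # By associativity, compute phi(K)^T V first: cost O(n * d^2) instead of O(n^2 * d).
    kv = torch.einsum('bnd,bne->bde', phi_k, v)  # (batch, dim, dim)
    # Row-wise normalizer, replacing the softmax denominator.
    z = 1.0 / (torch.einsum('bnd,bd->bn', phi_q, phi_k.sum(dim=1)) + eps)
    return torch.einsum('bnd,bde,bn->bne', phi_q, kv, z)

# Usage: cost grows linearly in the sequence length (here, 1000).
q = k = v = torch.randn(2, 1000, 64)
out = linear_attention(q, k, v)  # (2, 1000, 64)
```

The key design point is that the (dim x dim) summary `kv` never materializes the (seq_len x seq_len) attention matrix, which is what lets ICL-style tabular models scale the in-context "training set" to millions of rows.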

Cite

Text

Zeng et al. "TabFlex: Scaling Tabular Learning to Millions with Linear Attention." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Zeng et al. "TabFlex: Scaling Tabular Learning to Millions with Linear Attention." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/zeng2025icml-tabflex/)

BibTeX

@inproceedings{zeng2025icml-tabflex,
  title     = {{TabFlex: Scaling Tabular Learning to Millions with Linear Attention}},
  author    = {Zeng, Yuchen and Dinh, Tuan and Kang, Wonjun and Mueller, Andreas C},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {74051--74079},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/zeng2025icml-tabflex/}
}