Mitra: Mixed Synthetic Priors for Enhancing Tabular Foundation Models

Abstract

Since the seminal work of TabPFN, research on tabular foundation models (TFMs) based on in-context learning (ICL) has challenged long-standing paradigms in machine learning. Without seeing any real-world data, models pretrained on purely synthetic datasets generalize remarkably well to diverse real-world datasets, often using only a moderate number of in-context examples. This shifts the focus in tabular machine learning from model architecture design to the design of synthetic datasets, or, more precisely, to the prior distributions that generate them. Yet the guiding principles for prior design remain poorly understood. This work marks the first attempt to address this gap. We systematically investigate and identify key properties of synthetic priors that allow pretrained TFMs to generalize well. Based on these insights, we introduce Mitra, a TFM trained on a curated mixture of synthetic priors selected for their diversity, distinctiveness, and performance on real-world tabular data. Mitra consistently outperforms state-of-the-art TFMs, such as TabPFNv2 and TabICL, across both classification and regression benchmarks, with better sample efficiency.
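
To make the setup described above concrete, the following is a minimal Python sketch of what "pretraining data drawn from a mixture of synthetic priors" looks like in the ICL setting: each training task is sampled from one of several prior families and split into context and query examples. The prior families (linear_prior, tree_prior), mixing weights, and dimensions here are illustrative assumptions, not the actual priors or code used by Mitra.

# Minimal sketch (not the authors' code): sampling ICL pretraining tasks
# from a mixture of synthetic priors. All names and numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def linear_prior(n, d):
    """Synthetic tabular task from a random linear model with label noise."""
    X = rng.normal(size=(n, d))
    w = rng.normal(size=d)
    y = (X @ w + 0.1 * rng.normal(size=n) > 0).astype(int)
    return X, y

def tree_prior(n, d):
    """Synthetic tabular task with an axis-aligned, tree-like split."""
    X = rng.normal(size=(n, d))
    j, t = rng.integers(d), rng.normal()
    y = (X[:, j] > t).astype(int)
    return X, y

# A curated mixture would select priors and weights for diversity and
# real-world performance; these two priors and uniform weights are placeholders.
priors = [linear_prior, tree_prior]
weights = [0.5, 0.5]

def sample_icl_task(n_context=128, n_query=32, d=8):
    """Draw one synthetic dataset from the mixture and split it into
    in-context (labeled) and query (to-predict) examples."""
    prior = priors[rng.choice(len(priors), p=weights)]
    X, y = prior(n_context + n_query, d)
    return (X[:n_context], y[:n_context]), (X[n_context:], y[n_context:])

(ctx_X, ctx_y), (qry_X, qry_y) = sample_icl_task()
print(ctx_X.shape, qry_X.shape)  # (128, 8) (32, 8)

During pretraining, a TFM would repeatedly receive such (context, query) pairs and be optimized to predict the query labels from the context alone, which is what lets it later handle real tabular datasets without fine-tuning.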

Cite

Text

Zhang et al. "Mitra: Mixed Synthetic Priors for Enhancing Tabular Foundation Models." Advances in Neural Information Processing Systems, 2025.

Markdown

[Zhang et al. "Mitra: Mixed Synthetic Priors for Enhancing Tabular Foundation Models." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/zhang2025neurips-mitra/)

BibTeX

@inproceedings{zhang2025neurips-mitra,
  title     = {{Mitra: Mixed Synthetic Priors for Enhancing Tabular Foundation Models}},
  author    = {Zhang, Xiyuan and Maddix, Danielle C. and Yin, Junming and Erickson, Nick and Ansari, Abdul Fatir and Han, Boran and Zhang, Shuai and Akoglu, Leman and Faloutsos, Christos and Mahoney, Michael W. and Hu, Cuixiong and Rangwala, Huzefa and Karypis, George and Wang, Bernie},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/zhang2025neurips-mitra/}
}