MINT-1T: Scaling Open-Source Multimodal Data by 10x: A Multimodal Dataset with One Trillion Tokens
Abstract
Multimodal interleaved datasets featuring free-form interleaved sequences of images and text are crucial for training frontier large multimodal models (LMMs). Despite the rapid progression of open-source LMMs, there remains a pronounced scarcity of large-scale, open-source multimodal interleaved datasets. In response, we introduce MINT-1T, the most extensive and diverse open-source Multimodal INTerleaved dataset to date. MINT-1T comprises one trillion text tokens and 3.4 billion images, a 10x scale-up from existing open-source datasets. Additionally, we include previously untapped sources such as PDFs and ArXiv papers. As scaling multimodal interleaved datasets requires substantial engineering effort, sharing the data curation process and releasing the dataset greatly benefits the community. Our experiments show that LMMs trained on MINT-1T rival the performance of models trained on the previous leading dataset, OBELICS. We release our data at https://github.com/mlfoundations/MINT-1T.
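To make the notion of a multimodal interleaved document concrete, the sketch below shows one plausible way such a record could be represented and iterated in Python. The parallel text/image layout and the field names (`texts`, `images`) are illustrative assumptions, not MINT-1T's actual schema; the real format is documented in the repository linked above.

```python
# Hypothetical example of a single interleaved document: free-form text
# segments with images placed between them in document order.
# Field names and layout are assumptions for illustration only.
document = {
    "texts": [
        "Figure 1 shows the orbit of the comet.",
        None,  # placeholder: an image occupies this position
        "The perihelion passage occurred in 1986.",
    ],
    "images": [
        None,
        "images/comet_orbit.png",  # path or URL of the interleaved image
        None,
    ],
}

def iter_interleaved(doc):
    """Yield (kind, content) pairs in document order."""
    for text, image in zip(doc["texts"], doc["images"]):
        if image is not None:
            yield ("image", image)
        else:
            yield ("text", text)

if __name__ == "__main__":
    for kind, content in iter_interleaved(document):
        print(f"{kind}: {content}")
```

A training pipeline would typically tokenize the text segments and insert image embeddings (or image placeholder tokens) at the corresponding positions, preserving the original ordering of text and images within each document.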
Cite
Text
Awadalla et al. "MINT-1T: Scaling Open-Source Multimodal Data by 10x: A Multimodal Dataset with One Trillion Tokens." Neural Information Processing Systems, 2024. doi:10.52202/079017-1160
Markdown
[Awadalla et al. "MINT-1T: Scaling Open-Source Multimodal Data by 10x: A Multimodal Dataset with One Trillion Tokens." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/awadalla2024neurips-mint1t/) doi:10.52202/079017-1160
BibTeX
@inproceedings{awadalla2024neurips-mint1t,
  title     = {{MINT-1T: Scaling Open-Source Multimodal Data by 10x: A Multimodal Dataset with One Trillion Tokens}},
  author    = {Awadalla, Anas and Xue, Le and Lo, Oscar and Shu, Manli and Lee, Hannah and Guha, Etash and Jordan, Matt and Shen, Sheng and Awadalla, Mohamed and Savarese, Silvio and Xiong, Caiming and Xu, Ran and Choi, Yejin and Schmidt, Ludwig},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-1160},
  url       = {https://mlanthology.org/neurips/2024/awadalla2024neurips-mint1t/}
}