Model Merging in Pre-Training of Large Language Models

Abstract

Model merging has emerged as a promising technique for enhancing large language models, though its application in large-scale pre-training remains relatively unexplored. In this paper, we present a comprehensive investigation of model merging techniques during the pre-training process. Through extensive experiments with both dense and Mixture-of-Experts (MoE) architectures ranging from millions to over 100 billion parameters, we demonstrate that merging checkpoints trained with constant learning rates not only achieves significant performance improvements but also enables accurate prediction of annealing behavior. These improvements lead to both more efficient model development and significantly lower training costs. Our detailed ablation studies on merging strategies and hyperparameters provide new insights into the underlying mechanisms while uncovering novel applications. Through comprehensive experimental analysis, we offer the open-source community practical pre-training guidelines for effective model merging.
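The core operation the abstract refers to, merging checkpoints from a single pre-training run, is at heart a (weighted) average of model parameters. The sketch below is a minimal, hypothetical illustration of that idea using plain Python dicts as stand-in state dicts; it is not the paper's exact merging recipe, and the function name and data layout are assumptions for illustration.

```python
def merge_checkpoints(checkpoints, weights=None):
    """Merge checkpoints by (weighted) parameter averaging.

    `checkpoints`: list of state dicts mapping parameter names to
    lists of floats (a toy stand-in for real tensor state dicts).
    `weights`: optional mixing coefficients; defaults to a uniform
    average, as in simple moving-average-style merging.
    NOTE: illustrative sketch only, not the paper's exact method.
    """
    if weights is None:
        weights = [1.0 / len(checkpoints)] * len(checkpoints)
    assert abs(sum(weights) - 1.0) < 1e-9, "mixing weights should sum to 1"
    merged = {}
    for name in checkpoints[0]:
        n = len(checkpoints[0][name])
        merged[name] = [
            sum(w * ckpt[name][i] for w, ckpt in zip(weights, checkpoints))
            for i in range(n)
        ]
    return merged


# Example: uniformly average two checkpoints saved along one run.
ckpts = [
    {"layer.weight": [1.0, 2.0]},
    {"layer.weight": [3.0, 4.0]},
]
print(merge_checkpoints(ckpts))  # {'layer.weight': [2.0, 3.0]}
```

In practice the same averaging would be applied tensor-by-tensor to real framework state dicts; the interesting questions the paper studies are which checkpoints to merge (e.g. those trained under a constant learning rate) and with what weights.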

Cite

Text

Li et al. "Model Merging in Pre-Training of Large Language Models." Advances in Neural Information Processing Systems, 2025.

Markdown

[Li et al. "Model Merging in Pre-Training of Large Language Models." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/li2025neurips-model/)

BibTeX

@inproceedings{li2025neurips-model,
  title     = {{Model Merging in Pre-Training of Large Language Models}},
  author    = {Li, Yunshui and Ma, Yiyuan and Yan, Shen and Zhang, Chaoyi and Liu, Jing and Lu, Jianqiao and Xu, Ziwen and Chen, Mengzhao and Wang, Minrui and Zhan, Shiyi and Ma, Jin and Lai, Xunhao and Luo, Yao and Bin, Xingyan and Ren, Hongbin and Han, Mingji and Hao, Wenhao and Yi, Bairen and Liu, LingJun and Ma, Bole and Jia, Xiaoying and Xun, Zhou and Xiang, Liang and Wu, Yonghui},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/li2025neurips-model/}
}