How Distributed Collaboration Influences the Diffusion Model Training? A Theoretical Perspective
Abstract
This paper examines the theoretical performance of distributed diffusion models in environments where computational resources and data availability vary significantly across workers. Traditional analyses centered on single-worker scenarios fall short in such distributed settings, particularly when some workers are resource-constrained. This disparity in resources and data diversity challenges the assumption of accurate score-function estimation that underpins single-worker models. We establish the first generation error bound for distributed diffusion models in resource-limited settings, showing a linear dependence on the data dimension $d$ and consistency with established single-worker results. Our analysis highlights the critical role of hyperparameter selection in shaping the training dynamics, which in turn govern the quality of model generation. This study provides a streamlined theoretical approach to optimizing distributed diffusion models, paving the way for future research in this area.
Cite
Text
Qiao et al. "How Distributed Collaboration Influences the Diffusion Model Training? A Theoretical Perspective." Proceedings of the 42nd International Conference on Machine Learning, 2025.
Markdown
[Qiao et al. "How Distributed Collaboration Influences the Diffusion Model Training? A Theoretical Perspective." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/qiao2025icml-distributed/)
BibTeX
@inproceedings{qiao2025icml-distributed,
title = {{How Distributed Collaboration Influences the Diffusion Model Training? A Theoretical Perspective}},
author = {Qiao, Jing and Liu, Yu and Yuan, Yuan and Zhang, Xiao and Cai, Zhipeng and Yu, Dongxiao},
booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
year = {2025},
pages = {50171-50188},
volume = {267},
url = {https://mlanthology.org/icml/2025/qiao2025icml-distributed/}
}