Shortcut-Connected Expert Parallelism for Accelerating Mixture of Experts

Abstract

Expert parallelism has emerged as a key strategy for distributing the computational workload of sparsely gated mixture-of-experts (MoE) models across multiple devices, enabling the processing of increasingly large-scale models. However, the All-to-All communication inherent to expert parallelism poses a significant bottleneck, limiting the efficiency of MoE models. Although existing optimization methods partially mitigate this issue, they remain constrained by the sequential dependency between communication and computation operations. To address this challenge, we propose ScMoE, a novel shortcut-connected MoE architecture integrated with an overlapping parallelization strategy. ScMoE decouples communication from its conventional sequential ordering, enabling up to 100% overlap with computation. Compared to the prevalent top-2 MoE baseline, ScMoE achieves speedups of $1.49\times$ in training and $1.82\times$ in inference. Moreover, our experiments and analyses indicate that ScMoE achieves model quality comparable to, and in some instances surpassing, that of existing approaches.
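
The abstract's core idea is that a shortcut connection lets the MoE branch consume a preceding layer's representation, so its All-to-All dispatch no longer depends on the current layer's output and can run concurrently with dense computation. Below is a minimal sketch of that overlap pattern, assuming PyTorch with `torch.distributed`; it is not the authors' implementation. Top-1 gating/routing is omitted, and all names (`ScMoEBlock`, `dense_mlp`, `local_expert`, `comm_stream`) are illustrative assumptions.

import torch
import torch.nn as nn
import torch.distributed as dist


class ScMoEBlock(nn.Module):
    """Sketch of a shortcut-connected MoE block (illustrative, not the paper's code)."""

    def __init__(self, d_model: int, d_ff: int):
        super().__init__()
        # Always-on dense branch processed by every rank.
        self.dense_mlp = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
        # This rank's local expert (gating and token routing omitted for brevity).
        self.local_expert = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
        # Side CUDA stream on which the All-to-All is issued so that it can
        # proceed while the default stream runs the dense computation.
        self.comm_stream = torch.cuda.Stream()

    def forward(self, x_current: torch.Tensor, x_shortcut: torch.Tensor) -> torch.Tensor:
        # x_shortcut comes from a *preceding* layer via the shortcut connection,
        # so its dispatch has no dependency on x_current's computation.
        # (Token dimension is assumed divisible by the world size.)
        dispatched = torch.empty_like(x_shortcut)
        self.comm_stream.wait_stream(torch.cuda.current_stream())  # x_shortcut is ready
        with torch.cuda.stream(self.comm_stream):
            # Dispatch: send shortcut tokens to the ranks hosting their experts.
            dist.all_to_all_single(dispatched, x_shortcut)

        # Dense computation on the default stream overlaps with the dispatch above.
        dense_out = self.dense_mlp(x_current)

        # Wait for communication, then run the local expert on received tokens.
        torch.cuda.current_stream().wait_stream(self.comm_stream)
        expert_out = self.local_expert(dispatched)
        combined = torch.empty_like(expert_out)
        # Combine: return expert outputs to the ranks that own the tokens.
        # (In the full design this step can likewise be hidden behind subsequent
        # computation; it is left blocking here for brevity.)
        dist.all_to_all_single(combined, expert_out)

        # Merge the dense and expert branches.
        return dense_out + combined

Because the dispatch reads only the shortcut input, the scheduler is free to run it entirely under the dense MLP, which is the decoupling the abstract refers to as enabling up to 100% communication-computation overlap.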

Cite

Text

Cai et al. "Shortcut-Connected Expert Parallelism for Accelerating Mixture of Experts." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Cai et al. "Shortcut-Connected Expert Parallelism for Accelerating Mixture of Experts." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/cai2025icml-shortcutconnected/)

BibTeX

@inproceedings{cai2025icml-shortcutconnected,
  title     = {{Shortcut-Connected Expert Parallelism for Accelerating Mixture of Experts}},
  author    = {Cai, Weilin and Jiang, Juyong and Qin, Le and Cui, Junwei and Kim, Sunghun and Huang, Jiayi},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {6211--6228},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/cai2025icml-shortcutconnected/}
}