MVMoE: Multi-Task Vehicle Routing Solver with Mixture-of-Experts

Abstract

Learning to solve vehicle routing problems (VRPs) has garnered much attention. However, most neural solvers are structured for, and trained independently on, a specific problem, which makes them less generic and practical. In this paper, we aim to develop a unified neural solver that can cope with a range of VRP variants simultaneously. Specifically, we propose a multi-task vehicle routing solver with mixture-of-experts (MVMoE), which greatly enhances the model capacity without a proportional increase in computation. We further develop a hierarchical gating mechanism for the MVMoE, delivering a good trade-off between empirical performance and computational complexity. Experimentally, our method significantly improves zero-shot generalization on 10 unseen VRP variants, and shows decent results in the few-shot setting and on real-world benchmark instances. We further conduct extensive studies on the effect of MoE configurations in solving VRPs, and observe the superiority of hierarchical gating when facing out-of-distribution data. The source code is available at: https://github.com/RoyalSkye/Routing-MVMoE.
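
The abstract's central architectural idea, replacing a dense feed-forward block with sparsely gated experts so that model capacity grows without a proportional increase in per-token computation, can be illustrated with a short sketch. The PyTorch snippet below is an illustrative, assumption-laden example of a top-k gated MoE layer, not the authors' implementation (see the linked repository for that); all class and parameter names here are hypothetical, and the paper's hierarchical gating mechanism is omitted.

```python
# Minimal sketch of a sparse mixture-of-experts (MoE) feed-forward layer with
# top-k gating. Illustrative only; not the MVMoE authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMoELayer(nn.Module):
    """Routes each token (e.g. each VRP node embedding) to its top-k experts,
    so total capacity scales with the number of experts while per-token
    computation stays roughly constant."""

    def __init__(self, d_model: int, d_hidden: int, num_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, num_experts, bias=False)  # gating scores per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.ReLU(), nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_nodes, d_model) -> flatten to token level for routing
        b, n, d = x.shape
        tokens = x.reshape(-1, d)
        scores = self.gate(tokens)                            # (b*n, num_experts)
        top_vals, top_idx = scores.topk(self.top_k, dim=-1)   # select k experts per token
        weights = F.softmax(top_vals, dim=-1)                 # normalize over selected experts

        out = torch.zeros_like(tokens)
        for e, expert in enumerate(self.experts):
            mask = (top_idx == e)                             # tokens that routed to expert e
            token_ids, slot = mask.nonzero(as_tuple=True)
            if token_ids.numel() == 0:
                continue
            out[token_ids] += weights[token_ids, slot].unsqueeze(-1) * expert(tokens[token_ids])
        return out.reshape(b, n, d)


if __name__ == "__main__":
    layer = SparseMoELayer(d_model=128, d_hidden=512)
    nodes = torch.randn(8, 50, 128)   # e.g. 8 VRP instances with 50 node embeddings each
    print(layer(nodes).shape)         # torch.Size([8, 50, 128])
```

In this sketch each token is processed by only `top_k` of the experts, which is what keeps the computational cost from growing in proportion to the added capacity; the paper's hierarchical gating adds a further gating level on top of this basic scheme.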

Cite

Text

Zhou et al. "MVMoE: Multi-Task Vehicle Routing Solver with Mixture-of-Experts." International Conference on Machine Learning, 2024.

Markdown

[Zhou et al. "MVMoE: Multi-Task Vehicle Routing Solver with Mixture-of-Experts." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/zhou2024icml-mvmoe/)

BibTeX

@inproceedings{zhou2024icml-mvmoe,
  title     = {{MVMoE: Multi-Task Vehicle Routing Solver with Mixture-of-Experts}},
  author    = {Zhou, Jianan and Cao, Zhiguang and Wu, Yaoxin and Song, Wen and Ma, Yining and Zhang, Jie and Chi, Xu},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {61804--61824},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/zhou2024icml-mvmoe/}
}