Rethinking Neural Multi-Objective Combinatorial Optimization via Neat Weight Embedding

Abstract

Recent decomposition-based neural multi-objective combinatorial optimization (MOCO) methods struggle to achieve desirable performance. Even when equipped with complex learning techniques, they often suffer from significant optimality gaps on weight-specific subproblems. To address this challenge, we propose a neat weight embedding method that learns weight-specific representations, capturing the weight-instance interaction for the subproblems, which most current methods overlook. We demonstrate the potential of our method in two instantiations. First, we introduce a succinct addition model that learns weight-specific node embeddings and surpasses most existing neural methods. Second, we design an enhanced conditional attention model that simultaneously learns the weight embedding and node embeddings, yielding new state-of-the-art performance. Experimental results on classic MOCO problems verify the superiority of our method. Remarkably, our method also exhibits favorable generalization across problem sizes, even outperforming a neural method specialized for boosting size generalization.
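To give a rough intuition for the "addition model" idea described above, the following is a minimal, hypothetical sketch: a preference weight vector is linearly projected into the node-embedding space and added to every node embedding, making downstream decoding weight-specific. All names, dimensions, and the linear projection are illustrative assumptions, not the authors' actual architecture.

```python
# Hypothetical sketch of a weight-conditioned "addition model".
# A preference weight vector (one per scalarized subproblem) is
# projected to the node-embedding dimension and added to every
# node embedding via broadcasting. Purely illustrative.
import numpy as np

rng = np.random.default_rng(0)

def project_weight(weights, W, b):
    """Project an m-dim preference weight vector to d dims (assumed linear map)."""
    return weights @ W + b

def weight_specific_embeddings(node_emb, weights, W, b):
    """Add the projected weight embedding to each node embedding."""
    w_emb = project_weight(weights, W, b)   # shape (d,)
    return node_emb + w_emb                 # broadcast over the n nodes

n, d, m = 5, 8, 2                           # nodes, embedding dim, objectives
node_emb = rng.normal(size=(n, d))          # instance node embeddings (e.g. from an encoder)
weights = np.array([0.3, 0.7])              # preference weights for one subproblem
W = rng.normal(size=(m, d)) * 0.1           # toy projection parameters
b = np.zeros(d)

cond = weight_specific_embeddings(node_emb, weights, W, b)
print(cond.shape)  # (5, 8)
```

The conditional attention model mentioned in the abstract would instead inject the weight embedding inside the attention computation rather than by simple addition; this sketch only covers the simpler instantiation.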

Cite

Text

Chen et al. "Rethinking Neural Multi-Objective Combinatorial Optimization via Neat Weight Embedding." International Conference on Learning Representations, 2025.

Markdown

[Chen et al. "Rethinking Neural Multi-Objective Combinatorial Optimization via Neat Weight Embedding." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/chen2025iclr-rethinking/)

BibTeX

@inproceedings{chen2025iclr-rethinking,
  title     = {{Rethinking Neural Multi-Objective Combinatorial Optimization via Neat Weight Embedding}},
  author    = {Chen, Jinbiao and Cao, Zhiguang and Wang, Jiahai and Wu, Yaoxin and Qin, Hanzhang and Zhang, Zizhen and Gong, Yue-Jiao},
  booktitle = {International Conference on Learning Representations},
  year      = {2025},
  url       = {https://mlanthology.org/iclr/2025/chen2025iclr-rethinking/}
}