Select and Optimize: Learning to Solve Large-Scale TSP Instances

Abstract

Learning-based algorithms for the Traveling Salesman Problem (TSP) have gained popularity in recent years, but most existing work cannot solve very large-scale TSP instances within a limited time budget. To address this problem, this paper introduces a method that selects and locally optimizes sub-parts of a solution. Concretely, we design a novel framework that generalizes a small-scale selector-and-optimizer network to large-scale TSP instances by iteratively selecting and optimizing one sub-problem at a time. At each iteration, the running time of sub-problem sampling and selection is significantly reduced by making full use of parallel computing. Our neural model is designed to exploit the characteristics of the sub-problems. Furthermore, we introduce a destroy-and-repair strategy that helps the iterative algorithm escape local minima from a global perspective. Extensive experiments show that our method is more than 2x faster than state-of-the-art learning-based algorithms while achieving better solution quality on large-scale TSP instances ranging in size from 200 to 20,000 nodes.
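The iterative scheme described above can be sketched in plain Python. This is a hypothetical illustration only: the paper uses a learned selector-and-optimizer network and parallel sub-problem sampling, whereas here a random window selector, a 2-opt local optimizer, and a cheapest-insertion repair stand in as simple placeholders. All function names (`select_and_optimize`, `optimize_segment`, `destroy_and_repair`) are invented for this sketch.

```python
# Hypothetical sketch of the select-and-optimize loop from the abstract.
# The paper's learned selector/optimizer is replaced by a random window
# selector and a 2-opt heuristic; destroy-and-repair is a simple
# remove-and-reinsert perturbation to escape local minima.
import math
import random

def tour_length(tour, pts):
    """Total length of a cyclic tour over 2-D points."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def optimize_segment(tour, pts, start, k):
    """Locally optimize the sub-path tour[start:start+k] with 2-opt,
    keeping its two endpoints fixed so the rest of the tour is unaffected."""
    seg = tour[start:start + k]
    improved = True
    while improved:
        improved = False
        for i in range(1, len(seg) - 2):
            for j in range(i + 1, len(seg) - 1):
                a, b = pts[seg[i - 1]], pts[seg[i]]
                c, d = pts[seg[j]], pts[seg[j + 1]]
                # Reverse seg[i..j] only if it strictly shortens the path.
                if (math.dist(a, c) + math.dist(b, d)
                        < math.dist(a, b) + math.dist(c, d) - 1e-12):
                    seg[i:j + 1] = reversed(seg[i:j + 1])
                    improved = True
    tour[start:start + k] = seg

def destroy_and_repair(tour, pts, m, rng):
    """Perturb the tour globally: remove m random cities, then re-insert
    each at its cheapest position between consecutive tour cities."""
    removed = rng.sample(tour, m)
    partial = [c for c in tour if c not in set(removed)]
    for c in removed:
        pos = min(range(len(partial)),
                  key=lambda i: math.dist(pts[partial[i - 1]], pts[c])
                              + math.dist(pts[c], pts[partial[i]])
                              - math.dist(pts[partial[i - 1]], pts[partial[i]]))
        partial.insert(pos, c)
    return partial

def select_and_optimize(pts, k=12, iters=300, seed=0):
    """Repeatedly sample a sub-problem (a window of k consecutive tour
    positions) and locally optimize it; since every accepted 2-opt move
    strictly improves, tour length is monotonically non-increasing."""
    rng = random.Random(seed)
    n = len(pts)
    tour = list(range(n))
    for _ in range(iters):
        start = rng.randrange(0, n - k)
        optimize_segment(tour, pts, start, k)
    return tour
```

In practice one would alternate `select_and_optimize` sweeps with occasional `destroy_and_repair` perturbations, re-optimizing after each perturbation; in the paper, sub-problem selection is learned rather than random and the sampling is parallelized.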

Cite

Text

Cheng et al. "Select and Optimize: Learning to Solve Large-Scale TSP Instances." Artificial Intelligence and Statistics, 2023.

Markdown

[Cheng et al. "Select and Optimize: Learning to Solve Large-Scale TSP Instances." Artificial Intelligence and Statistics, 2023.](https://mlanthology.org/aistats/2023/cheng2023aistats-select/)

BibTeX

@inproceedings{cheng2023aistats-select,
  title     = {{Select and Optimize: Learning to Solve Large-Scale TSP Instances}},
  author    = {Cheng, Hanni and Zheng, Haosi and Cong, Ya and Jiang, Weihao and Pu, Shiliang},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2023},
  pages     = {1219--1231},
  volume    = {206},
  url       = {https://mlanthology.org/aistats/2023/cheng2023aistats-select/}
}