Quantization Based Optimization : Alternative Stochastic Approximation of Global Optimization

Abstract

In this study, we propose a global optimization algorithm based on quantizing the energy level of an objective function in an NP-hard problem. Under the white noise hypothesis, which holds when the quantization error is densely and uniformly distributed, we can regard the quantization error as i.i.d. white noise. Stochastic analysis then shows that the proposed algorithm converges weakly under Lipschitz continuity alone, rather than requiring local convergence properties such as Hessian constraints on the objective function. This implies that the proposed algorithm ensures global optimization via Laplace's condition. Numerical experiments show that the proposed algorithm outperforms conventional learning methods in solving NP-hard optimization problems such as the traveling salesman problem.
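The core idea above can be illustrated with a minimal sketch: quantize the objective's energy level with a coarse step at first, so that small barriers between local minima become invisible to the acceptance test, then refine the quantization step over time. This is an illustrative toy, not the authors' actual algorithm; the function names (`quantize`, `quantized_search`), the Rastrigin test function, and the refinement schedule are all assumptions made for the example.

```python
import math
import random

def quantize(v, p):
    # Round v to the nearest multiple of the quantization step 2**-p.
    s = 2.0 ** -p
    return s * math.floor(v / s + 0.5)

def rastrigin(x):
    # 1-D Rastrigin function: many local minima, global minimum f(0) = 0.
    return 10.0 + x * x - 10.0 * math.cos(2.0 * math.pi * x)

def quantized_search(f, x0, p_levels=range(0, 13), iters=300, sigma=0.5, seed=0):
    # Hypothetical sketch: random local search on the quantized energy level,
    # with the quantization step 2**-p shrinking across levels (coarse -> fine).
    rng = random.Random(seed)
    x, best_x = x0, x0
    for p in p_levels:
        fq = quantize(f(x), p)
        for _ in range(iters):
            cand = x + rng.gauss(0.0, sigma)      # random local proposal
            if quantize(f(cand), p) <= fq:        # accept if quantized energy does not rise
                x, fq = cand, quantize(f(cand), p)
                if f(x) < f(best_x):              # track the best point seen so far
                    best_x = x
    return best_x

best = quantized_search(rastrigin, x0=3.3)
```

With a coarse step (p = 0, step size 1), many distinct objective values quantize to the same level, so the search can drift across shallow barriers; as p grows, the acceptance test becomes strict and the search settles into the deepest basin it has found, loosely mirroring an annealing schedule.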

Cite

Text

Seok and Cho. "Quantization Based Optimization : Alternative Stochastic Approximation of Global Optimization." NeurIPS 2022 Workshops: OPT, 2022.

Markdown

[Seok and Cho. "Quantization Based Optimization : Alternative Stochastic Approximation of Global Optimization." NeurIPS 2022 Workshops: OPT, 2022.](https://mlanthology.org/neuripsw/2022/seok2022neuripsw-quantization/)

BibTeX

@inproceedings{seok2022neuripsw-quantization,
  title     = {{Quantization Based Optimization : Alternative Stochastic Approximation of Global Optimization}},
  author    = {Seok, Jinwuk and Cho, Changsik},
  booktitle = {NeurIPS 2022 Workshops: OPT},
  year      = {2022},
  url       = {https://mlanthology.org/neuripsw/2022/seok2022neuripsw-quantization/}
}