Sample-and-Bound for Non-Convex Optimization

Abstract

Standard approaches to global optimization of non-convex functions, such as branch-and-bound, maintain partition trees to systematically prune the domain; the size of these trees grows exponentially with the number of dimensions. We propose new sampling-based methods for non-convex optimization that adapt Monte Carlo Tree Search (MCTS) to improve efficiency. Instead of the standard use of visitation counts in Upper Confidence Bounds, we utilize numerical overapproximations of the objective as an uncertainty metric, and also take into account sampled estimates of first-order and second-order information. The Monte Carlo tree in our approach avoids the usual fixed combinatorial patterns for growing the tree and aggressively zooms into promising regions, while still balancing exploration and exploitation. We evaluate the proposed algorithms on high-dimensional non-convex optimization benchmarks against competitive baselines and analyze the effects of the hyperparameters.
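To illustrate the idea of replacing visit counts with bound-based uncertainty, here is a minimal sketch (not the paper's actual algorithm; function names, the scoring form, and the constant `c` are illustrative assumptions). It contrasts classic UCB1 node selection with a hypothetical score whose exploration term is the gap between the best sampled value in a region and a numerical lower bound on the objective there:

```python
import math

def ucb_visits(mean_reward, parent_visits, child_visits, c=1.414):
    # Classic UCB1: the exploration bonus shrinks as a node is visited more,
    # regardless of how well the objective is actually understood there.
    return mean_reward + c * math.sqrt(math.log(parent_visits) / child_visits)

def ucb_bound_gap(best_sample, region_lower_bound, c=1.0):
    # Hypothetical bound-based variant (minimization): the exploration term
    # is the gap between the best sampled objective value in a region and a
    # certified numerical lower bound (overapproximation) on that region.
    # A large gap means the region is still uncertain and worth exploring;
    # a tight gap means it is nearly resolved and can be deprioritized.
    gap = best_sample - region_lower_bound
    return -best_sample + c * gap  # higher score -> select this node next
```

With this score, a region whose lower bound is far below its best observed sample keeps attracting exploration even after many visits, whereas visit-count UCB would steadily stop exploring it.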

Cite

Text

Zhai et al. "Sample-and-Bound for Non-Convex Optimization." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/AAAI.V38I18.30074

Markdown

[Zhai et al. "Sample-and-Bound for Non-Convex Optimization." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/zhai2024aaai-sample/) doi:10.1609/AAAI.V38I18.30074

BibTeX

@inproceedings{zhai2024aaai-sample,
  title     = {{Sample-and-Bound for Non-Convex Optimization}},
  author    = {Zhai, Yaoguang and Qin, Zhizhen and Gao, Sicun},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2024},
  pages     = {20847--20855},
  doi       = {10.1609/AAAI.V38I18.30074},
  url       = {https://mlanthology.org/aaai/2024/zhai2024aaai-sample/}
}