Scaling Simultaneous Optimistic Optimization for High-Dimensional Non-Convex Functions with Low Effective Dimensions

Abstract

Simultaneous optimistic optimization (SOO) is a recently proposed global optimization method with a strong theoretical foundation. Previous studies have shown that SOO performs well on low-dimensional optimization problems; however, its performance is unsatisfactory when the dimensionality is high. This paper adapts random embedding to scale SOO, resulting in the RESOO algorithm. We prove that the simple regret of RESOO depends only on the effective dimension of the problem, whereas that of SOO depends on the dimension of the solution space. Empirically, on high-dimensional non-convex test functions as well as hyper-parameter tuning tasks for multi-class support vector machines, RESOO shows significantly better performance than SOO.
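The random-embedding idea the abstract refers to can be illustrated with a minimal sketch: draw a random matrix A mapping a low-dimensional search space into the original high-dimensional space, then optimize the composed function g(y) = f(Ay) in the low-dimensional space. The sketch below is not the paper's RESOO algorithm; plain random search stands in for the SOO sub-solver, and the test function, function names, and search box are illustrative assumptions.

```python
import numpy as np

def f_high_dim(x):
    """Toy D-dimensional objective that depends only on its first two
    coordinates, i.e. it has a low effective dimension (d_e = 2)."""
    return (x[0] - 0.3) ** 2 + (x[1] + 0.5) ** 2

def random_embedding_optimize(f, D, d, n_evals=2000, low=-3.0, high=3.0, seed=0):
    """Minimize f over R^D by searching a random d-dimensional subspace:
    sample A in R^{D x d} with Gaussian entries and optimize g(y) = f(A @ y).
    Random search is used here as a stand-in for the low-dimensional solver."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((D, d))       # random embedding matrix
    best_y, best_val = None, np.inf
    for _ in range(n_evals):
        y = rng.uniform(low, high, size=d)  # candidate in the low-dim box
        val = f(A @ y)                      # evaluate in the original space
        if val < best_val:
            best_y, best_val = y, val
    return A @ best_y, best_val

x_best, f_best = random_embedding_optimize(f_high_dim, D=1000, d=2)
print(f_best)
```

The point of the sketch is that the search cost depends on the low dimension d, not on the ambient dimension D = 1000, mirroring the paper's claim that RESOO's simple regret depends only on the effective dimension.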

Cite

Text

Qian and Yu. "Scaling Simultaneous Optimistic Optimization for High-Dimensional Non-Convex Functions with Low Effective Dimensions." AAAI Conference on Artificial Intelligence, 2016. doi:10.1609/AAAI.V30I1.10288

Markdown

[Qian and Yu. "Scaling Simultaneous Optimistic Optimization for High-Dimensional Non-Convex Functions with Low Effective Dimensions." AAAI Conference on Artificial Intelligence, 2016.](https://mlanthology.org/aaai/2016/qian2016aaai-scaling/) doi:10.1609/AAAI.V30I1.10288

BibTeX

@inproceedings{qian2016aaai-scaling,
  title     = {{Scaling Simultaneous Optimistic Optimization for High-Dimensional Non-Convex Functions with Low Effective Dimensions}},
  author    = {Qian, Hong and Yu, Yang},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2016},
  pages     = {2000--2006},
  doi       = {10.1609/AAAI.V30I1.10288},
  url       = {https://mlanthology.org/aaai/2016/qian2016aaai-scaling/}
}