Improved Dimensionality Dependence for Zeroth-Order Optimisation over Cross-Polytopes
Abstract
This work proposes an algorithm with improved dimensionality dependence for gradient-free optimisation over cross-polytopes, a setting with applications in adversarial attacks, explainable AI, and sparse regression. For bandit convex optimisation with two-point feedback over cross-polytopes, the state-of-the-art algorithms have a dimensionality dependence of $\mathcal{O}(\sqrt{d\log d})$, while the known lower bound is $\Omega(\sqrt{d/\log d})$. We propose a mirror descent algorithm equipped with a symmetric version of the negative $\frac{1}{2}$-Tsallis entropy. Combined with an $\ell_1$-ellipsoidal smoothing-based gradient estimator, the proposed algorithm guarantees a dimensionality dependence of $\mathcal{O}(\sqrt{d})$, improving on the state of the art by a factor of $\sqrt{\log d}$. The idea further applies to optimising non-smooth and non-convex functions, for which we propose an algorithm whose convergence has a dimensionality dependence of $\mathcal{O}(d)$, the best known.
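To make the two-point feedback setting concrete, the sketch below implements a gradient estimator based on smoothing over the unit $\ell_1$ ball, the kind of construction the abstract refers to. This is a minimal illustration, not the paper's exact estimator: the scaling $d/(2\delta)$ and the $\mathrm{sign}(u)$ direction follow from the standard divergence-theorem argument for $\ell_1$-ball smoothing, and the helper names sample_l1_sphere and two_point_l1_gradient are hypothetical.

import numpy as np

def sample_l1_sphere(d, rng):
    # Uniform (surface measure) sample from the boundary of the unit
    # l1 ball: normalised exponentials give a flat Dirichlet on the
    # probability simplex; independent random signs spread the sample
    # symmetrically across all 2^d facets of the cross-polytope.
    e = rng.exponential(size=d)
    s = rng.choice([-1.0, 1.0], size=d)
    return s * e / e.sum()

def two_point_l1_gradient(f, x, delta, rng):
    # Two-point estimator for the gradient of the smoothed function
    # f_delta(x) = E_{v ~ Unif(B_1)} f(x + delta * v). By the divergence
    # theorem, grad f_delta(x) = (d/delta) * E_u[f(x + delta*u) sign(u)]
    # with u uniform on the l1 sphere; the symmetric difference below is
    # an unbiased estimate of that smoothed gradient using two queries.
    d = x.shape[0]
    u = sample_l1_sphere(d, rng)
    return (d / (2.0 * delta)) * (f(x + delta * u) - f(x - delta * u)) * np.sign(u)

# Example usage on a smooth test objective (illustrative only):
rng = np.random.default_rng(0)
f = lambda x: np.sum(x ** 2)
g = two_point_l1_gradient(f, np.ones(5) / 5.0, delta=1e-3, rng=rng)

In the paper's full algorithm this estimator would be fed into a mirror descent update over the cross-polytope whose regulariser is the symmetric negative $\frac{1}{2}$-Tsallis entropy; that update is not sketched here, since its exact form is specific to the paper.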
Cite
Text
Shao. "Improved Dimensionality Dependence for Zeroth-Order Optimisation over Cross-Polytopes." International Conference on Machine Learning, 2024.Markdown
[Shao. "Improved Dimensionality Dependence for Zeroth-Order Optimisation over Cross-Polytopes." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/shao2024icml-improved/)BibTeX
@inproceedings{shao2024icml-improved,
title = {{Improved Dimensionality Dependence for Zeroth-Order Optimisation over Cross-Polytopes}},
author = {Shao, Weijia},
booktitle = {International Conference on Machine Learning},
year = {2024},
pages = {44413--44428},
volume = {235},
url = {https://mlanthology.org/icml/2024/shao2024icml-improved/}
}