Zeroth-Order Stochastic Approximation Algorithms for DR-Submodular Optimization
Abstract
In this paper, we study approximation algorithms for several classes of DR-submodular optimization problems, where DR is short for diminishing returns. Following a recently introduced framework for zeroth-order stochastic approximation methods, we first propose the algorithms {\bf CG-ZOSA} and {\bf RG-ZOSA} for smooth DR-submodular optimization, based on the coordinate-wise gradient estimator and the randomized gradient estimator, respectively. Our theoretical analysis shows that {\bf CG-ZOSA} reaches a solution whose expected objective value exceeds $(1-e^{-1}-\epsilon^{2})\mathrm{OPT}-\epsilon$ after $\mathcal{O}(\epsilon^{-2})$ iterations and $\mathcal{O}(N^{2/3}d\epsilon^{-2})$ oracle calls, where $d$ denotes the problem dimension. {\bf RG-ZOSA} improves the approximation ratio to $(1-e^{-1}-\epsilon^{2}/d)$ while maintaining the same overall oracle complexity. For non-smooth up-concave maximization problems, we propose a novel auxiliary function based on a smoothed objective function and introduce the {\bf NZOSA} algorithm, which achieves an approximation ratio of $(1-e^{-1}-\epsilon \ln \epsilon^{-1}-\epsilon^{2}\ln \epsilon^{-1})$ within $\mathcal{O}(d\epsilon^{-2})$ iterations and $\mathcal{O}(N^{2/3}d^{3/2}\epsilon^{-3})$ oracle calls. We also extend {\bf NZOSA} to handle a class of robust DR-submodular maximization problems. To validate the effectiveness of the proposed algorithms, we conduct experiments on both synthetic and real-world problems; the results demonstrate the superior performance and efficiency of our methods in solving DR-submodular optimization problems.
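The two smooth-case algorithms build on standard zeroth-order gradient estimators of the kind the abstract names: a coordinate-wise finite-difference estimator ($2d$ function evaluations per estimate) and a randomized estimator that probes a single random direction on the unit sphere. The sketch below is a generic illustration of these two estimator families, not the paper's exact construction; the function names, the smoothing radius `mu`, and the Gaussian-then-normalize sampling are our illustrative choices.

```python
import numpy as np


def coordinate_gradient_estimate(f, x, mu=1e-4):
    """Coordinate-wise zeroth-order estimate: one central finite
    difference per coordinate, 2*d evaluations of f in total.
    (Illustrative sketch; step size mu is a tuning parameter.)"""
    d = x.shape[0]
    g = np.zeros(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = 1.0
        g[i] = (f(x + mu * e) - f(x - mu * e)) / (2.0 * mu)
    return g


def randomized_gradient_estimate(f, x, mu=1e-4, rng=None):
    """Randomized zeroth-order estimate from a single direction u drawn
    uniformly on the unit sphere; only 2 evaluations of f. Unbiased for
    the gradient of the sphere-smoothed surrogate of f."""
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[0]
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)  # uniform direction on the unit sphere
    return d * (f(x + mu * u) - f(x - mu * u)) / (2.0 * mu) * u
```

The trade-off mirrors the abstract: the coordinate-wise estimator is low-variance but costs $2d$ oracle calls per iteration, while the randomized estimator costs $2$ calls but its single-sample variance scales with $d$, which is reflected in the $\epsilon^{2}/d$ term of the RG-ZOSA ratio.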
Cite

Lian et al. "Zeroth-Order Stochastic Approximation Algorithms for DR-Submodular Optimization." Journal of Machine Learning Research, 2024.
BibTeX
@article{lian2024jmlr-zerothorder,
title = {{Zeroth-Order Stochastic Approximation Algorithms for DR-Submodular Optimization}},
author = {Lian, Yuefang and Wang, Xiao and Xu, Dachuan and Zhao, Zhongrui},
journal = {Journal of Machine Learning Research},
year = {2024},
  pages = {1--55},
volume = {25},
url = {https://mlanthology.org/jmlr/2024/lian2024jmlr-zerothorder/}
}