Black Box Submodular Maximization: Discrete and Continuous Settings

Abstract

In this paper, we consider the problem of black-box continuous submodular maximization, in which only function values are available and no derivative information is provided. For a monotone continuous DR-submodular function, subject to a bounded convex body constraint, we propose Black-box Continuous Greedy, a derivative-free algorithm that provably achieves the tight $[(1-1/e)OPT-\epsilon]$ approximation guarantee with $O(d/\epsilon^3)$ function evaluations. We then extend our result to the stochastic setting, where function values are subject to stochastic zero-mean noise. It is through this stochastic generalization that we revisit the discrete submodular maximization problem and use the multilinear extension as a bridge between the discrete and continuous settings. Finally, we extensively evaluate the performance of our algorithm on continuous and discrete submodular objective functions using both synthetic and real data.
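The core idea can be sketched as a Frank-Wolfe-style continuous greedy that replaces exact gradients with estimates built purely from function evaluations. The sketch below is illustrative, not the paper's exact algorithm: it uses a simple coordinate-wise finite-difference estimator (the paper analyzes a smoothed estimator), a toy multilinear-extension-style objective, and an assumed cardinality-type polytope constraint $\{x \in [0,1]^d : \sum_i x_i \le k\}$; the weights `p` and all parameter values are hypothetical.

```python
import numpy as np

# Illustrative monotone DR-submodular objective (hypothetical weights): the
# multilinear extension of a simple coverage-style set function.
p = np.array([0.9, 0.8, 0.2, 0.1])

def f(x):
    return 1.0 - np.prod(1.0 - p * x)

def estimate_gradient(f, x, d, delta=1e-4):
    """Derivative-free gradient estimate via one-sided finite differences
    (d + 1 function evaluations). The paper's analysis uses a smoothed
    estimator instead; this simpler scheme just illustrates the idea."""
    fx = f(x)
    g = np.zeros(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = delta
        g[i] = (f(x + e) - fx) / delta
    return g

def black_box_continuous_greedy(f, d, k, T=50):
    """Frank-Wolfe-style continuous greedy over {x in [0,1]^d : sum(x) <= k},
    using only function evaluations (no true gradients)."""
    x = np.zeros(d)
    for _ in range(T):
        g = estimate_gradient(f, x, d)
        # Linear maximization oracle for this polytope: put unit mass on the
        # k coordinates with the largest positive estimated gradients.
        v = np.zeros(d)
        top = np.argsort(-g)[:k]
        v[top[g[top] > 0]] = 1.0
        x = np.minimum(x + v / T, 1.0)
    return x

x_star = black_box_continuous_greedy(f, d=4, k=2)
```

With monotone DR-submodular objectives, moving along the linear-oracle direction in small steps of size $1/T$ is what yields the $(1-1/e)$ factor; the accuracy of the zeroth-order gradient estimate governs the $\epsilon$ term and the $O(d/\epsilon^3)$ evaluation count.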

Cite

Text

Chen et al. "Black Box Submodular Maximization: Discrete and Continuous Settings." Artificial Intelligence and Statistics, 2020.

Markdown

[Chen et al. "Black Box Submodular Maximization: Discrete and Continuous Settings." Artificial Intelligence and Statistics, 2020.](https://mlanthology.org/aistats/2020/chen2020aistats-black/)

BibTeX

@inproceedings{chen2020aistats-black,
  title     = {{Black Box Submodular Maximization: Discrete and Continuous Settings}},
  author    = {Chen, Lin and Zhang, Mingrui and Hassani, Hamed and Karbasi, Amin},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2020},
  pages     = {1058--1070},
  volume    = {108},
  url       = {https://mlanthology.org/aistats/2020/chen2020aistats-black/}
}