Efficient Submodular Optimization Under Noise: Local Search Is Robust

Abstract

The problem of monotone submodular maximization has been studied extensively due to its wide range of applications. In many cases, however, one can only access the objective function in a distorted or noisy form, because of uncertainty or errors in the evaluation process. This paper considers the problem of constrained monotone submodular maximization with noisy oracles, introduced by Hassidim and Singer (2017). For a cardinality constraint, we propose an algorithm achieving a near-optimal (1-1/e-O(epsilon))-approximation guarantee (for arbitrary epsilon > 0) with only a polynomial number of queries to the noisy value oracle, improving on the exponential query complexity of Singer and Hassidim (2018). For general matroid constraints, we give the first constant-factor approximation algorithm in the presence of noise. Our main techniques are a novel local search framework that can handle the effect of noise, and the construction of certain smoothing surrogate functions for noise reduction.
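To illustrate the two ingredients the abstract names, here is a minimal, hypothetical sketch of swap-based local search under a cardinality constraint, where repeated noisy queries are averaged as a simple stand-in for the paper's smoothing surrogate (the noise model, sample count, and improvement threshold below are illustrative assumptions, not the paper's actual construction):

```python
import random

def noisy_oracle(f, S, noise=0.05):
    # Illustrative multiplicative i.i.d. noise on the true value f(S).
    return f(S) * (1 + random.uniform(-noise, noise))

def smoothed_value(f, S, samples=50):
    # Surrogate: average repeated noisy queries to reduce variance
    # (a stand-in for the paper's smoothing surrogate functions).
    return sum(noisy_oracle(f, S) for _ in range(samples)) / samples

def local_search(f, ground_set, k, eps=0.01, samples=50):
    """Swap-based local search for max f(S) subject to |S| <= k,
    accepting a swap only if it improves the smoothed estimate
    by a (1 + eps) factor, to guard against noise-induced swaps."""
    S = set(random.sample(sorted(ground_set), k))
    improved = True
    while improved:
        improved = False
        for out in list(S):
            for inc in ground_set - S:
                T = (S - {out}) | {inc}
                if smoothed_value(f, T, samples) > (1 + eps) * smoothed_value(f, S, samples):
                    S = T
                    improved = True
                    break
            if improved:
                break
    return S
```

For example, running this on a small coverage function (a canonical monotone submodular objective) recovers a near-optimal size-k subset despite the noisy evaluations.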

Cite

Text

Huang et al. "Efficient Submodular Optimization Under Noise: Local Search Is Robust." Neural Information Processing Systems, 2022.

Markdown

[Huang et al. "Efficient Submodular Optimization Under Noise: Local Search Is Robust." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/huang2022neurips-efficient/)

BibTeX

@inproceedings{huang2022neurips-efficient,
  title     = {{Efficient Submodular Optimization Under Noise: Local Search Is Robust}},
  author    = {Huang, Lingxiao and Wang, Yuyi and Yang, Chunxue and Zhou, Huanjian},
  booktitle = {Neural Information Processing Systems},
  year      = {2022},
  url       = {https://mlanthology.org/neurips/2022/huang2022neurips-efficient/}
}