A near Linear Query Lower Bound for Submodular Maximization

Abstract

We revisit the problem of selecting $k$-out-of-$n$ elements with the goal of optimizing an objective function, and ask whether it can be solved approximately with sublinear query complexity. For objective functions that are monotone submodular, [Li, Feldman, Kazemi, Karbasi, NeurIPS’22; Kuhnle, AISTATS’21] gave an $\Omega(n/k)$ query lower bound for approximating the optimum to within any constant factor. We strengthen their lower bound to a nearly tight $\tilde{\Omega}(n)$. This lower bound holds even for estimating the value of the optimal subset. When the objective function is additive, we prove that finding an approximately optimal subset still requires near-linear query complexity, but we can estimate the value of the optimal subset in $\tilde{O}(n/k)$ queries, and we show that this is tight up to polylog factors.
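For intuition on the additive case, the flavor of an $\tilde{O}(n/k)$-query value estimator can be sketched as follows: sample a small number of element values uniformly, keep the top $k/n$ fraction of the sample, and rescale. This is an illustrative toy sketch only, not the paper's construction; the sample size `m` and tolerance below are assumptions chosen for the demo.

```python
import random


def estimate_topk_sum(values, k, m, rng):
    """Estimate the sum of the k largest of n values from m uniform samples.

    Keeps the top m*k/n sampled values and rescales their sum by n/m.
    Illustrative sketch only -- not the construction from the paper.
    """
    n = len(values)
    sample = [values[rng.randrange(n)] for _ in range(m)]
    sample.sort(reverse=True)
    keep = max(1, round(m * k / n))  # expected number of samples among the top k
    return (n / m) * sum(sample[:keep])


rng = random.Random(0)
n, k = 100_000, 1_000
values = [rng.random() for _ in range(n)]

true_topk = sum(sorted(values, reverse=True)[:k])
est = estimate_topk_sum(values, k, m=20_000, rng=rng)  # m ~ (n/k) * polylog
```

With `m` on the order of $(n/k)\,\mathrm{polylog}(n)$ samples, standard concentration bounds make the estimate close to the true top-$k$ sum with high probability, whereas actually *finding* a near-optimal subset still needs near-linear queries per the paper's result.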

Cite

Text

Peng and Rubinstein. "A near Linear Query Lower Bound for Submodular Maximization." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Peng and Rubinstein. "A near Linear Query Lower Bound for Submodular Maximization." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/peng2025icml-near/)

BibTeX

@inproceedings{peng2025icml-near,
  title     = {{A near Linear Query Lower Bound for Submodular Maximization}},
  author    = {Peng, Binghui and Rubinstein, Aviad},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {48852--48871},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/peng2025icml-near/}
}