Breaking Barriers: Combinatorial Algorithms for Non-Monotone Submodular Maximization with Sublinear Adaptivity and $1/e$ Approximation

Abstract

With the rapid growth of data in modern applications, parallel combinatorial algorithms for maximizing non-monotone submodular functions have gained significant attention. In the parallel computation setting, the state-of-the-art approximation ratio of $1/e$ is achieved by a continuous algorithm (Ene & Nguyen, 2020) with adaptivity $\mathcal{O}(\log n)$. In this work, we focus on size constraints and present the first combinatorial algorithm matching this bound: a randomized parallel approach achieving a $(1/e - \epsilon)$ approximation ratio. This result bridges the gap between continuous and combinatorial approaches for this problem. As a byproduct, we also develop a simpler $(1/4 - \epsilon)$-approximation algorithm that succeeds with high probability ($\ge 1 - 1/n$). Both algorithms achieve $\mathcal{O}(\log n \log k)$ adaptivity and $\mathcal{O}(n \log n \log k)$ query complexity. Empirical results show that our algorithms achieve competitive objective values, with the $(1/4 - \epsilon)$-approximation algorithm being particularly efficient in queries.
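For context, the problem is to maximize a non-monotone submodular set function $f$ subject to a size constraint $|S| \le k$, using only value-oracle queries. The sketch below is *not* the paper's parallel algorithm; it is a minimal illustration of the setting, pairing a toy cut function (a classic non-monotone submodular objective) with the classical sequential Random Greedy of Buchbinder et al. (2014), which attains $1/e$ in expectation but requires $\mathcal{O}(k)$ adaptive rounds, the bottleneck that the paper's $\mathcal{O}(\log n \log k)$-adaptive algorithms address. All function and variable names here are illustrative.

```python
import random

def cut_value(graph, S):
    """Total weight of edges crossing between S and its complement:
    a classic non-monotone submodular function (a toy objective,
    not one from the paper)."""
    S = set(S)
    return sum(w for (u, v), w in graph.items() if (u in S) != (v in S))

def random_greedy(f, ground_set, k, rng=random):
    """Sequential Random Greedy (Buchbinder et al., 2014): a 1/e
    approximation in expectation for non-monotone submodular
    maximization under |S| <= k, but with O(k) adaptive rounds,
    the sequential baseline that parallel algorithms improve on."""
    S = set()
    for _ in range(k):
        # One adaptive round: query the marginal gain of every element.
        gains = {u: f(S | {u}) - f(S) for u in ground_set - S}
        # Top-k elements with positive marginal gain, padded with
        # "dummy" elements (None, zero gain) up to size k.
        top = sorted((u for u in gains if gains[u] > 0),
                     key=gains.get, reverse=True)[:k]
        top += [None] * (k - len(top))
        u = rng.choice(top)   # uniform draw over the padded top-k
        if u is not None:     # a dummy draw adds nothing this round
            S.add(u)
    return S

if __name__ == "__main__":
    # Toy weighted graph as {edge: weight}; maximize the cut with |S| <= 2.
    graph = {("a", "b"): 3.0, ("b", "c"): 2.0,
             ("a", "c"): 1.0, ("c", "d"): 4.0}
    V = {"a", "b", "c", "d"}
    S = random_greedy(lambda T: cut_value(graph, T), V, k=2,
                      rng=random.Random(0))
    print(S, cut_value(graph, S))
```

Each iteration issues all its oracle queries in one batch, so the adaptivity of this baseline equals the number of iterations, $k$; the paper's contribution is reducing the number of such sequential rounds to polylogarithmic while keeping the $1/e - \epsilon$ guarantee with a purely combinatorial method.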

Cite

Text

Chen et al. "Breaking Barriers: Combinatorial Algorithms for Non-Monotone Submodular Maximization with Sublinear Adaptivity and $1/e$ Approximation." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Chen et al. "Breaking Barriers: Combinatorial Algorithms for Non-Monotone Submodular Maximization with Sublinear Adaptivity and $1/e$ Approximation." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/chen2025icml-breaking/)

BibTeX

@inproceedings{chen2025icml-breaking,
  title     = {{Breaking Barriers: Combinatorial Algorithms for Non-Monotone Submodular Maximization with Sublinear Adaptivity and $1/e$ Approximation}},
  author    = {Chen, Yixin and Chen, Wenjing and Kuhnle, Alan},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {7840--7879},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/chen2025icml-breaking/}
}