Practical $0.385$-Approximation for Submodular Maximization Subject to a Cardinality Constraint
Abstract
Non-monotone constrained submodular maximization plays a crucial role in various machine learning applications. However, existing algorithms often struggle with a trade-off between approximation guarantees and practical efficiency. The current state-of-the-art is a recent $0.401$-approximation algorithm, but its computational complexity makes it highly impractical; the best practical algorithms for the problem guarantee only a $1/e$-approximation. In this work, we present a novel algorithm for submodular maximization subject to a cardinality constraint that combines a $0.385$-approximation guarantee with a low, practical query complexity of $O(n+k^2)$. Furthermore, we evaluate our algorithm's performance through extensive experiments on machine learning applications, including Movie Recommendation, Image Summarization, and more. These evaluations demonstrate the efficacy of our approach.
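The abstract contrasts the new $0.385$-approximation with the $1/e$-approximation achieved by the best previously practical algorithms. As a point of reference only, the sketch below illustrates the problem interface through one such $1/e$ baseline, a random-greedy heuristic; it is not the paper's algorithm, and the value oracle `f`, the function name `random_greedy`, and the toy coverage objective are assumptions made for illustration.

```python
import random

def random_greedy(f, ground_set, k):
    """Random-greedy baseline for non-monotone submodular maximization subject
    to a cardinality constraint |S| <= k. It guarantees a 1/e-approximation and
    uses O(nk) value-oracle queries; it is NOT the paper's 0.385-approximation
    algorithm, only an illustration of the problem setup.

    f          -- value oracle: maps a Python set of elements to a non-negative number
    ground_set -- iterable of elements
    k          -- cardinality constraint
    """
    S = set()
    remaining = set(ground_set)
    for _ in range(k):
        if not remaining:
            break
        # Marginal gain of each remaining element with respect to the current S.
        gains = {e: f(S | {e}) - f(S) for e in remaining}
        # Top-k candidates by marginal gain; slots with non-positive gain play
        # the role of the algorithm's zero-value "dummy" elements.
        top = sorted(gains, key=gains.get, reverse=True)[:k]
        e = random.choice(top)
        if gains[e] > 0:
            S.add(e)
            remaining.discard(e)
    return S


if __name__ == "__main__":
    # Toy coverage objective (illustrative only): f(S) = number of covered items.
    sets = [{0, 1}, {1, 2}, {2, 3}, {3, 4}, {4, 0}]
    f = lambda S: len(set().union(*(sets[i] for i in S))) if S else 0
    print(random_greedy(f, range(len(sets)), k=3))
```

Each iteration of this sketch evaluates the oracle on every remaining element, so it spends $O(nk)$ queries overall, which is exactly the kind of cost the paper's $O(n+k^2)$ query complexity aims to improve on for large $n$.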
Cite
Text
Tukan et al. "Practical $0.385$-Approximation for Submodular Maximization Subject to a Cardinality Constraint." Neural Information Processing Systems, 2024. doi:10.52202/079017-1621
Markdown
[Tukan et al. "Practical $0.385$-Approximation for Submodular Maximization Subject to a Cardinality Constraint." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/tukan2024neurips-practical/) doi:10.52202/079017-1621
BibTeX
@inproceedings{tukan2024neurips-practical,
title = {{Practical $0.385$-Approximation for Submodular Maximization Subject to a Cardinality Constraint}},
author = {Tukan, Murad and Mualem, Loay and Feldman, Moran},
booktitle = {Neural Information Processing Systems},
year = {2024},
doi = {10.52202/079017-1621},
url = {https://mlanthology.org/neurips/2024/tukan2024neurips-practical/}
}