Cardinality Constrained Submodular Maximization for Random Streams

Abstract

We consider the problem of maximizing submodular functions in single-pass streaming and secretary-with-shortlists models, both with random arrival order. For cardinality-constrained monotone functions, Agrawal, Shadravan, and Stein [SMC19] gave a single-pass $(1-1/e-\varepsilon)$-approximation algorithm using only linear memory, but their exponential dependence on $\varepsilon$ makes it impractical even for $\varepsilon=0.1$. We simplify both the algorithm and the analysis, obtaining an exponential improvement in the $\varepsilon$-dependence (in particular, $O(k/\varepsilon)$ memory). Extending these techniques, we also give a simple $(1/e-\varepsilon)$-approximation for non-monotone functions in $O(k/\varepsilon)$ memory. For the monotone case, we also give a corresponding unconditional hardness barrier of $1-1/e+\varepsilon$ for single-pass algorithms in randomly ordered streams, even assuming unlimited computation. Finally, we show that the algorithms are simple to implement and work well on real-world datasets.
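For readers unfamiliar with the problem setting, here is a minimal illustration of cardinality-constrained monotone submodular maximization. This is not the paper's streaming algorithm; it is the classic offline greedy baseline (which achieves the same $1-1/e$ factor the paper matches in the streaming model), run on a set-coverage objective. The function `coverage` and the example sets are hypothetical choices for illustration.

```python
# Illustrative sketch only: offline greedy for max f(S) subject to |S| <= k,
# where f is the (monotone submodular) coverage function. The paper's
# contribution is matching the 1 - 1/e guarantee in a single random-order
# streaming pass with O(k/eps) memory; this baseline sees all data at once.

def coverage(selected, sets):
    """f(S) = number of ground elements covered by the chosen sets."""
    covered = set()
    for i in selected:
        covered |= sets[i]
    return len(covered)

def greedy(sets, k):
    """Pick up to k sets, each time taking the largest marginal gain."""
    chosen = []
    for _ in range(k):
        base = coverage(chosen, sets)
        best, best_gain = None, 0
        for i in range(len(sets)):
            if i in chosen:
                continue
            gain = coverage(chosen + [i], sets) - base
            if gain > best_gain:
                best, best_gain = i, gain
        if best is None:  # no remaining set adds value
            break
        chosen.append(best)
    return chosen

sets = [{1, 2, 3}, {3, 4}, {4, 5, 6}, {1, 6}]
print(greedy(sets, 2))  # -> [0, 2], covering all of {1, ..., 6}
```

A streaming algorithm must instead decide on each arriving set with limited memory; the random arrival order is what makes the stronger guarantee possible.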

Cite

Text

Liu et al. "Cardinality Constrained Submodular Maximization for Random Streams." Neural Information Processing Systems, 2021.

Markdown

[Liu et al. "Cardinality Constrained Submodular Maximization for Random Streams." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/liu2021neurips-cardinality/)

BibTeX

@inproceedings{liu2021neurips-cardinality,
  title     = {{Cardinality Constrained Submodular Maximization for Random Streams}},
  author    = {Liu, Paul and Rubinstein, Aviad and Vondrak, Jan and Zhao, Junyao},
  booktitle = {Neural Information Processing Systems},
  year      = {2021},
  url       = {https://mlanthology.org/neurips/2021/liu2021neurips-cardinality/}
}