Scaling Submodular Maximization via Pruned Submodularity Graphs

Abstract

We propose a new random pruning method, called "submodular sparsification (SS)", to reduce the cost of submodular maximization. The pruning is applied via a "submodularity graph" over the $n$ ground elements, where each directed edge is associated with a pairwise dependency defined by the submodular function. In each step, SS prunes a $1-1/\sqrt{c}$ fraction of the nodes (for $c>1$), using edge weights computed from only a small number ($O(\log n)$) of randomly sampled nodes. The algorithm requires $\log_{\sqrt{c}}n$ steps, each with a small and highly parallelizable computation. Setting the accuracy-speed tradeoff parameter to $c = 8$, for example, yields a fast shrink rate of $\sqrt{2}/4$ and a small iteration complexity of $\log_{2\sqrt{2}}n$. Analysis shows that, with high probability, the greedy algorithm run on the pruned set of size $O(\log^2 n)$ achieves a guarantee similar to that of processing the original dataset. In news and video summarization tasks, SS substantially reduces both computational cost and memory usage while maintaining (or even slightly exceeding) the quality of the original, and much more costly, greedy algorithm.
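
To make the per-step pruning concrete, below is a minimal Python sketch of an SS-style loop, grounded only in the abstract above: sample $O(\log n)$ probe nodes, score every remaining node against the probes via submodularity-graph edge weights, and keep a $1/\sqrt{c}$ fraction per step until roughly $O(\log^2 n)$ nodes survive. The specific edge weight $w(u \to v) = f(v \mid u) - f(u \mid V \setminus \{u\})$, the min-over-probes scoring rule, and all function names are illustrative assumptions, not the authors' reference implementation.

import math
import random

def submodular_sparsify(f, V, c=8.0, seed=0):
    """Sketch of SS-style pruning; f maps a Python set to a real value."""
    rng = random.Random(seed)
    ground = set(V)
    survivors = list(V)
    keep_frac = 1.0 / math.sqrt(c)                 # a 1/sqrt(c) fraction survives each step
    target = max(1, int(math.log(len(survivors)) ** 2))  # stop near O(log^2 n) nodes
    f_ground = f(ground)

    while len(survivors) > target:
        k = max(1, int(math.log(len(survivors))))  # O(log n) random probes
        probes = rng.sample(survivors, k)
        # f(u | V \ {u}): marginal gain of probe u on top of everything else.
        tail_gain = {u: f_ground - f(ground - {u}) for u in probes}

        def score(v):
            # Assumed edge weight w(u -> v) = f(v | u) - f(u | V \ {u}); a small
            # minimum over probes means v is nearly redundant given some probe.
            return min(f({u, v}) - f({u}) - tail_gain[u] for u in probes)

        rest = [v for v in survivors if v not in set(probes)]
        rest.sort(key=score, reverse=True)         # most "novel" nodes first
        n_keep = int(keep_frac * len(survivors))
        survivors = probes + rest[:max(0, n_keep - len(probes))]
    return survivors

# Toy usage with a hypothetical coverage function over random subsets;
# the standard greedy algorithm would then run on the pruned survivors.
sets = {i: set(random.Random(i).sample(range(200), 20)) for i in range(1000)}
cover = lambda S: len(set().union(*(sets[i] for i in S))) if S else 0
pruned = submodular_sparsify(cover, sets.keys(), c=8.0)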

Cite

Text

Zhou et al. "Scaling Submodular Maximization via Pruned Submodularity Graphs." International Conference on Artificial Intelligence and Statistics, 2017.

Markdown

[Zhou et al. "Scaling Submodular Maximization via Pruned Submodularity Graphs." International Conference on Artificial Intelligence and Statistics, 2017.](https://mlanthology.org/aistats/2017/zhou2017aistats-scaling/)

BibTeX

@inproceedings{zhou2017aistats-scaling,
  title     = {{Scaling Submodular Maximization via Pruned Submodularity Graphs}},
  author    = {Zhou, Tianyi and Ouyang, Hua and Bilmes, Jeff A. and Chang, Yi and Guestrin, Carlos},
  booktitle = {International Conference on Artificial Intelligence and Statistics},
  year      = {2017},
  pages     = {316--324},
  url       = {https://mlanthology.org/aistats/2017/zhou2017aistats-scaling/}
}