Improved Approximation Algorithms for $k$-Submodular Maximization via Multilinear Extension

Abstract

We investigate a generalization of submodular maximization, known as $k$-submodular maximization, with applications in social networks and machine learning. In this work, we propose the multilinear extension of $k$-submodular functions and a unified Frank-Wolfe-type framework built on it. This continuous framework accommodates (1) monotone and non-monotone functions, and (2) various constraint types, including matroid constraints, knapsack constraints, and their combinations. Notably, we attain an asymptotically optimal $1/2$-approximation for monotone $k$-submodular maximization under knapsack constraints, improving on previous $1/3$-approximation results, and a $1/3$-approximation for non-monotone $k$-submodular maximization under knapsack and matroid constraints, improving on previous $0.245$-approximation results. Our analysis rests on new insights into specific linear and monotone properties of the multilinear extension.
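
For context, the following is a minimal sketch of what a multilinear extension of a $k$-submodular function can look like, generalizing the classical submodular case; the notation here is ours for illustration and may differ from the paper's exact formulation. A $k$-submodular function assigns each element of a ground set $V$ one of $k$ labels or leaves it unlabeled, $f : (k+1)^V \to \mathbb{R}$, and a natural extension evaluates $f$ in expectation under independent random labeling:

$$F(x) \;=\; \mathbb{E}\big[f(X)\big], \qquad \Pr[X_e = i] = x_{e,i}, \quad \Pr[X_e = 0] = 1 - \sum_{i=1}^{k} x_{e,i},$$

where $x \in \Delta^V$ with $\Delta = \{\, p \in [0,1]^k : \sum_{i=1}^{k} p_i \le 1 \,\}$, each element $e \in V$ is labeled independently, $X_e = i$ means $e$ receives label $i$, and $X_e = 0$ means $e$ is left unlabeled. For $k = 1$ this recovers the standard multilinear extension of a submodular function.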

Cite

Text

Zhou et al. "Improved Approximation Algorithms for $k$-Submodular Maximization via Multilinear Extension." International Conference on Learning Representations, 2025.

Markdown

[Zhou et al. "Improved Approximation Algorithms for $k$-Submodular Maximization via Multilinear Extension." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/zhou2025iclr-improved/)

BibTeX

@inproceedings{zhou2025iclr-improved,
  title     = {{Improved Approximation Algorithms for $k$-Submodular Maximization via Multilinear Extension}},
  author    = {Zhou, Huanjian and Huang, Lingxiao and Wang, Baoxiang},
  booktitle = {International Conference on Learning Representations},
  year      = {2025},
  url       = {https://mlanthology.org/iclr/2025/zhou2025iclr-improved/}
}