Extended Deep Submodular Functions
Abstract
We introduce a novel representation of monotone set functions called Extended Deep Submodular Functions (EDSFs), which are neural-network-representable. EDSFs serve as an extension of Deep Submodular Functions (DSFs), inheriting crucial properties from DSFs while addressing their innate limitations. It is known that DSFs can represent only a limited subset of submodular functions. In contrast, we establish that EDSFs can represent all monotone submodular functions, a notable enhancement over DSFs. Furthermore, we show that EDSFs can represent any monotone set function, indicating that the family of EDSFs is equivalent to the family of all monotone set functions. Additionally, we prove that EDSFs maintain the concavity inherent in DSFs when the components of the input vector are non-negative real numbers, an essential feature in certain combinatorial optimization problems. Through extensive experiments, we demonstrate that EDSFs achieve significantly lower empirical generalization error than existing baselines, such as DSFs, Deep Sets, and Set Transformers, when representing and learning coverage and cut functions.
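As background for the coverage functions mentioned above, the following sketch (not from the paper; the ground set and element choices are hypothetical) shows why a coverage function is monotone submodular: its marginal gains diminish as the selected set grows, i.e. f(S ∪ {e}) − f(S) ≥ f(T ∪ {e}) − f(T) for all S ⊆ T and e ∉ T.

```python
def coverage(universe_sets, S):
    """Coverage function: size of the union of the sets indexed by S."""
    covered = set()
    for i in S:
        covered |= universe_sets[i]
    return len(covered)

# Hypothetical ground set: four subsets of {0, ..., 5}.
sets = [{0, 1, 2}, {1, 2, 3}, {3, 4}, {4, 5}]

S = frozenset({0})
T = frozenset({0, 1})  # S is a subset of T
e = 2                  # candidate index, not in T

# Marginal gain of adding e to the smaller vs. the larger set.
gain_S = coverage(sets, S | {e}) - coverage(sets, S)  # 5 - 3 = 2
gain_T = coverage(sets, T | {e}) - coverage(sets, T)  # 5 - 4 = 1
assert gain_S >= gain_T  # diminishing returns: submodularity holds here
```

This checks a single pair (S, T, e); submodularity requires the inequality for all such pairs, which holds for any coverage function.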
Cite
Text
Hosseini et al. "Extended Deep Submodular Functions." Transactions on Machine Learning Research, 2024.
Markdown
[Hosseini et al. "Extended Deep Submodular Functions." Transactions on Machine Learning Research, 2024.](https://mlanthology.org/tmlr/2024/hosseini2024tmlr-extended/)
BibTeX
@article{hosseini2024tmlr-extended,
  title = {{Extended Deep Submodular Functions}},
  author = {Hosseini, Seyed Mohammad and Jamshidi, Arash and Noormousavi, Seyed Mahdi and Siavoshani, Mahdi and Omidvar, Naeimeh},
  journal = {Transactions on Machine Learning Research},
  year = {2024},
  url = {https://mlanthology.org/tmlr/2024/hosseini2024tmlr-extended/}
}