K-Support and Ordered Weighted Sparsity for Overlapping Groups: Hardness and Algorithms
Abstract
The k-support and OWL norms generalize the l1 norm, providing improved prediction accuracy and better handling of correlated variables. We study the norms obtained by extending the k-support norm and OWL norms to the setting in which there are overlapping groups. The resulting norms are in general NP-hard to compute, but they are tractable for certain collections of groups. To demonstrate this fact, we develop a dynamic program for the problem of projecting onto the set of vectors supported by a fixed number of groups. Our dynamic program uses tree decompositions, and its complexity scales with the treewidth. This program can be converted to an extended formulation which, for the associated group structure, models the k-group support norms and an overlapping-group variant of the ordered weighted l1 norm. Numerical results demonstrate the efficacy of the new penalties.
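To make the projection problem concrete: when the groups do not overlap, projecting onto the set of vectors supported by at most k groups is easy, since one simply keeps the k groups that capture the most energy. (The paper's contribution is the overlapping case, where this greedy choice no longer works and the tree-decomposition dynamic program is needed.) Below is a minimal sketch of the non-overlapping special case; the function name and interface are illustrative, not from the paper.

```python
import numpy as np

def project_k_group_support(x, groups, k):
    """Euclidean projection of x onto vectors supported by at most k
    of the given DISJOINT groups. This is the tractable special case;
    overlapping groups require the paper's dynamic program."""
    # Squared norm of x restricted to each group.
    energies = [np.sum(x[g] ** 2) for g in groups]
    # Keep the k groups with the largest energy; zero out the rest.
    keep = np.argsort(energies)[-k:]
    z = np.zeros_like(x)
    for i in keep:
        idx = list(groups[i])
        z[idx] = x[idx]
    return z
```

For disjoint groups this greedy rule is exact because the squared error of dropping a group is exactly its energy, so dropping the smallest ones is optimal.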
Cite
Text
Lim and Wright. "K-Support and Ordered Weighted Sparsity for Overlapping Groups: Hardness and Algorithms." Neural Information Processing Systems, 2017.
Markdown
[Lim and Wright. "K-Support and Ordered Weighted Sparsity for Overlapping Groups: Hardness and Algorithms." Neural Information Processing Systems, 2017.](https://mlanthology.org/neurips/2017/lim2017neurips-ksupport/)
BibTeX
@inproceedings{lim2017neurips-ksupport,
  title     = {{K-Support and Ordered Weighted Sparsity for Overlapping Groups: Hardness and Algorithms}},
  author    = {Lim, Cong Han and Wright, Stephen},
  booktitle = {Neural Information Processing Systems},
  year      = {2017},
  pages     = {284--292},
  url       = {https://mlanthology.org/neurips/2017/lim2017neurips-ksupport/}
}