Exclusive Feature Learning on Arbitrary Structures via $\ell_{1,2}$-Norm

Abstract

Group lasso is widely used to enforce structural sparsity, achieving sparsity at the inter-group level. In this paper, we propose a new formulation called ``exclusive group lasso'', which induces sparsity at the intra-group level in the context of feature selection. The proposed exclusive group lasso applies to arbitrary feature structures, whether the groups overlap or not. We analyze the properties of the exclusive group lasso and propose an effective iteratively re-weighted algorithm, with a rigorous convergence analysis, to solve the corresponding optimization problem. We then show an application of the exclusive group lasso to uncorrelated feature selection. Extensive experiments on both synthetic and real-world datasets demonstrate the good performance of the proposed methods.
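
The regularizer behind the $\ell_{1,2}$-norm formulation is $\Omega(w) = \sum_g \|w_g\|_1^2$: the $\ell_1$ norm within each group, squared and summed across groups, so that features inside a group compete and only a few per group stay nonzero. Below is a minimal NumPy sketch of an iteratively re-weighted scheme of the kind the abstract describes, applied to square-loss regression; the function name, the closed-form ridge-like update, and all parameter choices are illustrative assumptions, not code from the paper.

```python
import numpy as np

def exclusive_group_lasso(X, y, groups, alpha=1.0, n_iter=100, eps=1e-8):
    """Sketch of an iteratively re-weighted solver for

        min_w  ||X w - y||^2 + alpha * sum_g ||w_g||_1^2,

    where `groups` is a list of (possibly overlapping) index arrays.
    """
    n, d = X.shape
    w = np.linalg.lstsq(X, y, rcond=None)[0]      # least-squares warm start
    XtX, Xty = X.T @ X, X.T @ y
    for _ in range(n_iter):
        # Re-weighting: D_ii = sum over groups containing i of ||w_g||_1,
        # divided by |w_i|; eps keeps the division finite as w_i -> 0.
        diag = np.zeros(d)
        for g in groups:
            diag[g] += np.abs(w[g]).sum()
        diag /= np.abs(w) + eps
        # With D fixed, the penalty is quadratic, so the update is a
        # ridge-like linear system solved in closed form.
        w_new = np.linalg.solve(XtX + alpha * np.diag(diag), Xty)
        if np.max(np.abs(w_new - w)) < 1e-6:      # simple convergence check
            return w_new
        w = w_new
    return w

# Tiny demo: two non-overlapping groups of three features each, with one
# truly active feature per group.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 6))
w_true = np.array([2.0, 0.0, 0.0, -1.5, 0.0, 0.0])
y = X @ w_true + 0.01 * rng.standard_normal(50)
groups = [np.arange(0, 3), np.arange(3, 6)]
print(exclusive_group_lasso(X, y, groups, alpha=5.0).round(2))
```

In the demo, increasing `alpha` drives all but a few coordinates in each group toward zero, which is the intra-group sparsity the abstract refers to.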

Cite

Text

Kong et al. "Exclusive Feature Learning on Arbitrary Structures via $\ell_{1,2}$-Norm." Neural Information Processing Systems, 2014.

Markdown

[Kong et al. "Exclusive Feature Learning on Arbitrary Structures via $\ell_{1,2}$-Norm." Neural Information Processing Systems, 2014.](https://mlanthology.org/neurips/2014/kong2014neurips-exclusive/)

BibTeX

@inproceedings{kong2014neurips-exclusive,
  title     = {{Exclusive Feature Learning on Arbitrary Structures via $\ell_{1,2}$-Norm}},
  author    = {Kong, Deguang and Fujimaki, Ryohei and Liu, Ji and Nie, Feiping and Ding, Chris},
  booktitle = {Neural Information Processing Systems},
  year      = {2014},
  pages     = {1655--1663},
  url       = {https://mlanthology.org/neurips/2014/kong2014neurips-exclusive/}
}