Structured Sparsity and Generalization

Abstract

We present a data-dependent generalization bound for a large class of regularized algorithms that implement structured sparsity constraints. The bound applies to standard squared-norm regularization, the Lasso, the group Lasso, some versions of the group Lasso with overlapping groups, multiple kernel learning, and other regularization schemes. In all these cases, competitive results are obtained. A novel feature of our bound is that it can be applied in an infinite-dimensional setting, such as the Lasso in a separable Hilbert space or multiple kernel learning with a countable number of kernels.
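For readers unfamiliar with the regularizers named above, the following sketch (illustrative only, not taken from the paper) contrasts the Lasso penalty, which promotes element-wise sparsity, with the group Lasso penalty, which promotes sparsity at the level of predefined coefficient groups:

```python
import numpy as np

def lasso_penalty(w, lam):
    # Lasso: lam * ||w||_1, encouraging individual coefficients to be zero.
    return lam * np.sum(np.abs(w))

def group_lasso_penalty(w, groups, lam):
    # Group Lasso: lam * sum_g ||w_g||_2 over a partition of the indices,
    # encouraging entire groups of coefficients to be zero together.
    return lam * sum(np.linalg.norm(w[g]) for g in groups)

# Example weight vector with one inactive and one active group.
w = np.array([0.0, 0.0, 3.0, 4.0])
groups = [[0, 1], [2, 3]]  # two non-overlapping groups

print(lasso_penalty(w, 1.0))               # 7.0
print(group_lasso_penalty(w, groups, 1.0)) # 5.0 (only the second group contributes)
```

With overlapping groups, as considered in the paper, an index may appear in several groups, and the same sum-of-group-norms formula applies over the overlapping collection.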

Cite

Text

Maurer and Pontil. "Structured Sparsity and Generalization." Journal of Machine Learning Research, 2012.

Markdown

[Maurer and Pontil. "Structured Sparsity and Generalization." Journal of Machine Learning Research, 2012.](https://mlanthology.org/jmlr/2012/maurer2012jmlr-structured/)

BibTeX

@article{maurer2012jmlr-structured,
  title     = {{Structured Sparsity and Generalization}},
  author    = {Maurer, Andreas and Pontil, Massimiliano},
  journal   = {Journal of Machine Learning Research},
  year      = {2012},
  pages     = {671--690},
  volume    = {13},
  url       = {https://mlanthology.org/jmlr/2012/maurer2012jmlr-structured/}
}