Computing Regularization Paths for Learning Multiple Kernels
Abstract
The problem of learning a sparse conic combination of kernel functions or kernel matrices for classification or regression can be achieved via the regularization by a block 1-norm [1]. In this paper, we present an algorithm that computes the entire regularization path for these problems. The path is obtained by using numerical continuation techniques, and involves a running time complexity that is a constant times the complexity of solving the problem for one value of the regularization parameter. Working in the setting of kernel linear regression and kernel logistic regression, we show empirically that the effect of the block 1-norm regularization differs notably from the (non-block) 1-norm regularization commonly used for variable selection, and that the regularization path is of particular value in the block case.
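The contrast the abstract draws between block 1-norm and ordinary 1-norm regularization can be illustrated with their proximal (shrinkage) operators: the 1-norm shrinks each coefficient independently, while the block 1-norm (a group-lasso-style penalty) shrinks the Euclidean norm of a whole block, so all coefficients attached to one kernel vanish together. The sketch below is a minimal illustration of that distinction, not the paper's path-following algorithm; the function names and the regularization level `lam` are our own choices.

```python
import numpy as np

def soft_threshold(v, lam):
    # 1-norm prox: shrinks each coordinate independently toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def block_soft_threshold(v, lam):
    # Block 1-norm prox: shrinks the block's Euclidean norm, so the
    # whole block of coefficients is either kept or zeroed together.
    norm = np.linalg.norm(v)
    if norm <= lam:
        return np.zeros_like(v)
    return (1.0 - lam / norm) * v

# Two coefficients belonging to one kernel block (illustrative values).
v = np.array([0.3, 2.0])
# Coordinate-wise prox zeroes the small entry but keeps the large one:
print(soft_threshold(v, 0.5))        # -> [0.  1.5]
# The block prox keeps both, since the block norm (~2.02) exceeds lam:
print(block_soft_threshold(v, 0.5))
# With a large enough lam the entire block is zeroed at once:
print(block_soft_threshold(v, 2.5))  # -> [0. 0.]
```

Sweeping `lam` through a grid of values with the block operator gives a crude, discretized picture of the path that the paper's continuation method instead tracks exactly at kink points.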
Cite
Text
Bach et al. "Computing Regularization Paths for Learning Multiple Kernels." Neural Information Processing Systems, 2004.

Markdown

[Bach et al. "Computing Regularization Paths for Learning Multiple Kernels." Neural Information Processing Systems, 2004.](https://mlanthology.org/neurips/2004/bach2004neurips-computing/)

BibTeX
@inproceedings{bach2004neurips-computing,
title = {{Computing Regularization Paths for Learning Multiple Kernels}},
author = {Bach, Francis R. and Thibaux, Romain and Jordan, Michael I.},
booktitle = {Neural Information Processing Systems},
year = {2004},
pages = {73-80},
url = {https://mlanthology.org/neurips/2004/bach2004neurips-computing/}
}