The Local Rademacher Complexity of Lp-Norm Multiple Kernel Learning
Abstract
We derive an upper bound on the local Rademacher complexity of Lp-norm multiple kernel learning, which yields a tighter excess risk bound than global approaches. Previous local approaches analyzed the case p=1 only, while our analysis covers all cases $1\leq p\leq\infty$, assuming that the feature mappings corresponding to the different kernels are uncorrelated. We also give a lower bound showing that our bound is tight, and derive consequences regarding the excess loss, namely fast convergence rates of the order $O(n^{-\frac{\alpha}{1+\alpha}})$, where $\alpha$ is the minimum eigenvalue decay rate of the individual kernels.
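To make the stated rate concrete, here is a worked instantiation of the formula above (not a result quoted verbatim from the paper; the notation $\lambda_j^{(m)}$ for the $j$-th eigenvalue of the $m$-th kernel is assumed here). If each kernel's spectrum decays polynomially, $\lambda_j^{(m)} = O(j^{-\alpha})$, the excess-loss rate is

$$O\!\left(n^{-\frac{\alpha}{1+\alpha}}\right), \qquad \text{e.g. } \alpha = 2 \;\Rightarrow\; O\!\left(n^{-2/3}\right), \qquad \alpha \to \infty \;\Rightarrow\; O\!\left(n^{-1}\right),$$

which improves on the $O(n^{-1/2})$ rate available from global Rademacher complexity analyses.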
Cite
Text
Kloft and Blanchard. "The Local Rademacher Complexity of Lp-Norm Multiple Kernel Learning." Neural Information Processing Systems, 2011.
Markdown
[Kloft and Blanchard. "The Local Rademacher Complexity of Lp-Norm Multiple Kernel Learning." Neural Information Processing Systems, 2011.](https://mlanthology.org/neurips/2011/kloft2011neurips-local/)
BibTeX
@inproceedings{kloft2011neurips-local,
title = {{The Local Rademacher Complexity of Lp-Norm Multiple Kernel Learning}},
author = {Kloft, Marius and Blanchard, Gilles},
booktitle = {Neural Information Processing Systems},
year = {2011},
pages = {2438--2446},
url = {https://mlanthology.org/neurips/2011/kloft2011neurips-local/}
}