Fast Kernel Methods for Generic Lipschitz Losses via $p$-Sparsified Sketches
Abstract
Kernel methods are learning algorithms that enjoy solid theoretical foundations but suffer from significant computational limitations. Sketching, which consists in searching for solutions in a subspace of reduced dimension, is a well-studied approach to alleviate these computational burdens. However, statistically accurate sketches, such as the Gaussian one, usually contain few null entries, so that applying them to kernel methods and their non-sparse Gram matrices remains slow in practice. In this paper, we show that sparsified Gaussian (and Rademacher) sketches still produce theoretically valid approximations while allowing for substantial time and space savings thanks to an efficient \emph{decomposition trick}. To support our method, we derive excess risk bounds for both single and multiple output kernel problems, with generic Lipschitz losses, thereby providing new guarantees for a wide range of applications, from robust regression to multiple quantile regression. Our theoretical results are complemented with experiments showing the empirical superiority of our approach over state-of-the-art sketching methods.
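To make the construction concrete, here is a minimal NumPy sketch (not the authors' code) of a $p$-sparsified Gaussian sketch and of the core idea behind the decomposition trick. It assumes the common convention $S_{ij} = b_{ij} g_{ij} / \sqrt{sp}$ with $b_{ij} \sim \mathrm{Bernoulli}(p)$ and $g_{ij} \sim \mathcal{N}(0,1)$, so that $\mathbb{E}[S^\top S] = I_n$; the function names and exact scaling are illustrative assumptions, and the paper should be consulted for the precise convention.

```python
import numpy as np

def p_sparsified_gaussian_sketch(s, n, p, seed=None):
    """Hypothetical p-sparsified Gaussian sketch: each entry is a
    Bernoulli(p) mask times a standard Gaussian, scaled by 1/sqrt(s*p)
    so that E[S.T @ S] = I_n (assumed normalization)."""
    rng = np.random.default_rng(seed)
    mask = rng.random((s, n)) < p        # Bernoulli(p) support pattern
    gauss = rng.standard_normal((s, n))  # i.i.d. N(0, 1) values
    return (mask * gauss) / np.sqrt(s * p)

def sketched_gram(S, X, kernel):
    """Core of the decomposition idea: zero columns of S never touch K,
    so S K S.T only needs the kernel block indexed by the support."""
    idx = np.flatnonzero(np.any(S != 0, axis=0))  # columns hit by S
    K_sub = kernel(X[idx], X[idx])                # |idx| x |idx| block
    S_sub = S[:, idx]
    return S_sub @ K_sub @ S_sub.T                # s x s sketched Gram

# Usage with a Gaussian (RBF) kernel on toy data
def rbf(A, B, gamma=0.1):
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

X = np.random.default_rng(0).standard_normal((2000, 5))
S = p_sparsified_gaussian_sketch(s=30, n=2000, p=0.01, seed=1)
SKS = sketched_gram(S, X, rbf)  # never forms the full 2000 x 2000 Gram
```

Since every column of $S$ outside the support is zero, $S K S^\top$ only requires the kernel block indexed by the support, which contains at most $snp$ columns in expectation; with small $p$ this block is far smaller than the full $n \times n$ Gram matrix, which is where the time and space savings come from.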
Cite
Text
El Ahmad et al. "Fast Kernel Methods for Generic Lipschitz Losses via $p$-Sparsified Sketches." Transactions on Machine Learning Research, 2023.

Markdown

[El Ahmad et al. "Fast Kernel Methods for Generic Lipschitz Losses via $p$-Sparsified Sketches." Transactions on Machine Learning Research, 2023.](https://mlanthology.org/tmlr/2023/ahmad2023tmlr-fast/)

BibTeX
@article{ahmad2023tmlr-fast,
title = {{Fast Kernel Methods for Generic Lipschitz Losses via $p$-Sparsified Sketches}},
author = {El Ahmad, Tamim and Laforgue, Pierre and d'Alché-Buc, Florence},
journal = {Transactions on Machine Learning Research},
year = {2023},
url = {https://mlanthology.org/tmlr/2023/ahmad2023tmlr-fast/}
}