Kernel Basis Pursuit
Abstract
Estimating a non-uniformly sampled function from a set of learning points is a classical regression problem. Kernel methods have been widely used in this context, but every problem leads to two major tasks: optimizing the kernel and setting the fitness-regularization compromise. This article presents a new method to estimate a function from noisy learning points in the context of RKHS (Reproducing Kernel Hilbert Space). We introduce the Kernel Basis Pursuit algorithm, which enables us to build an ℓ_1-regularized multiple-kernel estimator. The general idea is to decompose the function to learn over a sparse, optimal set of spanning functions. Our implementation relies on the Least Absolute Shrinkage and Selection Operator (LASSO) formulation and on the Least Angle Regression (LARS) solver. The computation of the full regularization path through the LARS will enable us to propose new adaptive criteria to find an optimal fitness-regularization compromise. Finally, we aim to propose a fast, parameter-free method to estimate non-uniformly sampled functions.
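The core idea of the abstract, decomposing the target function over a sparse set of kernel spanning functions chosen by an ℓ_1 penalty, can be illustrated with a minimal sketch. This is not the authors' implementation: it uses plain coordinate descent in place of the LARS solver, a toy non-uniformly sampled signal, and an illustrative two-bandwidth Gaussian dictionary; all names and parameter values below are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Non-uniformly sampled noisy learning points (illustrative data, not
# from the paper).
x = np.sort(rng.uniform(-1.0, 1.0, 60))
y = np.sinc(3 * x) + 0.05 * rng.standard_normal(60)

def gaussian_columns(x, centres, sigma):
    """Gaussian kernel columns centred on the learning points."""
    return np.exp(-(x[:, None] - centres[None, :]) ** 2 / (2 * sigma ** 2))

# "Multiple-kernel" dictionary: stack columns for two bandwidths, so the
# l1 penalty selects both the centres and the kernel width.
Phi = np.hstack([gaussian_columns(x, x, s) for s in (0.1, 0.3)])

def lasso_cd(Phi, y, lam, n_iter=500):
    """Minimise 0.5*||y - Phi b||^2 + lam*||b||_1 by coordinate descent
    (a stand-in for the LARS solver used in the paper)."""
    n, p = Phi.shape
    b = np.zeros(p)
    col_sq = (Phi ** 2).sum(axis=0)
    r = y.copy()                       # residual y - Phi @ b
    for _ in range(n_iter):
        for j in range(p):
            r += Phi[:, j] * b[j]      # remove coordinate j from the fit
            rho = Phi[:, j] @ r
            # Soft-thresholding update: coefficients below lam are zeroed,
            # which is what makes the estimator sparse.
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= Phi[:, j] * b[j]
    return b

beta = lasso_cd(Phi, y, lam=0.5)
print("active spanning functions:", np.count_nonzero(beta), "of", Phi.shape[1])
```

In the paper itself, LARS additionally yields the full regularization path (the solution for every value of the ℓ_1 penalty at once), which is what supports the adaptive model-selection criteria mentioned in the abstract; the sketch above solves for a single penalty value only.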
Cite
Text
Guigue et al. "Kernel Basis Pursuit." European Conference on Machine Learning, 2005. doi:10.1007/11564096_18
Markdown
[Guigue et al. "Kernel Basis Pursuit." European Conference on Machine Learning, 2005.](https://mlanthology.org/ecmlpkdd/2005/guigue2005ecml-kernel/) doi:10.1007/11564096_18
BibTeX
@inproceedings{guigue2005ecml-kernel,
title = {{Kernel Basis Pursuit}},
author = {Guigue, Vincent and Rakotomamonjy, Alain and Canu, Stéphane},
booktitle = {European Conference on Machine Learning},
year = {2005},
pages = {146--157},
doi = {10.1007/11564096_18},
url = {https://mlanthology.org/ecmlpkdd/2005/guigue2005ecml-kernel/}
}