On the Vgamma Dimension for Regression in Reproducing Kernel Hilbert Spaces
Abstract
This paper presents a computation of the V_γ dimension for regression in bounded subspaces of Reproducing Kernel Hilbert Spaces (RKHS) for the Support Vector Machine (SVM) regression ε-insensitive loss function L_ε and for general L_p loss functions. The V_γ dimension is shown to be finite, which in turn proves uniform convergence in probability for regression machines in RKHS subspaces that use the L_ε or general L_p loss functions. The paper gives a novel proof of this result. It also computes an upper bound on the V_γ dimension under certain conditions, which leads to an approach for estimating the empirical V_γ dimension from a set of training data.
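For reference, the two loss functions named in the abstract are standardly defined as follows; this sketch follows the common SVM regression literature and is not taken from the paper itself:

```latex
% epsilon-insensitive loss used in SVM regression:
% deviations smaller than epsilon incur no penalty,
% larger deviations are penalized linearly.
L_{\epsilon}\bigl(y, f(x)\bigr) \;=\; \max\bigl(0,\; |y - f(x)| - \epsilon\bigr)

% general L_p loss: every deviation is penalized,
% raised to the power p.
L_{p}\bigl(y, f(x)\bigr) \;=\; |y - f(x)|^{p}
```

Note that L_ε reduces to the L_1 loss when ε = 0, which is why results for the ε-insensitive case relate naturally to the general L_p case.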
Cite
Text
Evgeniou and Pontil. "On the Vgamma Dimension for Regression in Reproducing Kernel Hilbert Spaces." International Conference on Algorithmic Learning Theory, 1999. doi:10.1007/3-540-46769-6_9
Markdown
[Evgeniou and Pontil. "On the Vgamma Dimension for Regression in Reproducing Kernel Hilbert Spaces." International Conference on Algorithmic Learning Theory, 1999.](https://mlanthology.org/alt/1999/evgeniou1999alt-vgamma/) doi:10.1007/3-540-46769-6_9
BibTeX
@inproceedings{evgeniou1999alt-vgamma,
title = {{On the Vgamma Dimension for Regression in Reproducing Kernel Hilbert Spaces}},
author = {Evgeniou, Theodoros and Pontil, Massimiliano},
booktitle = {International Conference on Algorithmic Learning Theory},
year = {1999},
pages = {106--117},
doi = {10.1007/3-540-46769-6_9},
url = {https://mlanthology.org/alt/1999/evgeniou1999alt-vgamma/}
}