High-Dimensional Kernel Methods Under Covariate Shift: Data-Dependent Implicit Regularization
Abstract
This paper studies kernel ridge regression in high dimensions under covariate shift and analyzes the role of importance re-weighting. We first derive the asymptotic expansion of high-dimensional kernels under covariate shift. Via a bias-variance decomposition, we theoretically show that the re-weighting strategy reduces the variance. For the bias, we analyze the regularization at both arbitrary and well-chosen scales, showing that the bias can behave very differently across regularization scales. In our analysis, the bias and variance are characterized by the spectral decay of a data-dependent regularized kernel: the original kernel matrix combined with an additional re-weighting matrix. The re-weighting strategy can therefore be viewed as a form of data-dependent regularization. In addition, our analysis provides an asymptotic expansion of kernel functions/vectors under covariate shift, which is of independent interest.
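To make the re-weighting strategy discussed in the abstract concrete, the following is a minimal sketch (not the paper's method) of importance-weighted kernel ridge regression: training points are weighted by the density ratio between the target and source covariate distributions, which here is assumed known for two isotropic Gaussians. The function and variable names (rbf_kernel, weighted_krr_fit, w, lam, gamma) are illustrative choices, not from the paper.

import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and Z.
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Z**2, axis=1)[None, :] - 2.0 * X @ Z.T
    return np.exp(-gamma * sq)

def weighted_krr_fit(X_train, y_train, weights, lam=1e-1, gamma=1.0):
    # Importance-weighted kernel ridge regression:
    #   min_f  sum_i w_i (y_i - f(x_i))^2 + lam ||f||_H^2,
    # whose representer solution is alpha = (W K + lam I)^{-1} W y, W = diag(weights).
    n = X_train.shape[0]
    K = rbf_kernel(X_train, X_train, gamma)
    W = np.diag(weights)
    return np.linalg.solve(W @ K + lam * np.eye(n), W @ y_train)

def weighted_krr_predict(X_test, X_train, alpha, gamma=1.0):
    return rbf_kernel(X_test, X_train, gamma) @ alpha

# Toy covariate shift: source N(0, I) vs. target N(0.3, I); the re-weighting
# uses the (known) density ratio p_target(x) / p_source(x).
rng = np.random.default_rng(0)
d, n = 20, 200
X_tr = rng.normal(0.0, 1.0, size=(n, d))   # source covariates
X_te = rng.normal(0.3, 1.0, size=(n, d))   # shifted target covariates
f_star = lambda X: np.sin(X[:, 0]) + 0.5 * X[:, 1]
y_tr = f_star(X_tr) + 0.1 * rng.normal(size=n)

# Log density ratio of two isotropic Gaussians with means 0 and 0.3.
log_ratio = (np.sum(X_tr**2, axis=1) - np.sum((X_tr - 0.3)**2, axis=1)) / 2.0
w = np.exp(log_ratio)

alpha = weighted_krr_fit(X_tr, y_tr, w, lam=1e-1, gamma=1.0 / d)
pred = weighted_krr_predict(X_te, X_tr, alpha, gamma=1.0 / d)
print("target MSE:", np.mean((pred - f_star(X_te))**2))

In the paper's language, the matrix W K + lam I is the data-dependent regularized kernel: the re-weighting matrix W acts on the kernel matrix K, and the bias and variance of the resulting estimator are governed by its spectral decay.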
Cite
Text
Chen et al. "High-Dimensional Kernel Methods Under Covariate Shift: Data-Dependent Implicit Regularization." International Conference on Machine Learning, 2024.
Markdown
[Chen et al. "High-Dimensional Kernel Methods Under Covariate Shift: Data-Dependent Implicit Regularization." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/chen2024icml-highdimensional/)
BibTeX
@inproceedings{chen2024icml-highdimensional,
title = {{High-Dimensional Kernel Methods Under Covariate Shift: Data-Dependent Implicit Regularization}},
author = {Chen, Yihang and Liu, Fanghui and Suzuki, Taiji and Cevher, Volkan},
booktitle = {International Conference on Machine Learning},
year = {2024},
pages = {7081-7102},
volume = {235},
url = {https://mlanthology.org/icml/2024/chen2024icml-highdimensional/}
}