Local Kernel Ridge Regression for Scalable, Interpolating, Continuous Regression
Abstract
We study a localized version of kernel ridge regression that continuously and smoothly interpolates highly non-linear underlying function values through the observed data points. This new method handles data whose (a) local density is highly uneven and (b) function values change dramatically in certain small but unknown regions. By introducing a new rank-based interpolation scheme, the interpolated values provided by our local method vary continuously with the query points. Unlike traditional kernel ridge regression, our method avoids the full matrix inverse and is therefore scalable.
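The core idea of localized kernel ridge regression can be sketched as follows: instead of solving one ridge system over all n training points, each query solves a small system over only its k nearest neighbors. This is an illustrative sketch only, not the paper's method; in particular it uses a plain k-nearest-neighbor localization rather than the authors' rank-based interpolation scheme, and the kernel, `gamma`, `k`, and `lam` values are arbitrary choices for demonstration.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def local_krr_predict(X, y, x_query, k=30, lam=1e-4, gamma=2.0):
    """Predict at x_query with kernel ridge regression fit only on the
    k nearest training points, so only a k-by-k system is solved
    instead of inverting the full n-by-n kernel matrix."""
    dists = np.linalg.norm(X - x_query, axis=1)
    idx = np.argsort(dists)[:k]            # k nearest neighbors of the query
    Xk, yk = X[idx], y[idx]
    K = rbf_kernel(Xk, Xk, gamma)          # small local kernel matrix
    alpha = np.linalg.solve(K + lam * np.eye(k), yk)  # local ridge solve
    k_vec = rbf_kernel(x_query[None, :], Xk, gamma)[0]
    return float(k_vec @ alpha)

# Example: recover a smooth non-linear function from scattered samples.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(500, 1))
y = np.sin(X[:, 0])
pred = local_krr_predict(X, y, np.array([1.0]))
```

Note that with this naive neighbor selection the prediction can jump discontinuously as the query moves and the neighbor set changes; the paper's rank-based interpolation scheme is precisely what removes that discontinuity.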
Cite
Text
Han et al. "Local Kernel Ridge Regression for Scalable, Interpolating, Continuous Regression." Transactions on Machine Learning Research, 2022.
Markdown
[Han et al. "Local Kernel Ridge Regression for Scalable, Interpolating, Continuous Regression." Transactions on Machine Learning Research, 2022.](https://mlanthology.org/tmlr/2022/han2022tmlr-local/)
BibTeX
@article{han2022tmlr-local,
title = {{Local Kernel Ridge Regression for Scalable, Interpolating, Continuous Regression}},
author = {Han, Mingxuan and Ye, Chenglong and Phillips, Jeff},
journal = {Transactions on Machine Learning Research},
year = {2022},
url = {https://mlanthology.org/tmlr/2022/han2022tmlr-local/}
}