Near-Optimal Linear Regression Under Distribution Shift
Abstract
Transfer learning is essential when sufficient labeled data is available from a source domain but labeled data from the target domain is scarce. We develop estimators that achieve minimax linear risk for linear regression problems under distribution shift. Our algorithms cover several transfer learning settings, including covariate shift and model shift. We also consider the cases where the data are generated from either a linear or a general nonlinear model. We show that linear minimax estimators are within an absolute constant factor of the minimax risk even among nonlinear estimators, for various source/target distributions.
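As a concrete illustration of the covariate-shift setting the abstract describes, here is a minimal sketch of a standard importance-weighted least-squares baseline. This is not the paper's minimax-linear estimator; it assumes Gaussian source/target covariate distributions and a known density ratio, and all names and parameter values are illustrative.

```python
# Sketch of linear regression under covariate shift: labels come from the
# source distribution, but we want low risk under the target distribution.
# This is a generic importance-weighted least-squares baseline, NOT the
# minimax-linear estimator from the paper; the density ratio is assumed known.
import numpy as np

rng = np.random.default_rng(0)
d, n_source = 5, 500

# Source covariates ~ N(0, I); target covariates ~ N(0, 4I) (covariate shift).
X_s = rng.normal(size=(n_source, d))
beta_true = rng.normal(size=d)
y_s = X_s @ beta_true + 0.1 * rng.normal(size=n_source)

def density_ratio(x, sigma_t=2.0):
    """Known ratio w(x) = p_target(x) / p_source(x) for the two Gaussians."""
    log_pt = -0.5 * np.sum(x**2, axis=1) / sigma_t**2 - d * np.log(sigma_t)
    log_ps = -0.5 * np.sum(x**2, axis=1)
    return np.exp(log_pt - log_ps)

w = density_ratio(X_s)

# Importance-weighted least squares: minimize sum_i w_i * (y_i - x_i' beta)^2,
# which reweights source samples to mimic the target covariate distribution.
W = np.diag(w)
beta_hat = np.linalg.solve(X_s.T @ W @ X_s, X_s.T @ W @ y_s)
print("estimation error:", np.linalg.norm(beta_hat - beta_true))
```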
Cite

Lei et al. "Near-Optimal Linear Regression Under Distribution Shift." International Conference on Machine Learning, 2021.

BibTeX
@inproceedings{lei2021icml-nearoptimal,
title = {{Near-Optimal Linear Regression Under Distribution Shift}},
author = {Lei, Qi and Hu, Wei and Lee, Jason},
booktitle = {International Conference on Machine Learning},
year = {2021},
pages = {6164--6174},
volume = {139},
url = {https://mlanthology.org/icml/2021/lei2021icml-nearoptimal/}
}