Discriminative Learning for Differing Training and Test Distributions
Abstract
We address classification problems for which the training instances are governed by a distribution that is allowed to differ arbitrarily from the test distribution: problems also referred to as classification under covariate shift. We derive a solution that is purely discriminative: neither training nor test distribution is modeled explicitly. We formulate the general problem of learning under covariate shift as an integrated optimization problem. We derive a kernel logistic regression classifier for differing training and test distributions.
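The intuition behind learning under covariate shift can be illustrated with a minimal sketch: a probabilistic classifier is trained to separate test from training instances, its odds yield importance weights approximating the density ratio p_test(x)/p_train(x), and those weights then reweight the target classifier's loss. This is an assumption-laden simplification, not the paper's method: the paper integrates both steps into one optimization and uses kernel logistic regression, whereas the two stages below are fit separately with plain logistic regression, and all names (`fit_logreg`, `design`, the synthetic data) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic covariate shift: training inputs drawn from N(-1, 1),
# test inputs from N(+1, 1); the labeling rule is the same for both.
n_tr, n_te = 200, 200
x_tr = rng.normal(-1.0, 1.0, n_tr)
x_te = rng.normal(+1.0, 1.0, n_te)
y_tr = (x_tr + 0.3 * rng.normal(size=n_tr) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def design(x):
    # Intercept plus raw input as features.
    return np.column_stack([np.ones_like(x), x])

def fit_logreg(X, y, w=None, lr=0.1, steps=2000, lam=1e-3):
    """Instance-weighted logistic regression via gradient descent."""
    if w is None:
        w = np.ones(len(y))
    theta = np.zeros(X.shape[1])
    for _ in range(steps):
        p = sigmoid(X @ theta)
        grad = X.T @ (w * (p - y)) / len(y) + lam * theta
        theta -= lr * grad
    return theta

# Step 1 (discriminative weight estimation): classify test (s=1)
# vs. training (s=0) instances; the classifier's odds, rescaled by
# the sample-size ratio, approximate p_test(x) / p_train(x).
X_all = design(np.concatenate([x_tr, x_te]))
s = np.concatenate([np.zeros(n_tr), np.ones(n_te)])
theta_s = fit_logreg(X_all, s)
p_te = sigmoid(design(x_tr) @ theta_s)
ratio = (p_te / (1.0 - p_te)) * (n_tr / n_te)

# Step 2: train the target classifier on the reweighted training data.
theta = fit_logreg(design(x_tr), y_tr, w=ratio)
```

Training points lying in the region where the test distribution is denser (here, larger x) receive larger weights, so the final classifier is tuned toward the region where it will actually be evaluated.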
Cite
Text
Bickel et al. "Discriminative Learning for Differing Training and Test Distributions." International Conference on Machine Learning, 2007. doi:10.1145/1273496.1273507

Markdown

[Bickel et al. "Discriminative Learning for Differing Training and Test Distributions." International Conference on Machine Learning, 2007.](https://mlanthology.org/icml/2007/bickel2007icml-discriminative/) doi:10.1145/1273496.1273507

BibTeX
@inproceedings{bickel2007icml-discriminative,
title = {{Discriminative Learning for Differing Training and Test Distributions}},
author = {Bickel, Steffen and Brückner, Michael and Scheffer, Tobias},
booktitle = {International Conference on Machine Learning},
year = {2007},
pages = {81--88},
doi = {10.1145/1273496.1273507},
url = {https://mlanthology.org/icml/2007/bickel2007icml-discriminative/}
}