Discriminative Learning Under Covariate Shift

Abstract

We address classification problems in which the training instances are governed by an input distribution that may differ arbitrarily from the test distribution, a setting also referred to as classification under covariate shift. We derive a solution that is purely discriminative: neither the training nor the test distribution is modeled explicitly. The problem of learning under covariate shift can be written as an integrated optimization problem. Instantiating the general optimization problem leads to a kernel logistic regression and an exponential model classifier for covariate shift. The optimization problem is convex under certain conditions; our findings also clarify the relationship to the known kernel mean matching procedure. We report on experiments on spam filtering, text classification, and landmine detection.
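The core idea of discriminative covariate-shift correction can be illustrated with a minimal sketch: train a "domain" classifier to separate training from test inputs, convert its odds into estimates of the density ratio p_test(x)/p_train(x), and use those as instance weights for the final classifier. This is a simplified illustration, not the authors' exact integrated optimization; the data, step counts, and helper `fit_logreg` are assumptions made for the example.

```python
# Illustrative sketch of covariate-shift correction via a discriminative
# density-ratio estimate; NOT the paper's exact integrated optimization.
import numpy as np

def fit_logreg(X, y, sample_weight=None, lr=0.1, steps=500):
    """(Weighted) logistic regression by gradient descent; y in {0, 1}."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    sw = np.ones(n) if sample_weight is None else sample_weight
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        g = sw * (p - y)                     # weighted log-loss gradient
        w -= lr * (X.T @ g) / n
        b -= lr * g.sum() / n
    return w, b

rng = np.random.default_rng(0)
# Training inputs drawn from one region, test inputs shifted (covariate shift).
X_train = rng.normal(loc=-1.0, size=(200, 2))
X_test = rng.normal(loc=+1.0, size=(200, 2))
y_train = (X_train[:, 0] + X_train[:, 1] > -2).astype(float)

# Step 1: domain classifier separating test (label 1) from training (label 0).
# Its odds p(test|x)/p(train|x) are proportional to p_test(x)/p_train(x).
X_dom = np.vstack([X_train, X_test])
y_dom = np.concatenate([np.zeros(len(X_train)), np.ones(len(X_test))])
w_d, b_d = fit_logreg(X_dom, y_dom)
p_test = 1.0 / (1.0 + np.exp(-(X_train @ w_d + b_d)))
weights = p_test / (1.0 - p_test)            # estimated density ratio
weights *= len(weights) / weights.sum()      # normalize to mean 1

# Step 2: target classifier trained on the importance-weighted sample.
w_c, b_c = fit_logreg(X_train, y_train, sample_weight=weights)
```

Training instances lying closer to the test distribution receive larger weights, so the final classifier is fit to the region where it will actually be evaluated.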

Cite

Text

Bickel et al. "Discriminative Learning Under Covariate Shift." Journal of Machine Learning Research, 2009.

Markdown

[Bickel et al. "Discriminative Learning Under Covariate Shift." Journal of Machine Learning Research, 2009.](https://mlanthology.org/jmlr/2009/bickel2009jmlr-discriminative/)

BibTeX

@article{bickel2009jmlr-discriminative,
  title     = {{Discriminative Learning Under Covariate Shift}},
  author    = {Bickel, Steffen and Brückner, Michael and Scheffer, Tobias},
  journal   = {Journal of Machine Learning Research},
  year      = {2009},
  pages     = {2137-2155},
  volume    = {10},
  url       = {https://mlanthology.org/jmlr/2009/bickel2009jmlr-discriminative/}
}