Unsupervised Multi-Source Domain Adaptation for Regression

Abstract

We consider the problem of unsupervised domain adaptation from multiple sources in a regression setting. In this work, we propose an original method that benefits from several sources through a weighted combination of them. For this purpose, we define a new measure of similarity between probability distributions for domain adaptation, which we call hypothesis-discrepancy. We then prove a new bound for unsupervised domain adaptation combining multiple sources. From this bound, we derive a novel adversarial domain adaptation algorithm that adjusts the weight given to each source, ensuring that sources related to the target receive higher weights. Finally, we evaluate our method on several public datasets and compare it to other domain adaptation baselines to demonstrate the improvement for regression tasks.
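The weighted-combination idea from the abstract can be illustrated with a minimal sketch. This is not the authors' algorithm: the softmax-over-negative-discrepancy weighting and the helper names (`source_weights`, `combine_predictions`) are illustrative assumptions; the paper instead learns the weights adversarially from its hypothesis-discrepancy bound. The sketch only shows the general principle that sources estimated to be closer to the target should contribute more.

```python
import numpy as np

def source_weights(discrepancies, temperature=1.0):
    # Hypothetical weighting scheme (not the paper's adversarial one):
    # sources with lower estimated discrepancy to the target get
    # higher weight via a softmax over negated discrepancies.
    d = np.asarray(discrepancies, dtype=float)
    logits = -d / temperature
    logits -= logits.max()  # numerical stability
    w = np.exp(logits)
    return w / w.sum()

def combine_predictions(preds, weights):
    # Weighted combination of per-source regression predictions.
    # preds has shape (n_sources, n_samples).
    preds = np.asarray(preds, dtype=float)
    return np.asarray(weights) @ preds
```

Given three sources with estimated discrepancies `[0.1, 0.5, 0.9]`, the first source receives the largest weight, and the combined prediction is a convex combination of the per-source predictions.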

Cite

Text

Richard et al. "Unsupervised Multi-Source Domain Adaptation for Regression." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2020. doi:10.1007/978-3-030-67658-2_23

Markdown

[Richard et al. "Unsupervised Multi-Source Domain Adaptation for Regression." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2020.](https://mlanthology.org/ecmlpkdd/2020/richard2020ecmlpkdd-unsupervised/) doi:10.1007/978-3-030-67658-2_23

BibTeX

@inproceedings{richard2020ecmlpkdd-unsupervised,
  title     = {{Unsupervised Multi-Source Domain Adaptation for Regression}},
  author    = {Richard, Guillaume and de Mathelin, Antoine and Hébrail, Georges and Mougeot, Mathilde and Vayatis, Nicolas},
  booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
  year      = {2020},
  pages     = {395--411},
  doi       = {10.1007/978-3-030-67658-2_23},
  url       = {https://mlanthology.org/ecmlpkdd/2020/richard2020ecmlpkdd-unsupervised/}
}