AdaME: Adaptive Learning of Multisource Adaptation Ensembles

Abstract

We present a new adaptive algorithm for building ensembles of neural networks for multisource domain adaptation. Since standard convex combination ensembles cannot succeed in this scenario, we present a learnable domain-weighted combination and new learning guarantees based on the deep boosting algorithm. We introduce and analyze a new algorithm, AdaME, for this scenario and show that it benefits from favorable theoretical guarantees, is risk-averse, and reduces the worst-case mismatch between the inference and training distributions. We also report the results of several experiments demonstrating its performance on the FMoW-WILDS dataset.
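
For intuition, the sketch below shows one plausible form of a learnable domain-weighted combination: per-domain predictors are mixed with trainable weights passed through a softmax so the combination stays convex. This is a minimal illustration under assumed structure, not the authors' implementation; the module name DomainWeightedEnsemble and its parameterization are hypothetical, and the paper's actual weighting may additionally depend on the input or on domain densities.

import torch
import torch.nn as nn


class DomainWeightedEnsemble(nn.Module):
    """Illustrative sketch (not the paper's code): combines one base
    predictor per source domain with learnable weights, normalized by a
    softmax so the combination remains convex."""

    def __init__(self, predictors):
        super().__init__()
        # One base predictor per source domain (hypothetical setup).
        self.predictors = nn.ModuleList(predictors)
        # Unconstrained logits; softmax maps them onto the simplex.
        self.logits = nn.Parameter(torch.zeros(len(predictors)))

    def forward(self, x):
        weights = torch.softmax(self.logits, dim=0)
        # Stack per-domain outputs along a new leading domain axis.
        outputs = torch.stack([h(x) for h in self.predictors], dim=0)
        # Convex combination over the domain axis.
        return torch.einsum("k,k...->...", weights, outputs)

The softmax keeps the weights on the probability simplex, matching the convex-combination structure the abstract contrasts against; making the logits trainable is what distinguishes a learnable domain weighting from a fixed uniform average.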

Cite

Text

Yak et al. "AdaME: Adaptive Learning of Multisource Adaptation Ensembles." NeurIPS 2022 Workshops: DistShift, 2022.

Markdown

[Yak et al. "AdaME: Adaptive Learning of Multisource Adaptation Ensembles." NeurIPS 2022 Workshops: DistShift, 2022.](https://mlanthology.org/neuripsw/2022/yak2022neuripsw-adame/)

BibTeX

@inproceedings{yak2022neuripsw-adame,
  title     = {{AdaME: Adaptive Learning of Multisource Adaptation Ensembles}},
  author    = {Yak, Scott and Gonzalvo, Javier and Mohri, Mehryar and Cortes, Corinna},
  booktitle = {NeurIPS 2022 Workshops: DistShift},
  year      = {2022},
  url       = {https://mlanthology.org/neuripsw/2022/yak2022neuripsw-adame/}
}