Feed-Forward Source-Free Latent Domain Adaptation via Cross-Attention
Abstract
We study the highly practical but comparatively under-studied problem of latent-domain adaptation, where a source model should be adapted to a target dataset that contains a mixture of unlabelled domain-relevant and domain-irrelevant examples. Motivated by data-privacy requirements and the need for embedded and resource-constrained devices of all kinds to adapt to local data distributions, we further focus on the setting of feed-forward source-free domain adaptation, where adaptation should not require access to the source dataset and should also be backpropagation-free. Our solution is to meta-learn a network that embeds the mixed-relevance target dataset and dynamically adapts inference for target examples using cross-attention. The resulting framework leads to consistent, strong improvements.
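To make the idea concrete, here is a minimal sketch of cross-attention-based feed-forward adaptation, assuming a PyTorch setup with pre-extracted features. The module name, dimensions, and the residual update are illustrative assumptions, not the authors' published implementation.

```python
import torch
import torch.nn as nn

class CrossAttentionAdapter(nn.Module):
    """Adapt per-example inference by attending over an embedded,
    mixed-relevance target set; a single forward pass, no backprop."""

    def __init__(self, dim: int):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)  # projects the query example
        self.k_proj = nn.Linear(dim, dim)  # projects the target-set keys
        self.v_proj = nn.Linear(dim, dim)  # projects the target-set values

    def forward(self, query_feat: torch.Tensor, target_feats: torch.Tensor) -> torch.Tensor:
        # query_feat: (B, D) features of the examples being classified
        # target_feats: (N, D) features of the unlabelled target set,
        # mixing domain-relevant and domain-irrelevant examples
        q = self.q_proj(query_feat)    # (B, D)
        k = self.k_proj(target_feats)  # (N, D)
        v = self.v_proj(target_feats)  # (N, D)
        attn = torch.softmax(q @ k.T / k.shape[-1] ** 0.5, dim=-1)  # (B, N)
        # The attention weights are meta-learned so that domain-relevant
        # target examples are emphasised and irrelevant ones suppressed.
        return query_feat + attn @ v   # residual feature update

# At deployment, adaptation is feed-forward only: no source data,
# no gradient steps, just one forward pass over the target set.
adapter = CrossAttentionAdapter(dim=512)
with torch.no_grad():
    adapted = adapter(torch.randn(8, 512), torch.randn(100, 512))
```

In this reading, meta-learning trains the projection weights offline so that, at deployment, the irrelevant portion of the target set is down-weighted by attention alone, which is what makes the adaptation source-free and backpropagation-free.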
Cite

Text

Bohdal et al. "Feed-Forward Source-Free Latent Domain Adaptation via Cross-Attention." ICML 2022 Workshops: Pre-Training, 2022.

Markdown

[Bohdal et al. "Feed-Forward Source-Free Latent Domain Adaptation via Cross-Attention." ICML 2022 Workshops: Pre-Training, 2022.](https://mlanthology.org/icmlw/2022/bohdal2022icmlw-feedforward/)

BibTeX
@inproceedings{bohdal2022icmlw-feedforward,
  title     = {{Feed-Forward Source-Free Latent Domain Adaptation via Cross-Attention}},
  author    = {Bohdal, Ondrej and Li, Da and Hu, Shell Xu and Hospedales, Timothy},
  booktitle = {ICML 2022 Workshops: Pre-Training},
  year      = {2022},
  url       = {https://mlanthology.org/icmlw/2022/bohdal2022icmlw-feedforward/}
}