Meta-DMoE: Adapting to Domain Shift by Meta-Distillation from Mixture-of-Experts

Abstract

In this paper, we tackle the problem of domain shift. Most existing methods train a single model on multiple source domains and then apply that same model to all unseen target domains. Such solutions are sub-optimal, as each target domain has its own characteristics to which the model is never adapted. Furthermore, expecting a single model to learn extensive knowledge from multiple source domains is counterintuitive: the model tends to learn only domain-invariant features, which may lead to negative knowledge transfer. In this work, we propose a novel framework for unsupervised test-time adaptation, formulated as a knowledge distillation process that addresses domain shift. Specifically, we incorporate a Mixture-of-Experts (MoE) as teachers, where each expert is trained separately on a different source domain to maximize its specialty. Given a test-time target domain, a small set of unlabeled data is sampled to query the knowledge from the MoE. Since the source domains are correlated with the target domain, a transformer-based aggregator then combines the domain-specific knowledge by examining the interconnections among them. Its output is used as a supervision signal to adapt a student prediction network toward the target domain. We further employ meta-learning to enforce that the aggregator distills positive knowledge and that the student network achieves fast adaptation. Extensive experiments demonstrate that the proposed method outperforms the state-of-the-art and validate the effectiveness of each proposed component. Our code is available at https://github.com/n3il666/Meta-DMoE.
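To make the described test-time adaptation loop concrete, below is a minimal sketch of one possible instantiation: per-domain expert teachers, a transformer aggregator that mixes their outputs, and a student updated by distillation on a small unlabeled target batch. All module architectures, dimensions, the MSE distillation loss, and function names are illustrative assumptions rather than the authors' implementation, and the meta-learning outer loop is omitted; see the official repository for the actual method.

```python
# Illustrative sketch only; shapes, modules, and the loss are assumed, not the paper's exact design.
import torch
import torch.nn as nn
import torch.nn.functional as F

feat_dim, num_experts = 128, 5

# Mixture-of-Experts teachers: one module per source domain (assumed simple linear experts).
experts = nn.ModuleList([nn.Linear(feat_dim, feat_dim) for _ in range(num_experts)])

# Transformer-based aggregator that combines the per-expert knowledge.
aggregator = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=feat_dim, nhead=4, batch_first=True),
    num_layers=2,
)

# Student prediction network to be adapted toward the target domain.
student = nn.Sequential(nn.Linear(feat_dim, feat_dim), nn.ReLU(), nn.Linear(feat_dim, feat_dim))

def adapt_to_target(unlabeled_batch, lr=1e-3, steps=1):
    """Adapt the student on a small unlabeled batch drawn from the target domain."""
    optimizer = torch.optim.SGD(student.parameters(), lr=lr)
    for _ in range(steps):
        with torch.no_grad():
            # Query every expert; stack outputs as a sequence of "expert tokens".
            expert_feats = torch.stack([e(unlabeled_batch) for e in experts], dim=1)
            # Aggregate across experts and pool into a single teacher signal.
            teacher_feat = aggregator(expert_feats).mean(dim=1)
        # Use the aggregated knowledge as supervision to distill into the student.
        loss = F.mse_loss(student(unlabeled_batch), teacher_feat)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return student

# Example: adapt with 16 unlabeled target samples represented as feature vectors.
adapted_student = adapt_to_target(torch.randn(16, feat_dim))
```

In the paper's framing, this inner adaptation step would itself be meta-trained so that the aggregator learns to distill positive knowledge and the student adapts quickly from only a few unlabeled target samples.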

Cite

Text

Zhong et al. "Meta-DMoE: Adapting to Domain Shift by Meta-Distillation from Mixture-of-Experts." Neural Information Processing Systems, 2022.

Markdown

[Zhong et al. "Meta-DMoE: Adapting to Domain Shift by Meta-Distillation from Mixture-of-Experts." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/zhong2022neurips-metadmoe/)

BibTeX

@inproceedings{zhong2022neurips-metadmoe,
  title     = {{Meta-DMoE: Adapting to Domain Shift by Meta-Distillation from Mixture-of-Experts}},
  author    = {Zhong, Tao and Chi, Zhixiang and Gu, Li and Wang, Yang and Yu, Yuanhao and Tang, Jin},
  booktitle = {Neural Information Processing Systems},
  year      = {2022},
  url       = {https://mlanthology.org/neurips/2022/zhong2022neurips-metadmoe/}
}