Multi-Level Consistency Learning for Semi-Supervised Domain Adaptation
Abstract
Semi-supervised domain adaptation (SSDA) aims to apply knowledge learned from a fully labeled source domain to a scarcely labeled target domain. In this paper, we propose a Multi-level Consistency Learning (MCL) framework for SSDA. Specifically, our MCL regularizes the consistency of different views of target domain samples at three levels: (i) at the inter-domain level, we robustly and accurately align the source and target domains using a prototype-based optimal transport method that accounts for the strengths and weaknesses of different views of target samples; (ii) at the intra-domain level, we facilitate the learning of both discriminative and compact target feature representations with a novel class-wise contrastive clustering loss; (iii) at the sample level, we follow standard practice and improve prediction accuracy through consistency-based self-training. Empirically, we verify the effectiveness of our MCL framework on three popular SSDA benchmarks, i.e., the VisDA2017, DomainNet, and Office-Home datasets, and the experimental results demonstrate that our MCL framework achieves state-of-the-art performance.
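The sample-level component described in (iii) can be illustrated with a minimal sketch of consistency-based self-training: pseudo-labels are taken from one view of each unlabeled target sample and used to supervise another view, keeping only confident predictions. The two-view (weak/strong) setup, the confidence threshold value, and the function names below are illustrative assumptions, not the paper's exact formulation or hyper-parameters.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def self_training_loss(weak_logits, strong_logits, threshold=0.95):
    """Consistency-based self-training loss on unlabeled target samples.

    Pseudo-labels come from one (weakly augmented) view; only samples whose
    maximum predicted probability exceeds `threshold` contribute a
    cross-entropy term on the other (strongly augmented) view.
    The threshold of 0.95 is an assumed, illustrative value.
    """
    weak_probs = softmax(weak_logits)
    confidence = weak_probs.max(axis=-1)
    pseudo_labels = weak_probs.argmax(axis=-1)
    mask = confidence >= threshold            # keep confident samples only
    if not mask.any():
        return 0.0                            # no confident pseudo-labels
    strong_probs = softmax(strong_logits)
    picked = strong_probs[mask, pseudo_labels[mask]]
    return float(-np.mean(np.log(picked + 1e-12)))
```

In a full training loop this term would be added to the inter-domain alignment and intra-domain clustering losses; when the two views agree confidently, the loss is near zero, so the gradient signal concentrates on inconsistent predictions.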
Cite
Text
Yan et al. "Multi-Level Consistency Learning for Semi-Supervised Domain Adaptation." International Joint Conference on Artificial Intelligence, 2022. doi:10.24963/IJCAI.2022/213
Markdown
[Yan et al. "Multi-Level Consistency Learning for Semi-Supervised Domain Adaptation." International Joint Conference on Artificial Intelligence, 2022.](https://mlanthology.org/ijcai/2022/yan2022ijcai-multi-a/) doi:10.24963/IJCAI.2022/213
BibTeX
@inproceedings{yan2022ijcai-multi-a,
title = {{Multi-Level Consistency Learning for Semi-Supervised Domain Adaptation}},
author = {Yan, Zizheng and Wu, Yushuang and Li, Guanbin and Qin, Yipeng and Han, Xiaoguang and Cui, Shuguang},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2022},
  pages = {1530--1536},
doi = {10.24963/IJCAI.2022/213},
url = {https://mlanthology.org/ijcai/2022/yan2022ijcai-multi-a/}
}