AutoLabel: CLIP-Based Framework for Open-Set Video Domain Adaptation
Abstract
Open-set Unsupervised Video Domain Adaptation (OUVDA) deals with the task of adapting an action recognition model from a labelled source domain to an unlabelled target domain that contains "target-private" categories, which are present in the target but absent in the source. In this work we deviate from prior work, which trains specialized open-set classifiers or uses weighted adversarial learning, and instead propose to use a pre-trained Language and Vision Model (CLIP). CLIP is well suited to OUVDA due to its rich representations and zero-shot recognition capabilities. However, rejecting target-private instances with CLIP's zero-shot protocol requires oracle knowledge of the target-private label names. Since such knowledge is unavailable in practice, we propose AutoLabel, which automatically discovers and generates object-centric compositional candidate target-private class names. Despite its simplicity, we show that CLIP equipped with AutoLabel can satisfactorily reject target-private instances, thereby facilitating better alignment between the shared classes of the two domains. The code is available.
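To make the rejection idea concrete, below is a minimal sketch (not the authors' code) of CLIP zero-shot classification where candidate target-private names are simply appended to the shared label set, and a clip is rejected as target-private when a candidate name wins. The class names, prompt template, and file path are illustrative placeholders; the real AutoLabel pipeline produces the candidate names automatically from object-centric attributes.

```python
# Sketch: zero-shot rejection of target-private instances with CLIP,
# assuming candidate class names (e.g. AutoLabel outputs) are appended
# to the shared source labels. Uses Hugging Face transformers' CLIP.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

shared_classes = ["climbing", "fencing", "golf"]        # known source labels (illustrative)
candidate_private = ["riding horse", "playing guitar"]  # hypothetical candidate names
prompts = [f"a video of a person {c}" for c in shared_classes + candidate_private]

# A representative frame standing in for the video clip (path is a placeholder).
frame = Image.open("frame.jpg")
inputs = processor(text=prompts, images=frame, return_tensors="pt", padding=True)
with torch.no_grad():
    # logits_per_image: scaled cosine similarities between the frame and each prompt
    logits = model(**inputs).logits_per_image[0]

pred = logits.argmax().item()
if pred >= len(shared_classes):
    print(f"rejected as target-private: {prompts[pred]}")
else:
    print(f"shared class: {prompts[pred]}")
```

Without the candidate names, CLIP would be forced to assign every target clip to one of the shared classes, which is why some proxy for the unknown label names is needed before rejection is possible.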
Cite
Text
Zara et al. "AutoLabel: CLIP-Based Framework for Open-Set Video Domain Adaptation." Conference on Computer Vision and Pattern Recognition, 2023. doi:10.1109/CVPR52729.2023.01107
Markdown
[Zara et al. "AutoLabel: CLIP-Based Framework for Open-Set Video Domain Adaptation." Conference on Computer Vision and Pattern Recognition, 2023.](https://mlanthology.org/cvpr/2023/zara2023cvpr-autolabel/) doi:10.1109/CVPR52729.2023.01107
BibTeX
@inproceedings{zara2023cvpr-autolabel,
title = {{AutoLabel: CLIP-Based Framework for Open-Set Video Domain Adaptation}},
author = {Zara, Giacomo and Roy, Subhankar and Rota, Paolo and Ricci, Elisa},
booktitle = {Conference on Computer Vision and Pattern Recognition},
year = {2023},
pages = {11504--11513},
doi = {10.1109/CVPR52729.2023.01107},
url = {https://mlanthology.org/cvpr/2023/zara2023cvpr-autolabel/}
}