CycDA: Unsupervised Cycle Domain Adaptation to Learn from Image to Video
Abstract
Although action recognition has achieved impressive results over recent years, both collection and annotation of video training data are still time-consuming and cost-intensive. Therefore, image-to-video adaptation has been proposed to exploit web images, which require no manual annotation, as a source for adapting to unlabeled target videos. This poses two major challenges: (1) spatial domain shift between web images and video frames; (2) modality gap between image and video data. To address these challenges, we propose Cycle Domain Adaptation (CycDA), a cycle-based approach for unsupervised image-to-video domain adaptation. On the one hand, we leverage the joint spatial information in images and videos; on the other hand, we train an independent spatio-temporal model to bridge the modality gap. We alternate between spatial and spatio-temporal learning, with knowledge transfer between the two in each cycle. We evaluate our approach on benchmark datasets for image-to-video as well as for mixed-source domain adaptation, achieving state-of-the-art results and demonstrating the benefits of our cyclic adaptation.
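The abstract describes an alternating training loop: a spatial (image-level) stage and a spatio-temporal (video-level) stage that exchange knowledge each cycle. Below is a minimal Python sketch of that loop, assuming pseudo-labeling as the knowledge-transfer mechanism; all class and function names (`SpatialModel`, `SpatioTemporalModel`, `sample_frames`, `cycda_loop`) are hypothetical placeholders for illustration, not the authors' implementation.

```python
from typing import List, Optional

# Hypothetical stand-ins for the two models described in the abstract:
# a spatial (image/frame-level) classifier and an independent
# spatio-temporal (clip-level) classifier. Any concrete pair could be
# substituted here.
class SpatialModel:
    def fit(self, images, labels, frames=None, frame_pseudo_labels=None): ...
    def predict(self, frames) -> Optional[List[int]]: ...

class SpatioTemporalModel:
    def fit(self, videos, pseudo_labels): ...
    def predict(self, videos) -> Optional[List[int]]: ...

def sample_frames(videos):
    """Placeholder: draw representative frames from each target video."""
    return [video[0] for video in videos]

def cycda_loop(web_images, image_labels, target_videos, num_cycles: int = 3):
    spatial = SpatialModel()
    temporal = SpatioTemporalModel()
    video_pseudo_labels: Optional[List[int]] = None

    for _ in range(num_cycles):
        # Spatial stage: learn from labeled web images and target video
        # frames, reusing the previous cycle's video-level pseudo-labels
        # (if any) to guide the adaptation.
        spatial.fit(web_images, image_labels,
                    frames=sample_frames(target_videos),
                    frame_pseudo_labels=video_pseudo_labels)

        # Knowledge transfer, image -> video: pseudo-label the unlabeled
        # target videos with the spatial model and train the
        # spatio-temporal model on those labels (bridging the modality gap).
        frame_predictions = spatial.predict(sample_frames(target_videos))
        temporal.fit(target_videos, frame_predictions)

        # Knowledge transfer, video -> image: the spatio-temporal model's
        # predictions supervise the next spatial stage.
        video_pseudo_labels = temporal.predict(target_videos)

    return temporal
```

The key design point reflected here is the cycle itself: neither model is trained once and frozen; each round of spatial learning benefits from the video model's temporal reasoning, and vice versa.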
Cite
Text
Lin et al. "CycDA: Unsupervised Cycle Domain Adaptation to Learn from Image to Video." Proceedings of the European Conference on Computer Vision (ECCV), 2022. doi:10.1007/978-3-031-20062-5_40
Markdown
[Lin et al. "CycDA: Unsupervised Cycle Domain Adaptation to Learn from Image to Video." Proceedings of the European Conference on Computer Vision (ECCV), 2022.](https://mlanthology.org/eccv/2022/lin2022eccv-cycda/) doi:10.1007/978-3-031-20062-5_40
BibTeX
@inproceedings{lin2022eccv-cycda,
title = {{CycDA: Unsupervised Cycle Domain Adaptation to Learn from Image to Video}},
author = {Lin, Wei and Kukleva, Anna and Sun, Kunyang and Possegger, Horst and Kuehne, Hilde and Bischof, Horst},
booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
year = {2022},
doi = {10.1007/978-3-031-20062-5_40},
url = {https://mlanthology.org/eccv/2022/lin2022eccv-cycda/}
}