Multi-Video Temporal Synchronization by Matching Pose Features of Shared Moving Subjects
Abstract
Collaborative analysis of videos taken by multiple motion cameras from different and time-varying views can help solve many computer vision problems. However, such collaborative analysis usually requires the videos to be temporally synchronized, which can be inaccurate if we rely solely on camera clocks. In this paper, we propose to address this problem based on video content. More specifically, if multiple videos cover the same moving persons, these subjects should exhibit identical poses and pose changes at each aligned time point across these videos. Based on this idea, we develop a new Synchronization Network (SynNet), which includes a feature aggregation module, a matching cost volume, and several classification layers to infer the time offset between different videos by exploiting view-invariant human pose features. We conduct comprehensive experiments on the SYN, SPVideo and MPVideo datasets. The results show that the proposed method can accurately synchronize multiple motion-camera videos collected in the real world.
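The core idea of matching pose features across videos over candidate time offsets can be illustrated with a minimal sketch. Note this is not the paper's SynNet (which learns the matching with a feature aggregation module and classification layers); it is a hand-rolled stand-in that scores each candidate offset in a cost volume by mean cosine similarity of per-frame pose features and picks the best one. The function name and arguments are illustrative only.

```python
import numpy as np

def estimate_offset(feats_a, feats_b, max_offset):
    """Estimate the integer frame offset between two videos from per-frame
    pose features (T x D arrays). Conceptual sketch: score each candidate
    offset by the mean cosine similarity of overlapping frames."""
    # L2-normalize so dot products are cosine similarities.
    a = feats_a / np.linalg.norm(feats_a, axis=1, keepdims=True)
    b = feats_b / np.linalg.norm(feats_b, axis=1, keepdims=True)
    best_offset, best_score = 0, -np.inf
    for off in range(-max_offset, max_offset + 1):
        # Frame t of video A is hypothesized to align with frame t + off of B.
        if off >= 0:
            overlap = min(len(a), len(b) - off)
            if overlap <= 0:
                continue
            score = np.mean(np.sum(a[:overlap] * b[off:off + overlap], axis=1))
        else:
            overlap = min(len(a) + off, len(b))
            if overlap <= 0:
                continue
            score = np.mean(np.sum(a[-off:-off + overlap] * b[:overlap], axis=1))
        if score > best_score:
            best_offset, best_score = off, score
    return best_offset
```

In the paper the similarity scores over all candidate offsets form the matching cost volume, and classification layers predict the offset from it rather than a hard argmax.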
Cite
Text
Wu et al. "Multi-Video Temporal Synchronization by Matching Pose Features of Shared Moving Subjects." IEEE/CVF International Conference on Computer Vision Workshops, 2019. doi:10.1109/ICCVW.2019.00334
Markdown
[Wu et al. "Multi-Video Temporal Synchronization by Matching Pose Features of Shared Moving Subjects." IEEE/CVF International Conference on Computer Vision Workshops, 2019.](https://mlanthology.org/iccvw/2019/wu2019iccvw-multivideo/) doi:10.1109/ICCVW.2019.00334
BibTeX
@inproceedings{wu2019iccvw-multivideo,
title = {{Multi-Video Temporal Synchronization by Matching Pose Features of Shared Moving Subjects}},
author = {Wu, Xinyi and Wu, Zhenyao and Zhang, Yujun and Ju, Lili and Wang, Song},
booktitle = {IEEE/CVF International Conference on Computer Vision Workshops},
year = {2019},
  pages = {2729--2738},
doi = {10.1109/ICCVW.2019.00334},
url = {https://mlanthology.org/iccvw/2019/wu2019iccvw-multivideo/}
}