SSCAP: Self-Supervised Co-Occurrence Action Parsing for Unsupervised Temporal Action Segmentation
Abstract
Temporal action segmentation is the task of classifying each frame of a video with an action label. However, annotating every frame in a large corpus of videos to build a comprehensive supervised training dataset is quite expensive. In this work we therefore propose an unsupervised method, namely SSCAP, that operates on a corpus of unlabeled videos and predicts a likely set of temporal segments across the videos. SSCAP leverages self-supervised learning to extract distinguishable features and then applies a novel Co-occurrence Action Parsing algorithm to not only capture the correlation among sub-actions underlying the structure of activities, but also estimate the temporal path of the sub-actions in an accurate and general way. We evaluate on both classic datasets (Breakfast, 50Salads) and the emerging fine-grained action dataset FineGym, which has more complex activity structures and similar sub-actions. Results show that SSCAP achieves state-of-the-art performance on all datasets and can even outperform some weakly-supervised approaches, demonstrating its effectiveness and generalizability.
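The abstract only outlines the pipeline (self-supervised frame features, sub-action discovery, co-occurrence-based temporal parsing). The sketch below is a hypothetical, simplified illustration of that general idea, not the authors' SSCAP algorithm: it clusters toy frame features into sub-actions, builds a co-occurrence (transition) matrix across videos, and greedily decodes a plausible sub-action order. All names, cluster counts, and data shapes are assumptions for illustration.

```python
# Hypothetical sketch of a co-occurrence-style parsing pipeline.
# NOT the authors' SSCAP method; the feature extractor, cluster count,
# and decoding rule here are illustrative assumptions only.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Toy stand-in for self-supervised frame features:
# a list of videos, each of shape (num_frames, feature_dim).
videos = [rng.normal(size=(200, 32)) for _ in range(5)]

K = 4  # assumed number of sub-action clusters
frames = np.concatenate(videos, axis=0)
labels = KMeans(n_clusters=K, n_init=10, random_state=0).fit_predict(frames)

# Split cluster labels back into per-video sequences.
splits = np.cumsum([len(v) for v in videos])[:-1]
per_video_labels = np.split(labels, splits)

# Co-occurrence (transition) counts between consecutive frame clusters.
cooc = np.zeros((K, K))
for seq in per_video_labels:
    for a, b in zip(seq[:-1], seq[1:]):
        cooc[a, b] += 1

# Greedy temporal-path decoding: start from the cluster that most often
# opens a video, then repeatedly follow the strongest outgoing transition
# to an unvisited cluster.
start = np.bincount([seq[0] for seq in per_video_labels], minlength=K).argmax()
order, visited = [int(start)], {int(start)}
while len(order) < K:
    nxt = max((c for c in range(K) if c not in visited),
              key=lambda c: cooc[order[-1], c])
    order.append(nxt)
    visited.add(nxt)

print("estimated sub-action order:", order)
```

With real self-supervised features in place of the random arrays, the same structure (cluster, count co-occurrences, decode an ordering, then assign frames along that path) gives a rough feel for how unsupervised segmentation of this kind can be organized.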
Cite
Text
Wang et al. "SSCAP: Self-Supervised Co-Occurrence Action Parsing for Unsupervised Temporal Action Segmentation." Winter Conference on Applications of Computer Vision, 2022.
Markdown
[Wang et al. "SSCAP: Self-Supervised Co-Occurrence Action Parsing for Unsupervised Temporal Action Segmentation." Winter Conference on Applications of Computer Vision, 2022.](https://mlanthology.org/wacv/2022/wang2022wacv-sscap/)
BibTeX
@inproceedings{wang2022wacv-sscap,
title = {{SSCAP: Self-Supervised Co-Occurrence Action Parsing for Unsupervised Temporal Action Segmentation}},
author = {Wang, Zhe and Chen, Hao and Li, Xinyu and Liu, Chunhui and Xiong, Yuanjun and Tighe, Joseph and Fowlkes, Charless},
booktitle = {Winter Conference on Applications of Computer Vision},
year = {2022},
pages = {1819--1828},
url = {https://mlanthology.org/wacv/2022/wang2022wacv-sscap/}
}