Efficient Action Detection in Untrimmed Videos via Multi-Task Learning

Abstract

This paper studies the joint learning of action recognition and temporal localization in long, untrimmed videos. We employ a multi-task learning framework that performs the three highly related steps of action proposal, action recognition, and action localization refinement in parallel, instead of the standard sequential pipeline that performs the steps in order. We develop a novel temporal actionness regression module that estimates what proportion of a clip contains action. We use it for temporal localization, but it could have other applications such as video retrieval, surveillance, and summarization. We also introduce random shear augmentation during training to simulate viewpoint change. We evaluate our framework on three popular video benchmarks. Results demonstrate that our joint model is efficient in terms of storage and computation in that we do not need to compute and cache dense trajectory features, and that it is several times faster than its sequential ConvNets counterpart. Yet, despite being more efficient, it outperforms state-of-the-art methods with respect to accuracy.
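The actionness regression target described in the abstract — the proportion of a clip that contains action — can be sketched as a simple overlap computation. This is an illustrative helper, not the authors' code; the function name and the assumption of non-overlapping ground-truth intervals are ours.

```python
def actionness_target(clip_start, clip_end, action_intervals):
    """Fraction of the clip [clip_start, clip_end) covered by ground-truth
    action intervals, a plausible regression target for the actionness
    module described in the abstract.

    Assumes action_intervals are non-overlapping (s, e) pairs in the same
    time units as the clip boundaries; this helper is a hypothetical sketch.
    """
    clip_len = clip_end - clip_start
    if clip_len <= 0:
        return 0.0
    covered = 0.0
    for s, e in action_intervals:
        # Accumulate the intersection of each action interval with the clip.
        covered += max(0.0, min(clip_end, e) - max(clip_start, s))
    # Clamp in case of numerical slack; the target lies in [0, 1].
    return min(covered / clip_len, 1.0)
```

For example, a 10-second clip that overlaps a single action spanning seconds 2 to 7 would receive a target of 0.5, while a fully background clip receives 0.0.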

Cite

Text

Zhu and Newsam. "Efficient Action Detection in Untrimmed Videos via Multi-Task Learning." IEEE/CVF Winter Conference on Applications of Computer Vision, 2017. doi:10.1109/WACV.2017.29

Markdown

[Zhu and Newsam. "Efficient Action Detection in Untrimmed Videos via Multi-Task Learning." IEEE/CVF Winter Conference on Applications of Computer Vision, 2017.](https://mlanthology.org/wacv/2017/zhu2017wacv-efficient/) doi:10.1109/WACV.2017.29

BibTeX

@inproceedings{zhu2017wacv-efficient,
  title     = {{Efficient Action Detection in Untrimmed Videos via Multi-Task Learning}},
  author    = {Zhu, Yi and Newsam, Shawn D.},
  booktitle = {IEEE/CVF Winter Conference on Applications of Computer Vision},
  year      = {2017},
  pages     = {197--206},
  doi       = {10.1109/WACV.2017.29},
  url       = {https://mlanthology.org/wacv/2017/zhu2017wacv-efficient/}
}