Action Detection with Improved Dense Trajectories and Sliding Window
Abstract
In this paper we describe an action/interaction detection system based on improved dense trajectories [19], multiple visual descriptors, and a bag-of-features representation. Since the actions/interactions are not mutually exclusive, we train a binary classifier for each predefined action/interaction. We rely on a non-overlapping temporal sliding window for temporal localization. We tested our system on the ChaLearn Looking at People Challenge 2014 Track 2 dataset [1, 2] and obtained an average overlap of 0.4226, placing 3rd in this track of the challenge. Finally, we provide an extensive analysis of the system's performance on different actions and suggest possible ways to improve a general action detection system.
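The two core ideas in the abstract, non-overlapping temporal sliding windows and the average-overlap (Jaccard) score used by the challenge, can be sketched in a few lines. This is a minimal illustration, not the authors' implementation; the function names and the fixed window length are assumptions for the example.

```python
def sliding_windows(num_frames, window):
    """Split a video of num_frames into non-overlapping temporal windows.

    Each window is a (start, end) frame range; the last window may be shorter.
    The window length here is a hypothetical fixed parameter.
    """
    return [(s, min(s + window, num_frames)) for s in range(0, num_frames, window)]


def interval_overlap(pred, gt):
    """Jaccard overlap between two temporal intervals (start, end).

    Intersection over union of the two frame ranges; 0.0 if they are disjoint.
    """
    inter = max(0, min(pred[1], gt[1]) - max(pred[0], gt[0]))
    union = max(pred[1], gt[1]) - min(pred[0], gt[0])
    return inter / union if union > 0 else 0.0


# Example: a 10-frame clip split into windows of 4 frames,
# and the overlap of a predicted detection against a ground-truth interval.
windows = sliding_windows(10, 4)          # [(0, 4), (4, 8), (8, 10)]
score = interval_overlap((0, 4), (2, 6))  # intersection 2, union 6
```

In a detection pipeline of this kind, each window would be scored independently by every per-class binary classifier, and the final average overlap would be computed across all ground-truth action instances.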
Cite
Shu et al. "Action Detection with Improved Dense Trajectories and Sliding Window." European Conference on Computer Vision Workshops, 2014. doi:10.1007/978-3-319-16178-5_38
@inproceedings{shu2014eccvw-action,
title = {{Action Detection with Improved Dense Trajectories and Sliding Window}},
author = {Shu, Zhixin and Yun, Kiwon and Samaras, Dimitris},
booktitle = {European Conference on Computer Vision Workshops},
year = {2014},
pages = {541--551},
doi = {10.1007/978-3-319-16178-5_38},
url = {https://mlanthology.org/eccvw/2014/shu2014eccvw-action/}
}