Cooking Activities Recognition in Egocentric Videos Using Hand Shape Feature with Openpose

Abstract

Wearable cameras have recently become easy to obtain, and recording egocentric videos with them is straightforward. Daily activity recognition from egocentric videos is therefore one of the hot topics in computer vision. In this research, we propose a new cooking activity recognition method for egocentric videos. Our proposed method has the following characteristics: 1) hand regions are detected as bounding boxes using SSD; 2) hand keypoints (articular points) are estimated using OpenPose, and hand features are extracted from the keypoint positions; and 3) a fully connected multilayer neural network recognizes cooking activities from the extracted features. Experimental results on our benchmark of eight cooking activities confirm that the proposed method recognizes cooking activities with 58.9% accuracy on average.
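The pipeline described in the abstract (keypoint features fed to a fully connected classifier) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the feature normalization (wrist-relative, scale-normalized coordinates), the hidden-layer size, and all function names are assumptions; only the 21-keypoint OpenPose hand model and the eight-class output come from the source.

```python
import numpy as np

N_KEYPOINTS = 21   # OpenPose's hand model outputs 21 keypoints per hand
N_CLASSES = 8      # eight cooking activities in the benchmark

def keypoints_to_feature(keypoints):
    """Turn (x, y) keypoints into a feature vector (assumed scheme):
    center on the wrist keypoint and normalize by hand scale."""
    kp = np.asarray(keypoints, dtype=float)        # shape (21, 2)
    centered = kp - kp[0]                          # wrist as origin
    scale = float(np.linalg.norm(centered, axis=1).max()) or 1.0
    return (centered / scale).ravel()              # shape (42,)

def mlp_forward(x, weights):
    """Forward pass of a fully connected network: ReLU hidden layers,
    softmax over the eight activity classes."""
    h = x
    for W, b in weights[:-1]:
        h = np.maximum(0.0, h @ W + b)             # ReLU
    W, b = weights[-1]
    logits = h @ W + b
    e = np.exp(logits - logits.max())              # stable softmax
    return e / e.sum()

# Toy usage with random weights (one hidden layer of 64 units, an assumption).
rng = np.random.default_rng(0)
weights = [
    (0.1 * rng.standard_normal((2 * N_KEYPOINTS, 64)), np.zeros(64)),
    (0.1 * rng.standard_normal((64, N_CLASSES)), np.zeros(N_CLASSES)),
]
feature = keypoints_to_feature(rng.random((N_KEYPOINTS, 2)))
probs = mlp_forward(feature, weights)              # class probabilities
```

In practice the weights would be trained on labeled keypoint sequences; the sketch only shows the feature extraction and inference path.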

Cite

Text

Okumura et al. "Cooking Activities Recognition in Egocentric Videos Using Hand Shape Feature with Openpose." International Joint Conference on Artificial Intelligence, 2018. doi:10.1145/3230519.3230591

Markdown

[Okumura et al. "Cooking Activities Recognition in Egocentric Videos Using Hand Shape Feature with Openpose." International Joint Conference on Artificial Intelligence, 2018.](https://mlanthology.org/ijcai/2018/okumura2018ijcai-cooking/) doi:10.1145/3230519.3230591

BibTeX

@inproceedings{okumura2018ijcai-cooking,
  title     = {{Cooking Activities Recognition in Egocentric Videos Using Hand Shape Feature with Openpose}},
  author    = {Okumura, Tsukasa and Urabe, Shuichi and Inoue, Katsufumi and Yoshioka, Michifumi},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2018},
  pages     = {42--45},
  doi       = {10.1145/3230519.3230591},
  url       = {https://mlanthology.org/ijcai/2018/okumura2018ijcai-cooking/}
}