Action2Activity: Recognizing Complex Activities from Sensor Data

Abstract

Compared to simple actions, activities are far more complex, yet they correspond more closely to the semantics of a human's daily life. Techniques for recognizing actions from sensor-generated data are mature; however, relatively little work has bridged the gap between actions and activities. To this end, this paper presents a novel approach for complex activity recognition comprising two components. The first component is temporal pattern mining, which provides a mid-level feature representation for activities, encodes temporal relatedness among actions, and captures the intrinsic properties of activities. The second component is adaptive Multi-Task Learning, which captures relatedness among activities and selects discriminative features. Extensive experiments on a real-world dataset demonstrate the effectiveness of our approach.
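The second component, multi-task learning with joint feature selection, is commonly realized with an l2,1-norm penalty that zeroes out entire rows of the task weight matrix, so all activity classifiers share the same small set of discriminative features. The sketch below is a generic illustration of that idea via proximal gradient descent; the function names and the least-squares loss are illustrative assumptions, not the paper's exact adaptive formulation.

```python
import numpy as np

def l21_prox(W, t):
    # Row-wise group soft-thresholding: rows of W with small l2 norm are
    # driven exactly to zero, performing feature selection shared by all tasks.
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - t / np.maximum(norms, 1e-12))
    return W * scale

def mtl_fit(Xs, ys, lam=0.1, lr=0.05, iters=1000):
    # Xs[t]: (n_t, d) feature matrix for task (activity) t; ys[t]: (n_t,) targets.
    # Minimizes sum_t ||X_t w_t - y_t||^2 / n_t + lam * ||W||_{2,1}
    # by proximal gradient descent on the stacked weight matrix W (d x T).
    d, T = Xs[0].shape[1], len(Xs)
    W = np.zeros((d, T))
    for _ in range(iters):
        G = np.zeros_like(W)
        for t, (X, y) in enumerate(zip(Xs, ys)):
            G[:, t] = X.T @ (X @ W[:, t] - y) / len(y)  # per-task gradient
        W = l21_prox(W - lr * G, lr * lam)              # shared sparsity step
    return W
```

On synthetic data where only the first two features carry signal, the remaining rows of the learned matrix shrink toward zero, illustrating how relatedness among tasks concentrates the model on shared discriminative features.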

Cite

Text

Liu et al. "Action2Activity: Recognizing Complex Activities from Sensor Data." International Joint Conference on Artificial Intelligence, 2015.

Markdown

[Liu et al. "Action2Activity: Recognizing Complex Activities from Sensor Data." International Joint Conference on Artificial Intelligence, 2015.](https://mlanthology.org/ijcai/2015/liu2015ijcai-action/)

BibTeX

@inproceedings{liu2015ijcai-action,
  title     = {{Action2Activity: Recognizing Complex Activities from Sensor Data}},
  author    = {Liu, Ye and Nie, Liqiang and Han, Lei and Zhang, Luming and Rosenblum, David S.},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2015},
  pages     = {1617--1623},
  url       = {https://mlanthology.org/ijcai/2015/liu2015ijcai-action/}
}