Human Activity Encoding and Recognition Using Low-Level Visual Features
Zheshen Wang, Baoxin Li

Abstract
Automatic recognition of human activities is among the key capabilities of many intelligent systems with vision/perception. Most existing approaches to this problem require sophisticated feature extraction before classification can be performed. This paper presents a novel approach to human action recognition using only simple low-level visual features: motion captured by direct frame differencing. A codebook of key poses is first created from the training data through unsupervised clustering. Videos of actions are then coded as sequences of super-frames, defined as the key poses augmented with discriminative attributes. A weighted-sequence distance is proposed for comparing two super-frame sequences, which is then wrapped as a kernel and embedded in an SVM classifier for the final classification. Compared with conventional methods, our approach provides a flexible non-parametric sequential structure with a corresponding distance measure for human action representation and classification, without requiring complex feature extraction. The effectiveness of our approach is demonstrated on the widely used KTH human activity dataset, on which the proposed method outperforms the existing state of the art.
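The encoding stage described in the abstract (frame differencing, then unsupervised clustering into a codebook of key poses, then coding each video as a sequence of codewords) can be sketched as follows. This is a minimal illustration under assumed details: grid-pooled absolute frame differences as the low-level feature and plain k-means as the clustering step; the function names, grid size, and clustering choice are illustrative, not taken from the paper, and the paper's discriminative super-frame attributes and weighted-sequence distance are omitted.

```python
import numpy as np

def motion_features(frames, grid=(8, 8)):
    """Low-level motion features: absolute difference between consecutive
    frames, average-pooled over a coarse spatial grid (assumed detail)."""
    gh, gw = grid
    feats = []
    for prev, cur in zip(frames, frames[1:]):
        diff = np.abs(cur.astype(float) - prev.astype(float))
        h, w = diff.shape
        # Mean motion energy per grid cell -> one feature vector per frame pair.
        cells = [diff[i * h // gh:(i + 1) * h // gh,
                      j * w // gw:(j + 1) * w // gw].mean()
                 for i in range(gh) for j in range(gw)]
        feats.append(cells)
    return np.array(feats)

def build_codebook(features, k=4, iters=20, seed=0):
    """Unsupervised clustering of per-frame motion features; the cluster
    centers play the role of the codebook of key poses. Plain k-means is
    used here as a stand-in for whatever clustering the paper employs."""
    rng = np.random.default_rng(seed)
    centers = features[rng.choice(len(features), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((features[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = features[labels == c].mean(axis=0)
    return centers

def encode(features, centers):
    """Code a video as a sequence of key-pose (codeword) indices by
    nearest-center assignment."""
    return np.argmin(((features[:, None] - centers[None]) ** 2).sum(-1), axis=1)
```

A video then becomes a short symbol sequence (`encode(motion_features(frames), codebook)`), which is the kind of non-parametric sequential representation the paper's weighted-sequence distance and SVM kernel operate on.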
Cite
Text
Wang and Li. "Human Activity Encoding and Recognition Using Low-Level Visual Features." International Joint Conference on Artificial Intelligence, 2009.
Markdown
[Wang and Li. "Human Activity Encoding and Recognition Using Low-Level Visual Features." International Joint Conference on Artificial Intelligence, 2009.](https://mlanthology.org/ijcai/2009/wang2009ijcai-human/)
BibTeX
@inproceedings{wang2009ijcai-human,
title = {{Human Activity Encoding and Recognition Using Low-Level Visual Features}},
author = {Wang, Zheshen and Li, Baoxin},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2009},
pages = {1876--1883},
url = {https://mlanthology.org/ijcai/2009/wang2009ijcai-human/}
}