Tensor Canonical Correlation Analysis for Action Classification

Abstract

We introduce a new framework, tensor canonical correlation analysis (TCCA), which extends classical canonical correlation analysis (CCA) to multidimensional data arrays (tensors), and apply it to action/gesture classification in videos. Tensor CCA inspects joint space-time linear relationships between two video volumes to yield flexible and descriptive similarity features of the two videos. The TCCA features are combined with a discriminative feature selection scheme and a nearest-neighbor classifier for action classification. In addition, we propose a time-efficient action detection method, based on dynamic learning of subspaces for tensor CCA, for the case where actions are not aligned in the space-time domain. The proposed method delivers significantly better accuracy than, and comparable detection speed to, state-of-the-art methods on the KTH action data set as well as self-recorded hand gesture data sets.
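As background for readers unfamiliar with the classical method that TCCA generalizes, the sketch below computes standard (vector) CCA via an SVD of the whitened cross-covariance; the canonical correlations it returns are the kind of similarity scores that TCCA extends to tensor-valued video volumes. This is an illustrative implementation under common textbook conventions, not the authors' code, and the small regularizer `reg` is an assumption added for numerical stability.

```python
import numpy as np

def cca(X, Y, reg=1e-6):
    """Classical CCA: canonical correlations (descending) between
    data matrices X (n x p) and Y (n x q) sharing n samples."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    # Sample (cross-)covariances, with a small ridge for stability.
    Sxx = X.T @ X / (n - 1) + reg * np.eye(X.shape[1])
    Syy = Y.T @ Y / (n - 1) + reg * np.eye(Y.shape[1])
    Sxy = X.T @ Y / (n - 1)

    def inv_sqrt(S):
        # Inverse square root of a symmetric positive-definite matrix.
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    # Singular values of the whitened cross-covariance are the
    # canonical correlations.
    M = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    corrs = np.linalg.svd(M, compute_uv=False)
    return np.clip(corrs, 0.0, 1.0)

# Two views driven by a shared latent signal z should yield a leading
# canonical correlation close to 1; the remaining ones stay near 0.
rng = np.random.default_rng(0)
z = rng.standard_normal((500, 1))
X = np.hstack([z, rng.standard_normal((500, 2))])
Y = np.hstack([z + 0.1 * rng.standard_normal((500, 1)),
               rng.standard_normal((500, 2))])
print(cca(X, Y))
```

In the paper's setting the two inputs are whole space-time video volumes rather than vector samples, and the resulting correlation features feed the discriminative selection and nearest-neighbor stages described above.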

Cite

Text

Kim et al. "Tensor Canonical Correlation Analysis for Action Classification." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2007. doi:10.1109/CVPR.2007.383137

Markdown

[Kim et al. "Tensor Canonical Correlation Analysis for Action Classification." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2007.](https://mlanthology.org/cvpr/2007/kim2007cvpr-tensor/) doi:10.1109/CVPR.2007.383137

BibTeX

@inproceedings{kim2007cvpr-tensor,
  title     = {{Tensor Canonical Correlation Analysis for Action Classification}},
  author    = {Kim, Tae-Kyun and Wong, Shu-Fai and Cipolla, Roberto},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year      = {2007},
  doi       = {10.1109/CVPR.2007.383137},
  url       = {https://mlanthology.org/cvpr/2007/kim2007cvpr-tensor/}
}