Uncorrelated Multilinear Principal Component Analysis Through Successive Variance Maximization
Abstract
Tensorial data are frequently encountered in machine learning tasks, and dimensionality reduction is one of their most important applications. This paper extends classical principal component analysis (PCA) to its multilinear version by proposing a novel dimensionality reduction algorithm for tensorial data, named uncorrelated multilinear PCA (UMPCA). UMPCA seeks a tensor-to-vector projection that captures most of the variation in the original tensorial input while producing uncorrelated features through successive variance maximization. We evaluate the proposed algorithm on a second-order tensorial problem, face recognition; the experimental results show its superiority, especially in low-dimensional spaces, in comparison with three other PCA-based algorithms.
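The core idea of a tensor-to-vector projection with variance maximization can be illustrated on second-order tensors (matrices): each scalar feature comes from an elementary multilinear projection y_m = u1ᵀ X_m u2, and the projection vectors are fitted by alternating eigen-decompositions. The sketch below is a hedged, simplified illustration on synthetic data, assuming centered samples and omitting the zero-correlation constraints that distinguish the full UMPCA algorithm; all variable names are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic second-order tensor samples (tiny "images"): M samples of size I1 x I2.
M, I1, I2 = 100, 8, 6
X = rng.standard_normal((M, I1, I2))
X -= X.mean(axis=0)  # center the samples

# One elementary multilinear projection y_m = u1^T X_m u2, fitted by
# alternating variance maximization (sketch only; the full UMPCA adds
# uncorrelatedness constraints between successive features).
u1 = np.ones(I1) / np.sqrt(I1)
u2 = np.ones(I2) / np.sqrt(I2)
for _ in range(20):
    # Fix u2: partial projections g_m = X_m u2; maximize the variance of u1^T g_m
    G = X @ u2                          # shape (M, I1)
    S1 = G.T @ G                        # mode-1 scatter matrix
    _, V = np.linalg.eigh(S1)
    u1 = V[:, -1]                       # leading eigenvector (unit norm)
    # Fix u1: partial projections h_m = X_m^T u1; maximize the variance of u2^T h_m
    H = np.einsum('i,mij->mj', u1, X)   # shape (M, I2)
    S2 = H.T @ H                        # mode-2 scatter matrix
    _, V = np.linalg.eigh(S2)
    u2 = V[:, -1]

# One scalar feature per sample; further features would be extracted
# successively, each constrained to be uncorrelated with the previous ones.
y = np.einsum('i,mij,j->m', u1, X, u2)
```

Alternating over the two modes is what makes the projection multilinear: each subproblem is an ordinary PCA-style eigenproblem on partial projections, so each step cannot decrease the captured variance.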
Cite
Text
Lu et al. "Uncorrelated Multilinear Principal Component Analysis Through Successive Variance Maximization." International Conference on Machine Learning, 2008. doi:10.1145/1390156.1390234
Markdown
[Lu et al. "Uncorrelated Multilinear Principal Component Analysis Through Successive Variance Maximization." International Conference on Machine Learning, 2008.](https://mlanthology.org/icml/2008/lu2008icml-uncorrelated/) doi:10.1145/1390156.1390234
BibTeX
@inproceedings{lu2008icml-uncorrelated,
title = {{Uncorrelated Multilinear Principal Component Analysis Through Successive Variance Maximization}},
author = {Lu, Haiping and Plataniotis, Konstantinos N. and Venetsanopoulos, Anastasios N.},
booktitle = {International Conference on Machine Learning},
year = {2008},
pages = {616-623},
doi = {10.1145/1390156.1390234},
url = {https://mlanthology.org/icml/2008/lu2008icml-uncorrelated/}
}