Multitask Principal Component Analysis
Abstract
Principal Component Analysis (PCA) is a canonical and well-studied tool for dimensionality reduction. However, when few data samples are available, the poor quality of the covariance estimator at its core may compromise its performance. We address this issue by casting PCA into a multitask framework and, in doing so, show how to solve several related PCA problems simultaneously. Hence, we propose a new formulation of the PCA problem relying on a novel regularization. This regularization is based on a distance between subspaces, and the whole problem is solved as an optimization problem over a Riemannian manifold. We experimentally demonstrate the usefulness of our approach as a pre-processing step for EEG signals.
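As a rough illustration of the setting (not the paper's method), the sketch below computes plain PCA from the sample covariance and evaluates one common distance between subspaces, the Frobenius distance between orthogonal projectors; the paper's exact regularizer and optimization on the Riemannian manifold may differ.

```python
import numpy as np

def pca_subspace(X, k):
    """Return an orthonormal basis of the top-k principal subspace of X."""
    Xc = X - X.mean(axis=0)                  # center the data
    C = Xc.T @ Xc / (len(X) - 1)             # sample covariance estimate
    eigvals, eigvecs = np.linalg.eigh(C)     # ascending eigenvalues
    return eigvecs[:, ::-1][:, :k]           # columns: top-k eigenvectors

def projector_distance(W1, W2):
    """Frobenius distance between the projectors onto two subspaces
    (one common subspace distance; the paper's choice may differ)."""
    return np.linalg.norm(W1 @ W1.T - W2 @ W2.T, 'fro')

# With only 10 samples in 5 dimensions, the covariance estimate is noisy;
# this small-sample regime is the failure mode the multitask setup targets.
rng = np.random.default_rng(0)
X1 = rng.normal(size=(10, 5))
X2 = rng.normal(size=(10, 5))
W1 = pca_subspace(X1, 2)
W2 = pca_subspace(X2, 2)
print(W1.shape)                        # (5, 2)
print(np.allclose(W1.T @ W1, np.eye(2)))   # orthonormal basis: True
print(projector_distance(W1, W2) >= 0)     # a valid (nonnegative) distance
```

In a multitask variant, such a distance term could penalize disagreement between the per-task subspaces, coupling the related PCA problems.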
Cite
Text
Yamane et al. "Multitask Principal Component Analysis." Proceedings of The 8th Asian Conference on Machine Learning, 2016.
Markdown
[Yamane et al. "Multitask Principal Component Analysis." Proceedings of The 8th Asian Conference on Machine Learning, 2016.](https://mlanthology.org/acml/2016/yamane2016acml-multitask/)
BibTeX
@inproceedings{yamane2016acml-multitask,
title = {{Multitask Principal Component Analysis}},
author = {Yamane, Ikko and Yger, Florian and Berar, Maxime and Sugiyama, Masashi},
booktitle = {Proceedings of The 8th Asian Conference on Machine Learning},
year = {2016},
pages = {302--317},
volume = {63},
url = {https://mlanthology.org/acml/2016/yamane2016acml-multitask/}
}