R1-PCA: Rotational Invariant L1-Norm Principal Component Analysis for Robust Subspace Factorization
Abstract
Principal component analysis (PCA) minimizes the sum of squared errors (L2-norm) and is sensitive to the presence of outliers. We propose a rotational invariant L1-norm PCA (R1-PCA). R1-PCA is similar to PCA in that (1) it has a unique global solution, (2) the solution consists of the principal eigenvectors of a robust covariance matrix (re-weighted to soften the effect of outliers), and (3) the solution is rotationally invariant. These properties are not shared by L1-norm PCA. A new subspace iteration algorithm is given to compute R1-PCA efficiently. Experiments on several real-life datasets show that R1-PCA can effectively handle outliers. We extend the R1-norm to K-means clustering and show that L1-norm K-means leads to poor results, while R1-K-means outperforms standard K-means.
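The re-weighted covariance and subspace iteration described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes centered data, initializes with ordinary PCA, and uses a Huber-style weight cutoff at the median residual (a choice made here for numerical stability; the paper's exact weighting scheme may differ).

```python
import numpy as np

def r1_pca(X, k, n_iter=50, eps=1e-8):
    """Illustrative R1-PCA via re-weighted subspace iteration.

    X: (n_samples, n_features) centered data; k: subspace dimension.
    Each iteration down-weights points far from the current subspace,
    then takes the top-k eigenvectors of the re-weighted covariance.
    """
    # initialize with the ordinary (L2) PCA directions
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    U = Vt[:k].T                        # (d, k) orthonormal basis
    for _ in range(n_iter):
        # residual distance of each point to the current subspace
        R = X - X @ U @ U.T
        r = np.sqrt((R ** 2).sum(axis=1))
        # Huber-style weights: full weight for well-fit points,
        # reduced weight for points with large residuals (outliers)
        c = np.median(r)
        w = np.minimum(1.0, c / np.maximum(r, eps))
        # re-weighted covariance matrix X^T diag(w) X
        C = (X * w[:, None]).T @ X
        # its principal eigenvectors give the updated subspace
        vals, vecs = np.linalg.eigh(C)  # eigenvalues in ascending order
        U = vecs[:, -k:]
    return U
```

Because each step only re-weights points and re-solves an eigenproblem, the returned basis stays orthonormal throughout, and a rotation of the input coordinates rotates the recovered subspace accordingly.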
Cite
Text
Ding et al. "R1-PCA: Rotational Invariant L1-Norm Principal Component Analysis for Robust Subspace Factorization." International Conference on Machine Learning, 2006. doi:10.1145/1143844.1143880
Markdown
[Ding et al. "R1-PCA: Rotational Invariant L1-Norm Principal Component Analysis for Robust Subspace Factorization." International Conference on Machine Learning, 2006.](https://mlanthology.org/icml/2006/ding2006icml-r/) doi:10.1145/1143844.1143880
BibTeX
@inproceedings{ding2006icml-r,
title = {{R1-PCA: Rotational Invariant L1-Norm Principal Component Analysis for Robust Subspace Factorization}},
author = {Ding, Chris H. Q. and Zhou, Ding and He, Xiaofeng and Zha, Hongyuan},
booktitle = {International Conference on Machine Learning},
year = {2006},
pages = {281-288},
doi = {10.1145/1143844.1143880},
url = {https://mlanthology.org/icml/2006/ding2006icml-r/}
}