Nonnegative Sparse PCA
Abstract
We describe a nonnegative variant of the "Sparse PCA" problem. The goal is to create a low-dimensional representation of a collection of points that, on the one hand, maximizes the variance of the projected points and, on the other, uses only a subset of the original coordinates, thereby creating a sparse representation. What distinguishes our problem from other Sparse PCA formulations is that the projection involves only nonnegative weights of the original coordinates, a desired quality in various fields, including economics, bioinformatics, and computer vision. Adding nonnegativity contributes to sparseness, as it enforces a partitioning of the original coordinates among the new axes. We describe a simple yet efficient iterative coordinate-descent scheme that converges to a local optimum of our optimization criterion and gives good results on large real-world datasets.
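As a rough illustration of the kind of constraint the abstract describes (and not the authors' coordinate-descent algorithm), the sketch below computes a single nonnegative, sparse loading vector with a thresholded power-iteration heuristic: each update multiplies by the sample covariance, soft-thresholds to encourage sparsity, clips negative entries to zero, and renormalizes. The function name, the sparsity parameter, and the specific update rule are illustrative assumptions.

import numpy as np

def nonnegative_sparse_loading(X, sparsity=0.1, n_iter=200, seed=0):
    """Illustrative sketch only, not the paper's exact scheme: find one
    nonnegative, sparse loading vector by a power-iteration-style update
    with soft-thresholding and projection onto the nonnegative orthant."""
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(axis=0)              # center the points
    C = Xc.T @ Xc / Xc.shape[0]          # sample covariance matrix
    w = np.abs(rng.standard_normal(C.shape[0]))
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        v = C @ w                        # power-iteration step (variance direction)
        v = np.maximum(v - sparsity, 0)  # soft-threshold and keep weights nonnegative
        norm = np.linalg.norm(v)
        if norm == 0:                    # sparsity penalty zeroed everything out
            break
        w = v / norm
    return w

# Toy usage: 200 points in 10 dimensions
X = np.random.default_rng(1).standard_normal((200, 10))
w = nonnegative_sparse_loading(X, sparsity=0.2)
print(w)   # a nonnegative, mostly-zero loading vector

Larger values of the (hypothetical) sparsity parameter zero out more coordinates; in the paper's formulation it is the nonnegativity constraint itself, applied across several components at once, that drives the coordinates toward a partition among the new axes.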
Cite
Text
Zass and Shashua. "Nonnegative Sparse PCA." Neural Information Processing Systems, 2006.
Markdown
[Zass and Shashua. "Nonnegative Sparse PCA." Neural Information Processing Systems, 2006.](https://mlanthology.org/neurips/2006/zass2006neurips-nonnegative/)
BibTeX
@inproceedings{zass2006neurips-nonnegative,
title = {{Nonnegative Sparse PCA}},
author = {Zass, Ron and Shashua, Amnon},
booktitle = {Neural Information Processing Systems},
year = {2006},
pages = {1561--1568},
url = {https://mlanthology.org/neurips/2006/zass2006neurips-nonnegative/}
}