Scaling Laws and Local Minima in Hebbian ICA

Abstract

We study the dynamics of a Hebbian ICA algorithm extracting a sin- gle non-Gaussian component from a high-dimensional Gaussian back- ground. For both on-line and batch learning we find that a surprisingly large number of examples are required to avoid trapping in a sub-optimal state close to the initial conditions. To extract a skewed signal at least examples are required for -dimensional data and

Cite

Text

Rattray and Basalyga. "Scaling Laws and Local Minima in Hebbian ICA." Neural Information Processing Systems, 2001.

Markdown

[Rattray and Basalyga. "Scaling Laws and Local Minima in Hebbian ICA." Neural Information Processing Systems, 2001.](https://mlanthology.org/neurips/2001/rattray2001neurips-scaling/)

BibTeX

@inproceedings{rattray2001neurips-scaling,
  title     = {{Scaling Laws and Local Minima in Hebbian ICA}},
  author    = {Rattray, Magnus and Basalyga, Gleb},
  booktitle = {Neural Information Processing Systems},
  year      = {2001},
  pages     = {495-501},
  url       = {https://mlanthology.org/neurips/2001/rattray2001neurips-scaling/}
}