An Information Maximization Approach to Overcomplete and Recurrent Representations

Abstract

The principle of maximizing mutual information is applied to learning overcomplete and recurrent representations. The underlying model consists of a network of input units driving a larger number of output units with recurrent interactions. In the limit of zero noise, the network is deterministic and the mutual information can be related to the entropy of the output units. Maximizing this entropy with respect to both the feedforward connections and the recurrent interactions results in simple learning rules for both sets of parameters. The conventional independent component analysis (ICA) learning algorithm can be recovered as a special case where there is an equal number of output units and no recurrent connections. The application of these new learning rules is illustrated on a simple two-dimensional input example.
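In the square, non-recurrent special case the entropy-maximization rule reduces to the conventional infomax ICA algorithm of Bell and Sejnowski. Below is a minimal NumPy sketch of that special case, using the natural-gradient form of the update with a logistic output nonlinearity; the function name, step size, and the Laplacian test sources are illustrative choices, not taken from the paper.

import numpy as np

def infomax_ica(X, lr=0.01, n_iter=500, seed=0):
    # Square, feedforward special case: infomax ICA with the
    # natural-gradient update dW = lr * (I + (1 - 2y) u^T) W,
    # where u = W x and y = sigmoid(u). X has shape (samples, n).
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    W = np.eye(n) + 0.1 * rng.standard_normal((n, n))
    I = np.eye(n)
    for _ in range(n_iter):
        U = X @ W.T                      # linear outputs u = W x
        Y = 1.0 / (1.0 + np.exp(-U))     # logistic squashing y = g(u)
        phi_uT = ((1.0 - 2.0 * Y).T @ U) / len(X)  # batch mean of (1 - 2y) u^T
        W += lr * (I + phi_uT) @ W       # entropy-ascent step
    return W

# Unmixing two super-Gaussian (Laplacian) sources; the paper's own
# illustration is likewise a two-dimensional input example.
S = np.random.default_rng(1).laplace(size=(5000, 2))
A = np.array([[1.0, 0.6], [0.4, 1.0]])  # hypothetical mixing matrix
W = infomax_ica(S @ A.T)
print(W @ A)  # roughly a scaled permutation if separation succeeded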

Cite

Text

Shriki et al. "An Information Maximization Approach to Overcomplete and Recurrent Representations." Neural Information Processing Systems, 2000.

Markdown

[Shriki et al. "An Information Maximization Approach to Overcomplete and Recurrent Representations." Neural Information Processing Systems, 2000.](https://mlanthology.org/neurips/2000/shriki2000neurips-information/)

BibTeX

@inproceedings{shriki2000neurips-information,
  title     = {{An Information Maximization Approach to Overcomplete and Recurrent Representations}},
  author    = {Shriki, Oren and Sompolinsky, Haim and Lee, Daniel D.},
  booktitle = {Neural Information Processing Systems},
  year      = {2000},
  pages     = {612--618},
  url       = {https://mlanthology.org/neurips/2000/shriki2000neurips-information/}
}