Unification of Information Maximization and Minimization

Abstract

In the present paper, we propose a method to unify information maximization and minimization in hidden units. Information maximization and minimization are performed on two different levels, collective and individual, so two kinds of information are defined: collective information and individual information. By maximizing collective information and minimizing individual information, networks that are simple in terms of the number of connections and the number of hidden units can be generated. The resulting networks are expected to give better generalization and an improved interpretation of internal representations. The method was applied to the inference of the maximum onset principle of an artificial language. In this problem, it was shown that individual information minimization is not contradictory to collective information maximization. In addition, experimental results confirmed improved generalization performance, because over-training can be significantly suppressed.
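The abstract contrasts information measured over the whole hidden layer (collective) with information measured per hidden unit (individual). As a minimal sketch, assuming an entropy-reduction reading of these quantities (the paper's exact definitions differ in detail), one can estimate both from a matrix of hidden-unit activations; the function name and the Bernoulli treatment of single units below are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

def collective_and_individual_information(V):
    """Hedged sketch of the two information measures.

    V: (S patterns x M hidden units) array of activations in (0, 1).
    Returns (collective, individual) where `collective` is a scalar
    entropy reduction for the whole layer and `individual` is a
    per-unit array of entropy reductions.
    """
    S, M = V.shape
    # p(j|s): distribution over hidden units for each input pattern s.
    p = V / V.sum(axis=1, keepdims=True)
    # Collective information: maximum entropy log(M) minus the average
    # conditional entropy of the hidden layer given the input.
    cond_entropy = -(p * np.log(p)).sum(axis=1).mean()
    collective = np.log(M) - cond_entropy

    # Individual information: treat each unit as a Bernoulli variable
    # (an illustrative assumption) and take the reduction from the
    # entropy of its mean firing rate to its average per-pattern entropy.
    def bern_entropy(r):
        r = np.clip(r, 1e-12, 1.0 - 1e-12)
        return -(r * np.log(r) + (1.0 - r) * np.log(1.0 - r))

    q = V.mean(axis=0)  # unconditional firing rate of each unit
    individual = bern_entropy(q) - bern_entropy(V).mean(axis=0)
    return collective, individual
```

Under this sketch, both quantities are non-negative (by Jensen's inequality), so "maximize collective, minimize individual" corresponds to driving the layer-level term up while pushing each per-unit term toward zero, which prunes units whose responses barely vary across patterns.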

Cite

Text

Kamimura. "Unification of Information Maximization and Minimization." Neural Information Processing Systems, 1996.

Markdown

[Kamimura. "Unification of Information Maximization and Minimization." Neural Information Processing Systems, 1996.](https://mlanthology.org/neurips/1996/kamimura1996neurips-unification/)

BibTeX

@inproceedings{kamimura1996neurips-unification,
  title     = {{Unification of Information Maximization and Minimization}},
  author    = {Kamimura, Ryotaro},
  booktitle = {Neural Information Processing Systems},
  year      = {1996},
  pages     = {508-514},
  url       = {https://mlanthology.org/neurips/1996/kamimura1996neurips-unification/}
}