Dynamic Cell Structures

Abstract

Dynamic Cell Structures (DCS) represent a family of artificial neural architectures suited both for unsupervised and supervised learning. They belong to the recently [Martinetz94] introduced class of Topology Representing Networks (TRN) which build perfectly topology preserving feature maps. DCS employ a modified Kohonen learning rule in conjunction with competitive Hebbian learning. The Kohonen type learning rule serves to adjust the synaptic weight vectors while Hebbian learning establishes a dynamic lateral connection structure between the units reflecting the topology of the feature manifold. In case of supervised learning, i.e. function approximation, each neural unit implements a Radial Basis Function, and an additional layer of linear output units adjusts according to a delta-rule. DCS is the first RBF-based approximation scheme attempting to concurrently learn and utilize a perfectly topology preserving map for improved performance. Simulations on a selection of CMU-Benchmarks indicate that the DCS idea applied to the Growing Cell Structure algorithm [Fritzke93] leads to an efficient and elegant algorithm that can beat conventional models on similar tasks.
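The supervised DCS recipe sketched in the abstract combines three ingredients: a Kohonen-type update of the unit centers, competitive Hebbian learning of lateral connections between the two best-matching units, and a delta-rule for a linear output layer on top of Gaussian RBF activations. A minimal illustrative sketch of that combination is below; it is not the authors' implementation (in particular it omits the growing mechanism of Growing Cell Structures and the edge-aging used by Topology Representing Networks), and all parameter values and the toy target function are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: 10 units approximating y = sin(x) on [0, 2*pi].
# All hyperparameters below are illustrative choices, not from the paper.
N, lr_win, lr_nbr, lr_out, sigma = 10, 0.1, 0.01, 0.05, 0.5
centers = rng.uniform(0.0, 2 * np.pi, size=(N, 1))  # synaptic weight vectors
out_w = np.zeros(N)                                 # linear output weights
C = np.zeros((N, N))                                # lateral connection matrix

def rbf(x):
    # Gaussian radial basis activations of all N units for input x.
    return np.exp(-np.sum((centers - x) ** 2, axis=1) / (2 * sigma ** 2))

for _ in range(2000):
    x = rng.uniform(0.0, 2 * np.pi, size=1)
    y = np.sin(x[0])

    # Find best- and second-best-matching units.
    d = np.linalg.norm(centers - x, axis=1)
    bmu, second = np.argsort(d)[:2]

    # Competitive Hebbian learning: connect the two best-matching units,
    # so the lateral graph comes to reflect the feature-manifold topology.
    C[bmu, second] = C[second, bmu] = 1.0

    # Kohonen-type rule: move the winner (strongly) and its lateral
    # neighbours (weakly) toward the input.
    centers[bmu] += lr_win * (x - centers[bmu])
    for j in np.flatnonzero(C[bmu]):
        centers[j] += lr_nbr * (x - centers[j])

    # Delta rule for the linear output layer on the RBF activations.
    phi = rbf(x)
    out_w += lr_out * (y - out_w @ phi) * phi

# Mean squared error of the trained approximation on a test grid.
err = np.mean([(np.sin(t) - out_w @ rbf(np.array([t]))) ** 2
               for t in np.linspace(0.5, 5.5, 20)])
```

The key difference from a plain self-organizing map is that the neighbourhood used by the Kohonen update is not a fixed grid but the dynamically learned connection matrix `C`.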

Cite

Text

Bruske and Sommer. "Dynamic Cell Structures." Neural Information Processing Systems, 1994.

Markdown

[Bruske and Sommer. "Dynamic Cell Structures." Neural Information Processing Systems, 1994.](https://mlanthology.org/neurips/1994/bruske1994neurips-dynamic/)

BibTeX

@inproceedings{bruske1994neurips-dynamic,
  title     = {{Dynamic Cell Structures}},
  author    = {Bruske, Jörg and Sommer, Gerald},
  booktitle = {Neural Information Processing Systems},
  year      = {1994},
  pages     = {497--504},
  url       = {https://mlanthology.org/neurips/1994/bruske1994neurips-dynamic/}
}