Topological Neural Discrete Representation Learning À La Kohonen

Abstract

Unsupervised learning of discrete representations in neural networks (NNs) from continuous ones is essential for many modern applications. Vector Quantisation (VQ) has become popular for this, in particular in the context of generative models such as Variational Auto-Encoders (VAEs), where the exponential moving average-based VQ (EMA-VQ) algorithm is often used. Here we study an alternative VQ algorithm based on Kohonen's learning rule for the Self-Organising Map (KSOM; Kohonen, 1982), a classic VQ algorithm of which EMA-VQ is a special case. KSOM is known to offer two potential benefits: empirically, it converges faster than EMA-VQ, and its discrete representations form a topological structure on the grid whose nodes are the discrete symbols, yielding an artificial version of the brain's topographic map. We revisit these properties by using KSOM in VQ-VAEs for image processing. In our experiments, the speed-up compared to well-configured EMA-VQ is only observable at the beginning of training, but KSOM is generally much more robust, e.g., w.r.t. the choice of initialisation schemes.
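To make the idea concrete, the following is a minimal sketch of Kohonen's SOM update rule as a vector quantiser, not the paper's exact VQ-VAE configuration: each input is assigned to its nearest code vector (the best-matching unit, BMU), and all code vectors are pulled toward the input with weights given by a Gaussian neighbourhood over their grid distance to the BMU. The function name, learning rate, and neighbourhood width below are illustrative assumptions.

```python
import numpy as np

def ksom_step(codebook, grid, x, lr=0.1, sigma=1.0):
    """One KSOM update: find the best-matching unit (BMU) for input x,
    then pull every code vector toward x, weighted by a Gaussian
    neighbourhood over grid distance to the BMU.

    codebook: (K, d) code vectors; grid: (K, 2) grid coordinates
    of the discrete symbols; x: (d,) input vector.
    """
    # BMU: index of the code vector closest to x in Euclidean distance
    bmu = int(np.argmin(np.linalg.norm(codebook - x, axis=1)))
    # Neighbourhood weights on the grid of discrete symbols
    grid_dist = np.linalg.norm(grid - grid[bmu], axis=1)
    h = np.exp(-grid_dist**2 / (2.0 * sigma**2))
    # Move all codes toward x; as sigma -> 0 only the BMU moves,
    # recovering the winner-take-all update that EMA-VQ-style
    # quantisers approximate
    codebook += lr * h[:, None] * (x - codebook)
    return bmu

# Toy usage: a 4x4 grid of 2-D code vectors trained on random inputs
rng = np.random.default_rng(0)
grid = np.array([[i, j] for i in range(4) for j in range(4)], dtype=float)
codebook = rng.standard_normal((16, 2))
for x in rng.standard_normal((200, 2)):
    ksom_step(codebook, grid, x, lr=0.1, sigma=1.0)
```

Shrinking `sigma` over training anneals KSOM from a topology-preserving update toward plain winner-take-all VQ, which is one way to see EMA-VQ as its special case.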

Cite

Text

Irie et al. "Topological Neural Discrete Representation Learning À La Kohonen." ICML 2023 Workshops: SODS, 2023.

Markdown

[Irie et al. "Topological Neural Discrete Representation Learning À La Kohonen." ICML 2023 Workshops: SODS, 2023.](https://mlanthology.org/icmlw/2023/irie2023icmlw-topological/)

BibTeX

@inproceedings{irie2023icmlw-topological,
  title     = {{Topological Neural Discrete Representation Learning À La Kohonen}},
  author    = {Irie, Kazuki and Csordás, Róbert and Schmidhuber, Jürgen},
  booktitle = {ICML 2023 Workshops: SODS},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/irie2023icmlw-topological/}
}