Augmenting Supervised Learning by Meta-Learning Unsupervised Local Rules

Abstract

The brain performs unsupervised learning and (perhaps) simultaneous supervised learning. This raises the question of whether a hybrid of supervised and unsupervised methods would produce better learning. Inspired by the rich space of Hebbian learning rules, we set out to directly learn the unsupervised learning rule on local information that best augments a supervised signal. We present the Hebbian-augmented training algorithm (HAT), which combines gradient-based learning with an unsupervised rule on pre-synaptic activity, post-synaptic activity, and current weights. We test HAT's effect on a simple problem (Fashion-MNIST) and find consistently higher performance than supervised learning alone. This finding provides empirical evidence that unsupervised learning on synaptic activities provides a strong signal that can be used to augment gradient-based methods. We further find that the meta-learned update rule is a time-varying function; thus, it is difficult to pinpoint an interpretable Hebbian update rule that aids in training. We do find that the meta-learner eventually degenerates into a non-Hebbian rule that preserves important weights so as not to disturb the learner's convergence.
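
The abstract specifies only that HAT sums a gradient-based update with an unsupervised rule computed from local quantities (pre-synaptic activity, post-synaptic activity, current weights). Below is a minimal Python sketch of one such combined update for a single linear layer; the parameterization of hebbian_rule, the coefficient vector theta, and the learning rates are illustrative assumptions, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(10, 784))   # weights of one layer

def hebbian_rule(pre, post, W, theta):
    # Hypothetical meta-learned local rule: a linear combination of
    # classic Hebbian terms. In HAT the coefficients would be produced
    # by the meta-learner; here theta is fixed for illustration.
    a, b, c = theta
    return a * np.outer(post, pre) + b * W + c * post[:, None]

def hat_step(W, pre, grad_W, theta, lr_sup=1e-2, lr_unsup=1e-3):
    post = W @ pre                           # post-synaptic activity
    dW_sup = -lr_sup * grad_W                # supervised, gradient-based step
    dW_unsup = lr_unsup * hebbian_rule(pre, post, W, theta)
    return W + dW_sup + dW_unsup             # combine both signals

# Toy usage: one update with a random input and a stand-in gradient
# (in practice grad_W would come from backpropagating a supervised loss).
pre = rng.normal(size=784)
grad_W = rng.normal(size=W.shape)
theta = (0.5, -0.01, 0.0)
W = hat_step(W, pre, grad_W, theta)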

Cite

Text

Cheng et al. "Augmenting Supervised Learning by Meta-Learning Unsupervised Local Rules." NeurIPS 2019 Workshops: Neuro_AI, 2019.

Markdown

[Cheng et al. "Augmenting Supervised Learning by Meta-Learning Unsupervised Local Rules." NeurIPS 2019 Workshops: Neuro_AI, 2019.](https://mlanthology.org/neuripsw/2019/cheng2019neuripsw-augmenting/)

BibTeX

@inproceedings{cheng2019neuripsw-augmenting,
  title     = {{Augmenting Supervised Learning by Meta-Learning Unsupervised Local Rules}},
  author    = {Cheng, Jeffrey Siedar and Benjamin, Ari and Lansdell, Benjamin and Kording, Konrad Paul},
  booktitle = {NeurIPS 2019 Workshops: Neuro_AI},
  year      = {2019},
  url       = {https://mlanthology.org/neuripsw/2019/cheng2019neuripsw-augmenting/}
}