Emergent Structures and Lifetime Structure Evolution in Artificial Neural Networks
Abstract
Motivated by the flexibility of biological neural networks, whose connectivity structure changes significantly during their lifetime, we introduce the Unrestricted Recursive Network (URN) and demonstrate that it can exhibit similar flexibility during training via gradient descent. We show empirically that many of the different neural network structures commonly used in practice today (including fully connected, locally connected, and residual networks of different depths and widths) can emerge dynamically from the same URN. These different structures can be derived using gradient descent on a single general loss function, where the structure of the data and the relative strengths of various regulator terms determine the structure of the emergent network. We show that this loss function and the regulators arise naturally when considering the symmetries of the network as well as the geometric properties of the input data.
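To make the idea concrete, below is a minimal, hypothetical sketch of training a single unrestricted weight matrix over all units with a task loss plus a structural regulator. The abstract does not specify the URN's exact loss, so the distance-weighted L1 penalty, the unit layout (`n_in` input units, `n_out` readout units), and the regulator strength `lam` are all illustrative assumptions, not the paper's actual formulation.

```python
# Hypothetical sketch: one unrestricted weight matrix over all units,
# trained on task loss + a locality-inducing regulator. The specific
# penalty and unit layout are assumptions for illustration only.
import torch

torch.manual_seed(0)
n_units, n_in, n_out, n_steps = 64, 8, 4, 3

# A single unrestricted recurrent weight matrix: any unit may connect
# to any other; structure is left to emerge from training.
W = torch.nn.Parameter(0.1 * torch.randn(n_units, n_units))

# Illustrative "distance" between units, used to weight the regulator.
idx = torch.arange(n_units, dtype=torch.float32)
dist = (idx[None, :] - idx[:, None]).abs()

def forward(x):
    h = torch.zeros(x.shape[0], n_units)
    h[:, :n_in] = x                    # clamp input onto the first units
    for _ in range(n_steps):           # unroll the recursion
        h = torch.relu(h @ W.T)
    return h[:, -n_out:]               # read out from the last units

opt = torch.optim.Adam([W], lr=1e-2)
x = torch.randn(32, n_in)
y = torch.randint(0, n_out, (32,))

lam = 1e-3                             # regulator strength (assumed)
for _ in range(100):
    opt.zero_grad()
    task_loss = torch.nn.functional.cross_entropy(forward(x), y)
    regulator = (dist * W.abs()).sum() # distance-weighted L1 penalty
    (task_loss + lam * regulator).backward()
    opt.step()
```

Under this kind of penalty, long-range weights are pruned more aggressively than short-range ones, so a locally connected structure can emerge from an initially unrestricted matrix; varying the relative strength of the regulator is what, per the abstract, selects among the different emergent structures.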
Cite

BibTeX
@inproceedings{golkar2019neuripsw-emergent,
  title     = {{Emergent Structures and Lifetime Structure Evolution in Artificial Neural Networks}},
  author    = {Golkar, Siavash},
  booktitle = {NeurIPS 2019 Workshops: Neuro_AI},
  year      = {2019},
  url       = {https://mlanthology.org/neuripsw/2019/golkar2019neuripsw-emergent/}
}