Splitting Steepest Descent for Growing Neural Architectures

Abstract

We develop a progressive training approach for neural networks which adaptively grows the network structure by splitting existing neurons into multiple offspring. By leveraging a functional steepest descent idea, we derive a simple criterion for deciding the best subset of neurons to split and a *splitting gradient* for optimally updating the offspring. Theoretically, our splitting strategy is a second-order functional steepest descent for escaping saddle points in an $\infty$-Wasserstein metric space, on which the standard parametric gradient descent is a first-order steepest descent. Our method provides a new computationally efficient approach for optimizing neural network structures, especially for learning lightweight neural architectures in resource-constrained settings.
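The sketch below illustrates the splitting idea described in the abstract: each neuron is scored by the minimum eigenvalue of a per-neuron splitting matrix, and the neuron with the most negative score is replaced by two offspring offset along the corresponding eigenvector. This is a minimal, hedged illustration rather than the paper's implementation; it assumes the splitting matrices have already been estimated elsewhere (e.g., via automatic differentiation), and the names `select_and_split`, `splitting_matrices`, and the offset size `eps` are illustrative choices, not API from the paper.

```python
import numpy as np

def select_and_split(weights, splitting_matrices, eps=1e-2):
    """Pick the neuron whose splitting matrix has the most negative minimum
    eigenvalue and replace it with two offspring offset along the matching
    eigenvector. Purely a sketch of the splitting-descent idea.

    weights:            list of per-neuron parameter vectors, each of shape (d,)
    splitting_matrices: list of per-neuron (d, d) symmetric matrices, assumed
                        to be estimated elsewhere (illustrative assumption)
    eps:                offspring offset size (illustrative value)
    """
    best_idx, best_eig, best_vec = None, 0.0, None
    for i, S in enumerate(splitting_matrices):
        eigvals, eigvecs = np.linalg.eigh(S)      # ascending eigenvalues of symmetric S
        if eigvals[0] < best_eig:                 # most negative lambda_min so far
            best_idx, best_eig, best_vec = i, eigvals[0], eigvecs[:, 0]

    if best_idx is None:
        return weights                            # no neuron benefits from splitting

    theta = weights[best_idx]
    offspring = [theta + eps * best_vec, theta - eps * best_vec]
    return weights[:best_idx] + offspring + weights[best_idx + 1:]


# Toy usage: three neurons with 2-D parameters and random symmetric matrices.
rng = np.random.default_rng(0)
weights = [rng.standard_normal(2) for _ in range(3)]
mats = [(lambda A: (A + A.T) / 2)(rng.standard_normal((2, 2))) for _ in range(3)]
print(len(select_and_split(weights, mats)))       # 4 if some lambda_min < 0, else 3
```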

Cite

Text

Wu et al. "Splitting Steepest Descent for Growing Neural Architectures." Neural Information Processing Systems, 2019.

Markdown

[Wu et al. "Splitting Steepest Descent for Growing Neural Architectures." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/wu2019neurips-splitting/)

BibTeX

@inproceedings{wu2019neurips-splitting,
  title     = {{Splitting Steepest Descent for Growing Neural Architectures}},
  author    = {Wu, Lemeng and Wang, Dilin and Liu, Qiang},
  booktitle = {Neural Information Processing Systems},
  year      = {2019},
  pages     = {10656--10666},
  url       = {https://mlanthology.org/neurips/2019/wu2019neurips-splitting/}
}