Semi-Flat Minima and Saddle Points by Embedding Neural Networks to Overparameterization
Abstract
We theoretically study the landscape of the training error for neural networks in overparameterized cases. We consider three basic methods for embedding a network into a wider one with more hidden units, and discuss whether a minimum point of the narrower network gives a minimum or saddle point of the wider one. Our results show that networks with smooth activations and networks with ReLU activations have different partially flat landscapes around the embedded point. We also relate these results to a difference in their generalization abilities in overparameterized realizations.
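The abstract does not spell out the embedding constructions, but a standard one is unit replication: duplicate a hidden unit and split its outgoing weight, which leaves the computed function unchanged and produces a line of parameters with identical training error. Below is a minimal sketch of this idea for a one-hidden-layer network; the helper names (`forward`, `embed_by_unit_replication`) and the specific parameterization are illustrative assumptions, not the paper's formal construction.

```python
import numpy as np

def forward(x, W, b, c, act=np.tanh):
    """Evaluate f(x) = sum_j c_j * act(w_j . x + b_j).
    W: (H, d) hidden weights, b: (H,) biases, c: (H,) output weights."""
    return act(x @ W.T + b) @ c

def embed_by_unit_replication(W, b, c, j, lam=0.5):
    """Embed an H-unit network into an (H+1)-unit one by duplicating
    hidden unit j and splitting its output weight into lam*c_j and
    (1 - lam)*c_j. The wider network computes exactly the same function
    for every lam, so the training error is constant along this line
    of parameters (a flat direction at the embedded point).
    Hypothetical illustration; the paper studies when such embedded
    points are minima or saddles of the wider network."""
    W2 = np.vstack([W, W[j]])          # copy input weights of unit j
    b2 = np.append(b, b[j])            # copy its bias
    c2 = np.append(c, (1.0 - lam) * c[j])
    c2[j] = lam * c[j]                 # split the outgoing weight
    return W2, b2, c2

# Sanity check: the wider network reproduces the narrow one's outputs.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 2)); b = rng.normal(size=3); c = rng.normal(size=3)
x = rng.normal(size=(5, 2))
W2, b2, c2 = embed_by_unit_replication(W, b, c, j=1, lam=0.3)
assert np.allclose(forward(x, W, b, c), forward(x, W2, b2, c2))
```

Because the output is invariant in `lam`, the embedded parameter is never an isolated critical point of the wider network; whether the remaining directions make it a (semi-flat) minimum or a saddle is exactly the question the paper analyzes, with different answers for smooth and ReLU activations.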
Cite
Text
Fukumizu et al. "Semi-Flat Minima and Saddle Points by Embedding Neural Networks to Overparameterization." Neural Information Processing Systems, 2019.

Markdown

[Fukumizu et al. "Semi-Flat Minima and Saddle Points by Embedding Neural Networks to Overparameterization." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/fukumizu2019neurips-semiflat/)

BibTeX
@inproceedings{fukumizu2019neurips-semiflat,
title = {{Semi-Flat Minima and Saddle Points by Embedding Neural Networks to Overparameterization}},
author = {Fukumizu, Kenji and Yamaguchi, Shoichiro and Mototake, Yoh-ichi and Tanaka, Mirai},
booktitle = {Neural Information Processing Systems},
year = {2019},
  pages = {13868--13876},
url = {https://mlanthology.org/neurips/2019/fukumizu2019neurips-semiflat/}
}