Towards Understanding Normalization in Neural ODEs
Abstract
Normalization is an important and extensively studied technique in deep learning. However, its role in Ordinary Differential Equation-based networks (Neural ODEs) is still poorly understood. This paper investigates how different normalization techniques affect the performance of Neural ODEs. In particular, we show that it is possible to achieve $93\%$ accuracy on the CIFAR-10 classification task, which is, to the best of our knowledge, the highest reported accuracy among Neural ODEs tested on this problem.
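The setting the abstract describes can be illustrated with a minimal sketch (not the authors' implementation): a Neural ODE evolves a hidden state $h$ by integrating $dh/dt = f(t, h)$, and a normalization layer can be placed inside the vector field $f$. The `layer_norm` placement, the weight shapes, and the fixed-step Euler solver below are all illustrative assumptions.

```python
import numpy as np

def layer_norm(h, eps=1e-5):
    # LayerNorm-style normalization (zero mean, unit variance) without
    # learnable scale/shift; purely illustrative.
    return (h - h.mean()) / np.sqrt(h.var() + eps)

def make_dynamics(W):
    # Vector field f(t, h) of a Neural ODE block, with normalization
    # applied inside the dynamics -- one possible placement.
    def f(t, h):
        return np.tanh(W @ layer_norm(h))
    return f

def odeint_euler(f, h0, t0=0.0, t1=1.0, steps=100):
    # Fixed-step explicit Euler integration of dh/dt = f(t, h);
    # real implementations typically use adaptive solvers.
    h, t = h0.copy(), t0
    dt = (t1 - t0) / steps
    for _ in range(steps):
        h = h + dt * f(t, h)
        t += dt
    return h

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 8)) / np.sqrt(8)
h0 = rng.standard_normal(8)
h1 = odeint_euler(make_dynamics(W), h0)
```

Normalizing inside the vector field keeps the dynamics well-scaled across integration time, which is one of the design questions the paper investigates.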
Cite
Gusak et al. "Towards Understanding Normalization in Neural ODEs." ICLR 2020 Workshops: DeepDiffEq, 2020.
BibTeX
@inproceedings{gusak2020iclrw-understanding,
  title     = {{Towards Understanding Normalization in Neural ODEs}},
  author    = {Gusak, Julia and Markeeva, Larisa and Daulbaev, Talgat and Katrutsa, Alexander and Cichocki, Andrzej and Oseledets, Ivan},
  booktitle = {ICLR 2020 Workshops: DeepDiffEq},
  year      = {2020},
  url       = {https://mlanthology.org/iclrw/2020/gusak2020iclrw-understanding/}
}