On Improving the Numerical Stability of Winograd Convolutions
Abstract
Deep convolutional neural networks rely on heavily optimized convolution algorithms. Winograd convolutions provide an efficient approach to performing such convolutions. With larger Winograd convolution tiles, the convolution becomes more efficient but less numerically accurate. Here we provide approaches to mitigating this numerical inaccuracy, exemplified on a tile much larger than any previously documented: F(9x9, 5x5). Using these approaches, we show that such a tile can be used to train modern networks and can provide performance benefits.
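To illustrate the efficiency/accuracy trade-off the abstract refers to, a minimal sketch of the classic small case F(2, 3) (two outputs of a 1-D convolution with a 3-tap filter) is given below. This is not the paper's F(9x9, 5x5) tile; it is the standard textbook instance of the Winograd algorithm, shown only to make the multiplication savings concrete. Larger tiles use transform matrices with larger-magnitude entries, which is the source of the numerical inaccuracy the paper addresses.

```python
def winograd_f23(d, g):
    """Winograd F(2, 3): compute 2 outputs of a 1-D convolution with a
    3-tap filter using 4 multiplications instead of the 6 required by
    the direct sliding dot product."""
    d0, d1, d2, d3 = d
    g0, g1, g2 = g
    # Transformed filter and input products (4 multiplications total).
    m1 = (d0 - d2) * g0
    m2 = (d1 + d2) * (g0 + g1 + g2) / 2
    m3 = (d2 - d1) * (g0 - g1 + g2) / 2
    m4 = (d1 - d3) * g2
    # Inverse transform combines the products into the 2 outputs.
    return [m1 + m2 + m3, m2 - m3 - m4]

def direct_conv(d, g):
    """Reference direct convolution (6 multiplications)."""
    return [sum(d[i + k] * g[k] for k in range(3)) for i in range(2)]
```

For example, `winograd_f23([1, 2, 3, 4], [1, 1, 1])` agrees with `direct_conv` on the same inputs, returning `[6, 9]`.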
Cite
Text
Vincent et al. "On Improving the Numerical Stability of Winograd Convolutions." International Conference on Learning Representations, 2017.
Markdown
[Vincent et al. "On Improving the Numerical Stability of Winograd Convolutions." International Conference on Learning Representations, 2017.](https://mlanthology.org/iclr/2017/vincent2017iclr-improving/)
BibTeX
@inproceedings{vincent2017iclr-improving,
title = {{On Improving the Numerical Stability of Winograd Convolutions}},
author = {Vincent, Kevin and Stephano, Kevin and Frumkin, Michael A. and Ginsburg, Boris and Demouth, Julien},
booktitle = {International Conference on Learning Representations},
year = {2017},
url = {https://mlanthology.org/iclr/2017/vincent2017iclr-improving/}
}