Mind the Pool: Convolutional Neural Networks Can Overfit Input Size
Abstract
We demonstrate how convolutional neural networks can overfit the input size: The accuracy drops significantly when using certain sizes, compared with favorable ones. This issue is inherent to pooling arithmetic, with standard downsampling layers playing a major role in favoring certain input sizes and skewing the weights accordingly. We present a solution to this problem by depriving these layers of the arithmetic cues they use to overfit the input size. Through various examples, we show how our proposed spatially-balanced pooling improves the generalization of the network to arbitrary input sizes and its robustness to translational shifts.
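To illustrate the pooling arithmetic the abstract refers to, here is a minimal sketch (not the paper's code) of 1-D max pooling with kernel 2 and stride 2 and no padding. The output size is floor(n/2), so an odd-sized input silently discards its last element; which elements survive pooling thus depends on the input size and on any translational shift of the content.

```python
def max_pool_1d(x, kernel=2, stride=2):
    """Plain-Python 1-D max pooling with no padding."""
    out = []
    for start in range(0, len(x) - kernel + 1, stride):
        out.append(max(x[start:start + kernel]))
    return out

even = [1, 9, 2, 8]       # size 4: two full windows cover every element
odd = [1, 9, 2, 8, 7]     # size 5: the trailing 7 falls outside all windows

print(max_pool_1d(even))  # [9, 8]
print(max_pool_1d(odd))   # [9, 8]  -- the 7 never reaches the output
```

Because the discarded border depends on input-size parity, a network trained at one size can learn to rely on this alignment, which is the arithmetic cue the proposed spatially-balanced pooling removes.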
Cite

Text

Alsallakh et al. "Mind the Pool: Convolutional Neural Networks Can Overfit Input Size." International Conference on Learning Representations, 2023.

Markdown

[Alsallakh et al. "Mind the Pool: Convolutional Neural Networks Can Overfit Input Size." International Conference on Learning Representations, 2023.](https://mlanthology.org/iclr/2023/alsallakh2023iclr-mind/)

BibTeX
@inproceedings{alsallakh2023iclr-mind,
title = {{Mind the Pool: Convolutional Neural Networks Can Overfit Input Size}},
author = {Alsallakh, Bilal and Yan, David and Kokhlikyan, Narine and Miglani, Vivek and Reblitz-Richardson, Orion and Bhattacharya, Pamela},
booktitle = {International Conference on Learning Representations},
year = {2023},
url = {https://mlanthology.org/iclr/2023/alsallakh2023iclr-mind/}
}