On the Geometry and Optimization of Polynomial Convolutional Networks
Abstract
We study convolutional neural networks with monomial activation functions. Specifically, we prove that their parameterization map is regular and, up to rescaling the filters, is an isomorphism almost everywhere. By leveraging tools from algebraic geometry, we explore the geometric properties of the image of this map in function space, typically referred to as the neuromanifold. In particular, we compute the dimension and the degree of the neuromanifold, which measure the expressivity of the model, and we describe its singularities. Moreover, for a generic large dataset, we derive an explicit formula that quantifies the number of critical points arising in the optimization of a regression loss.
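To make the setting concrete, the following is a minimal sketch (not the paper's implementation) of a 1D convolutional network with monomial activation t ↦ t^d, illustrating the rescaling symmetry of the filters mentioned in the abstract: scaling one filter can be compensated by rescaling a later filter without changing the network function. All names (`conv1d`, `monomial_cnn`) are hypothetical.

```python
import numpy as np

def conv1d(x, w):
    """Valid 1D cross-correlation of signal x with filter w (stride 1)."""
    k = len(w)
    return np.array([x[i:i + k] @ w for i in range(len(x) - k + 1)])

def monomial_cnn(x, filters, degree=2):
    """Toy polynomial CNN: each hidden layer is a convolution followed by
    the monomial activation t -> t**degree; the last layer is linear."""
    for w in filters[:-1]:
        x = conv1d(x, w) ** degree
    return conv1d(x, filters[-1])

# Rescaling symmetry: multiplying the first filter by c scales that layer's
# output by c**degree, which dividing the next filter by c**degree undoes.
x = np.array([1.0, 2.0, -1.0, 0.5, 3.0])
w1, w2 = np.array([1.0, -1.0]), np.array([0.5, 2.0])
out = monomial_cnn(x, [w1, w2], degree=2)
rescaled = monomial_cnn(x, [3.0 * w1, w2 / 9.0], degree=2)
assert np.allclose(out, rescaled)  # same function, different parameters
```

This symmetry is why the parameterization map can only be an isomorphism "up to rescaling the filters": distinct parameter choices along these scaling orbits realize the same network function.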
Cite

Text

Shahverdi et al. "On the Geometry and Optimization of Polynomial Convolutional Networks." Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, 2025.

Markdown

[Shahverdi et al. "On the Geometry and Optimization of Polynomial Convolutional Networks." Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, 2025.](https://mlanthology.org/aistats/2025/shahverdi2025aistats-geometry/)

BibTeX
@inproceedings{shahverdi2025aistats-geometry,
title = {{On the Geometry and Optimization of Polynomial Convolutional Networks}},
author = {Shahverdi, Vahid and Marchetti, Giovanni Luca and Kohn, Kathlén},
booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
year = {2025},
pages = {604--612},
volume = {258},
url = {https://mlanthology.org/aistats/2025/shahverdi2025aistats-geometry/}
}