Reproducing Kernel Banach Space Models for Neural Networks with Application to Rademacher Complexity Analysis
Abstract
This paper explores the use of Hermite-transform-based reproducing kernel Banach space methods to construct exact (un-approximated) models of feedforward neural networks of arbitrary width, depth, and topology, including ResNet and Transformer networks, assuming only a feedforward topology, finite-energy activations, and finite (spectral-) norm weights and biases. Using this model, two straightforward but surprisingly tight bounds on Rademacher complexity are derived, namely (1) a general bound that is width-independent and scales exponentially with depth; and (2) a width- and depth-independent bound for networks whose weights and biases are constrained below an appropriate threshold.
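For context, the abstract's complexity measure is the (empirical) Rademacher complexity of the function class realized by the network. A standard definition is sketched below; the notation is the usual one from statistical learning theory, not taken from the paper itself:

```latex
% Empirical Rademacher complexity of a function class F
% over a sample S = (x_1, ..., x_N), with sigma_i i.i.d.
% uniform on {-1, +1} (Rademacher variables):
\hat{\mathcal{R}}_S(\mathcal{F})
  = \mathbb{E}_{\boldsymbol{\sigma}}
    \left[ \sup_{f \in \mathcal{F}}
      \frac{1}{N} \sum_{i=1}^{N} \sigma_i \, f(x_i) \right]
```

Here $\mathcal{F}$ would be the class of networks satisfying the paper's assumptions (feedforward topology, finite-energy activations, norm-bounded weights and biases); bounds of type (1) and (2) in the abstract control this quantity uniformly over such classes.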
Cite
Text
Shilton et al. "Reproducing Kernel Banach Space Models for Neural Networks with Application to Rademacher Complexity Analysis." Advances in Neural Information Processing Systems, 2025.
Markdown
[Shilton et al. "Reproducing Kernel Banach Space Models for Neural Networks with Application to Rademacher Complexity Analysis." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/shilton2025neurips-reproducing/)
BibTeX
@inproceedings{shilton2025neurips-reproducing,
title = {{Reproducing Kernel Banach Space Models for Neural Networks with Application to Rademacher Complexity Analysis}},
author = {Shilton, Alistair and Gupta, Sunil and Rana, Santu and Venkatesh, Svetha},
booktitle = {Advances in Neural Information Processing Systems},
year = {2025},
url = {https://mlanthology.org/neurips/2025/shilton2025neurips-reproducing/}
}