Nonparametric Classification on Low Dimensional Manifolds Using Overparameterized Convolutional Residual Networks
Abstract
Convolutional residual neural networks (ConvResNets), though overparameterized, can achieve remarkable prediction performance in practice, which cannot be well explained by conventional wisdom. To bridge this gap, we study the performance of ConvResNeXts trained with weight decay, which cover ConvResNets as a special case, from the perspective of nonparametric classification. Our analysis allows for infinitely many building blocks in ConvResNeXts, and shows that weight decay implicitly enforces sparsity on these blocks. Specifically, we consider a smooth target function supported on a low-dimensional manifold, and prove that ConvResNeXts can adapt to the function smoothness and low-dimensional structure, learning the function efficiently without suffering from the curse of dimensionality. Our findings partially justify the advantage of overparameterized ConvResNeXts over conventional machine learning models.
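To make the role of weight decay concrete, the following is a minimal, hedged sketch of the kind of training objective the abstract describes: a squared loss plus an L2 (weight-decay) penalty summed over the weights of each residual block. The block structure, shapes, and data here are illustrative toy choices, not the paper's actual ConvResNeXt architecture; the point is only that the decay term penalizes every block's weights, which is the mechanism the paper shows implicitly drives many blocks toward zero (sparsity over blocks).

```python
import numpy as np

rng = np.random.default_rng(0)

def block_forward(x, W):
    """One toy residual block: x + ReLU(W @ x).
    (The parallel branches of a ConvResNeXt block are omitted for brevity.)"""
    return x + np.maximum(W @ x, 0.0)

def objective(x, y, blocks, lam):
    """Squared loss plus weight decay summed over all blocks.
    The term lam * sum_m ||W_m||_F^2 is the weight-decay penalty that,
    per the abstract, implicitly enforces sparsity across blocks."""
    h = x
    for W in blocks:
        h = block_forward(h, W)
    data_loss = float(np.sum((h - y) ** 2))
    decay = lam * sum(float(np.sum(W ** 2)) for W in blocks)
    return data_loss + decay

# Illustrative weights and data (hypothetical, small-scale).
blocks = [rng.standard_normal((4, 4)) * 0.1 for _ in range(3)]
x = rng.standard_normal(4)
y = rng.standard_normal(4)

# With lam > 0 the objective can only increase relative to lam = 0,
# since the decay term is a nonnegative sum of squared weights.
print(objective(x, y, blocks, lam=1e-2) >= objective(x, y, blocks, lam=0.0))
```

Minimizing this regularized objective trades data fit against total block weight, so blocks that contribute little to the fit are pushed toward zero norm, which is the sparsity effect the analysis formalizes.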
Cite
Text
Zhang et al. "Nonparametric Classification on Low Dimensional Manifolds Using Overparameterized Convolutional Residual Networks." Neural Information Processing Systems, 2024. doi:10.52202/079017-2101
Markdown
[Zhang et al. "Nonparametric Classification on Low Dimensional Manifolds Using Overparameterized Convolutional Residual Networks." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/zhang2024neurips-nonparametric/) doi:10.52202/079017-2101
BibTeX
@inproceedings{zhang2024neurips-nonparametric,
title = {{Nonparametric Classification on Low Dimensional Manifolds Using Overparameterized Convolutional Residual Networks}},
author = {Zhang, Zixuan and Zhang, Kaiqi and Chen, Minshuo and Takeda, Yuma and Wang, Mengdi and Zhao, Tuo and Wang, Yu-Xiang},
booktitle = {Neural Information Processing Systems},
year = {2024},
doi = {10.52202/079017-2101},
url = {https://mlanthology.org/neurips/2024/zhang2024neurips-nonparametric/}
}