Bounds for the Smallest Eigenvalue of the NTK for Arbitrary Spherical Data of Arbitrary Dimension
Abstract
Bounds on the smallest eigenvalue of the neural tangent kernel (NTK) are a key ingredient in the analysis of neural network optimization and memorization. However, existing results require distributional assumptions on the data and are limited to a high-dimensional setting, where the input dimension $d_0$ scales at least logarithmically in the number of samples $n$. In this work we remove both of these requirements and instead provide bounds in terms of a measure of distance between data points; notably, these bounds hold with high probability even when $d_0$ is held constant as $n$ grows. We prove our results through a novel application of the hemisphere transform.
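To make the central quantity concrete, the following is a minimal sketch (not the paper's construction) of the empirical NTK Gram matrix of a one-hidden-layer ReLU network on unit-sphere data, restricted to the first-layer weights, together with its smallest eigenvalue. The choices of $n$, $d_0$, and the width $m$ are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d0, m = 20, 5, 2000  # samples, input dimension, hidden width (illustrative values)

# Data on the unit sphere S^{d0-1}: normalize Gaussian samples.
X = rng.standard_normal((n, d0))
X /= np.linalg.norm(X, axis=1, keepdims=True)

# One-hidden-layer ReLU network f(x) = (1/sqrt(m)) * a^T relu(W x).
W = rng.standard_normal((m, d0))
a = rng.choice([-1.0, 1.0], size=m)

# Gradient of f with respect to W gives the feature map:
# df/dW_k = (1/sqrt(m)) * a_k * 1{w_k . x > 0} * x, stacked over k.
pre = X @ W.T                      # (n, m) preactivations
act = (pre > 0).astype(float)      # ReLU derivative
phi = (act * a)[:, :, None] * X[:, None, :] / np.sqrt(m)  # (n, m, d0)
phi = phi.reshape(n, -1)           # one feature row per data point

# Empirical NTK Gram matrix and its smallest eigenvalue.
K = phi @ phi.T
lam_min = np.linalg.eigvalsh(K)[0]
print(K.shape, lam_min)
```

Results of the kind summarized in the abstract lower-bound `lam_min` in terms of pairwise distances between the points of `X`, without assuming a data distribution or that `d0` grows with `n`.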
Cite
Text
Karhadkar et al. "Bounds for the Smallest Eigenvalue of the NTK for Arbitrary Spherical Data of Arbitrary Dimension." Neural Information Processing Systems, 2024. doi:10.52202/079017-4387
Markdown
[Karhadkar et al. "Bounds for the Smallest Eigenvalue of the NTK for Arbitrary Spherical Data of Arbitrary Dimension." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/karhadkar2024neurips-bounds/) doi:10.52202/079017-4387
BibTeX
@inproceedings{karhadkar2024neurips-bounds,
title = {{Bounds for the Smallest Eigenvalue of the NTK for Arbitrary Spherical Data of Arbitrary Dimension}},
author = {Karhadkar, Kedar and Murray, Michael and Montúfar, Guido},
booktitle = {Neural Information Processing Systems},
year = {2024},
doi = {10.52202/079017-4387},
url = {https://mlanthology.org/neurips/2024/karhadkar2024neurips-bounds/}
}