The Local Learning Coefficient: A Singularity-Aware Complexity Measure
Abstract
The Local Learning Coefficient (LLC) is introduced as a novel complexity measure for deep neural networks (DNNs). Addressing the limitations of traditional complexity measures, the LLC builds on Singular Learning Theory (SLT), which has long recognized the significance of singularities in the geometry of the loss landscape. This paper provides an extensive exploration of the LLC’s theoretical underpinnings, offering both a clear definition and intuitive insights into its application. Moreover, we propose a new scalable estimator for the LLC, which we apply effectively across diverse architectures, including deep linear networks with up to 100M parameters, ResNet image models, and transformer language models. Empirical evidence suggests that the LLC provides valuable insights into how training heuristics might influence the effective complexity of DNNs. Ultimately, the LLC emerges as a crucial tool for reconciling the apparent contradiction between deep learning’s complexity and the principle of parsimony.
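For orientation, the scalable estimator mentioned above follows the WBIC-style form common in SLT; the sketch below uses our own notation ($L_n$ for the empirical loss, $\beta^*$ for the inverse temperature), and the choice of localization and sampler are assumptions rather than the paper's exact specification:

$$
\hat{\lambda}(w^*) \;=\; n\,\beta^*\Big(\mathbb{E}^{\beta^*}_{w \mid w^*}\big[L_n(w)\big] \;-\; L_n(w^*)\Big), \qquad \beta^* = \frac{1}{\log n},
$$

where the expectation is taken over a tempered posterior localized near the trained parameter $w^*$ (e.g. via a Gaussian term centered at $w^*$) and is approximated in practice with stochastic-gradient MCMC such as SGLD.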
Cite
Text
Lau et al. "The Local Learning Coefficient: A Singularity-Aware Complexity Measure." Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, 2025.
Markdown
[Lau et al. "The Local Learning Coefficient: A Singularity-Aware Complexity Measure." Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, 2025.](https://mlanthology.org/aistats/2025/lau2025aistats-local/)
BibTeX
@inproceedings{lau2025aistats-local,
title = {{The Local Learning Coefficient: A Singularity-Aware Complexity Measure}},
author = {Lau, Edmund and Furman, Zach and Wang, George and Murfet, Daniel and Wei, Susan},
booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
year = {2025},
pages = {244--252},
volume = {258},
url = {https://mlanthology.org/aistats/2025/lau2025aistats-local/}
}