Super Level Sets and Exponential Decay: A Synergistic Approach to Stable Neural Network Training
Abstract
This paper presents a theoretically grounded optimization framework for neural network training that integrates an exponentially decaying learning rate with Lyapunov-based stability analysis. We develop a dynamic learning rate algorithm and prove that it induces connected and stable descent paths through the loss landscape by maintaining the connectivity of super-level sets Sλ = {θ ∈ ℝⁿ : ℒ(θ) ≥ λ}. Under the condition that the Lyapunov function V(θ) = ℒ(θ) satisfies ∇V(θ) · ∇ℒ(θ) ≥ 0, we establish that these super-level sets are not only connected but also equiconnected across epochs, providing uniform topological stability. We further derive convergence guarantees using a second-order Taylor expansion and demonstrate that our exponentially scheduled learning rate with gradient-based modulation leads to a monotonic decrease in loss. The proposed algorithm incorporates this schedule into a stability-aware update mechanism that adapts step sizes based on both curvature and energy-level geometry. This work formalizes the role of topological structure in convergence dynamics and introduces a provably stable optimization algorithm for high-dimensional, non-convex neural networks.
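To make the abstract's update rule concrete, the sketch below pairs an exponentially decayed, gradient-modulated step size with a Lyapunov-style monotone-decrease check on V(θ) = ℒ(θ). It is a minimal illustration only: the constants eta0 and k, the 1/(1 + ‖g‖) modulation, and the backtracking fallback are assumed for the example and are not the paper's exact algorithm.

import numpy as np

def stability_aware_step(theta, loss_fn, grad_fn, t,
                         eta0=0.1, k=0.05, max_backtracks=10):
    """One update with an exponentially decaying, gradient-modulated
    learning rate and a monotone-decrease (Lyapunov) acceptance test.
    eta0, k, the modulation, and the backtracking are illustrative
    assumptions, not the authors' exact schedule."""
    g = grad_fn(theta)
    eta = eta0 * np.exp(-k * t)          # exponential decay across epochs t
    eta /= 1.0 + np.linalg.norm(g)       # damp steps where gradients are large
    v = loss_fn(theta)                   # current Lyapunov value V(theta) = L(theta)
    for _ in range(max_backtracks):
        candidate = theta - eta * g
        if loss_fn(candidate) <= v:      # accept only non-increasing loss
            return candidate
        eta *= 0.5                       # otherwise shrink the step and retry
    return theta                         # reject the step if no decrease is found

# Toy usage: minimize the convex quadratic L(theta) = ||theta||^2 / 2
loss = lambda th: 0.5 * np.dot(th, th)
grad = lambda th: th
theta = np.array([3.0, -2.0])
for t in range(100):
    theta = stability_aware_step(theta, loss, grad, t)
print(theta)  # ends close to the minimizer at the origin

The acceptance test enforces the monotonic loss decrease the abstract claims; the decay factor e^(−kt) plays the role of the exponentially scheduled rate, and the gradient-norm damping stands in for the gradient-based modulation.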
Cite

Text
Chaudhary et al. "Super Level Sets and Exponential Decay: A Synergistic Approach to Stable Neural Network Training." Journal of Artificial Intelligence Research, 2025. doi:10.1613/JAIR.1.17272

Markdown
[Chaudhary et al. "Super Level Sets and Exponential Decay: A Synergistic Approach to Stable Neural Network Training." Journal of Artificial Intelligence Research, 2025.](https://mlanthology.org/jair/2025/chaudhary2025jair-super/) doi:10.1613/JAIR.1.17272

BibTeX
@article{chaudhary2025jair-super,
title = {{Super Level Sets and Exponential Decay: A Synergistic Approach to Stable Neural Network Training}},
author = {Chaudhary, Jatin Kumar and Nidhi, Dipak Kumar and Heikkonen, Jukka and Merisaari, Harri and Kanth, Rajiv},
journal = {Journal of Artificial Intelligence Research},
year = {2025},
doi = {10.1613/JAIR.1.17272},
volume = {83},
url = {https://mlanthology.org/jair/2025/chaudhary2025jair-super/}
}