Trajectory Alignment: Understanding the Edge of Stability Phenomenon via Bifurcation Theory

Abstract

Cohen et al. (2021) empirically study the evolution of the largest eigenvalue of the loss Hessian, also known as sharpness, along the gradient descent (GD) trajectory and observe the Edge of Stability (EoS) phenomenon. The sharpness increases in the early phase of training (referred to as progressive sharpening) and eventually saturates close to the threshold of $2 / \text{(step size)}$. In this paper, we start by demonstrating through empirical studies that when the EoS phenomenon occurs, different GD trajectories (after a proper reparameterization) align on a specific bifurcation diagram independent of initialization. We then rigorously prove this trajectory alignment phenomenon for a two-layer fully-connected linear network and a single-neuron nonlinear network trained with a single data point. Our trajectory alignment analysis establishes both progressive sharpening and EoS phenomena, encompassing and extending recent findings in the literature.
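For intuition, here is a minimal sketch (not the authors' code) of the kind of minimalist setting the abstract describes: a scalar two-layer linear model $f(x) = v \cdot w \cdot x$ trained by GD on a single data point, with squared loss $L(w, v) = \tfrac{1}{2}(vw - 1)^2$. All numeric choices below (the step size $\eta = 1.05$, the initializations, the iteration count) are illustrative assumptions, as is the `sharpness` helper. Every minimum of this loss has sharpness at least $2$, so the assumed step size puts the stability threshold $2/\eta \approx 1.9$ below every minimum's sharpness; GD then cannot settle at a minimum, sharpness ends up oscillating around $2/\eta$, and the layer imbalance $v^2 - w^2$ is driven toward zero regardless of initialization, loosely echoing the initialization-independent alignment the paper proves.

```python
import numpy as np

# Minimal sketch (illustrative assumptions, not the authors' code):
# GD on L(w, v) = 0.5 * (v * w - 1)^2, i.e. a scalar two-layer linear
# model f(x) = v * w * x with the single data point (x, y) = (1, 1).

eta = 1.05    # assumed step size; EoS threshold 2 / eta ~ 1.905
steps = 300   # enough iterations to reach the limiting oscillation

def sharpness(w, v):
    """Largest eigenvalue of the 2x2 loss Hessian at (w, v)."""
    H = np.array([[v * v,           2 * v * w - 1.0],
                  [2 * v * w - 1.0, w * w          ]])
    return np.linalg.eigvalsh(H)[-1]  # eigvalsh sorts ascending

for w, v in [(0.1, 0.1), (0.3, 0.9), (0.2, 1.4)]:  # assumed initializations
    sh = [sharpness(w, v)]
    for _ in range(steps):
        r = v * w - 1.0                          # residual on the data point
        w, v = w - eta * r * v, v - eta * r * w  # simultaneous GD update
        sh.append(sharpness(w, v))
    # Every minimum of this loss has sharpness >= 2 > 2/eta, so GD cannot
    # converge; the last two sharpness values bracket the 2/eta threshold,
    # and the imbalance v^2 - w^2 shrinks toward 0 for every initialization.
    print(f"init sharpness={sh[0]:.2f}, last two={sh[-2]:.2f}/{sh[-1]:.2f}, "
          f"mean of last two={(sh[-2] + sh[-1]) / 2:.2f}, "
          f"imbalance={v * v - w * w:.4f}, threshold={2 / eta:.3f}")
```

The oscillatory regime here is chosen deliberately: with a smaller step size (e.g. below $1$), some minimum is stable under GD and the same script typically just converges, so neither EoS oscillation nor the hovering of sharpness near $2/\eta$ appears.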

Cite

Text

Song and Yun. "Trajectory Alignment: Understanding the Edge of Stability Phenomenon via Bifurcation Theory." Neural Information Processing Systems, 2023.

Markdown

[Song and Yun. "Trajectory Alignment: Understanding the Edge of Stability Phenomenon via Bifurcation Theory." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/song2023neurips-trajectory/)

BibTeX

@inproceedings{song2023neurips-trajectory,
  title     = {{Trajectory Alignment: Understanding the Edge of Stability Phenomenon via Bifurcation Theory}},
  author    = {Song, Minhak and Yun, Chulhee},
  booktitle = {Neural Information Processing Systems},
  year      = {2023},
  url       = {https://mlanthology.org/neurips/2023/song2023neurips-trajectory/}
}