On the Interplay Between Noise and Curvature and Its Effect on Optimization and Generalization

Abstract

The speed at which one can minimize an expected loss using stochastic methods depends on two properties: the curvature of the loss and the variance of the gradients. While most previous works focus on one or the other of these properties, we explore how their interaction affects optimization speed. Further, as the ultimate goal is good generalization performance, we clarify how both curvature and noise are relevant for properly estimating the generalization gap. Realizing that the limitations of some existing works stem from a confusion between the Fisher matrix, the Hessian, and the covariance matrix of the gradients, we also clarify the distinction between these three matrices.
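
To make that distinction concrete, here is a minimal sketch of the standard definitions of the three matrices for a probabilistic model with loss $\ell(\theta; x, y) = -\log p_\theta(y \mid x)$ (the notation is ours, not taken from the paper):

$$
H(\theta) = \mathbb{E}_{(x,y)\sim p_{\text{data}}}\!\left[\nabla_\theta^2\, \ell(\theta; x, y)\right] \quad\text{(Hessian: curvature of the expected loss)}
$$

$$
\Sigma(\theta) = \mathrm{Cov}_{(x,y)\sim p_{\text{data}}}\!\left[\nabla_\theta\, \ell(\theta; x, y)\right] \quad\text{(covariance of the gradients: noise of the stochastic updates)}
$$

$$
F(\theta) = \mathbb{E}_{x\sim p_{\text{data}},\; \hat{y}\sim p_\theta(\cdot\mid x)}\!\left[\nabla_\theta \log p_\theta(\hat{y}\mid x)\, \nabla_\theta \log p_\theta(\hat{y}\mid x)^{\top}\right] \quad\text{(Fisher information matrix)}
$$

In general these matrices differ: the Fisher samples labels from the model rather than from the data, and the gradient covariance matches the Fisher only in special regimes (e.g., for a well-specified model near its optimum, where the mean gradient vanishes).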

Cite

Text

Thomas et al. "On the Interplay Between Noise and Curvature and Its Effect on Optimization and Generalization." Artificial Intelligence and Statistics, 2020.

Markdown

[Thomas et al. "On the Interplay Between Noise and Curvature and Its Effect on Optimization and Generalization." Artificial Intelligence and Statistics, 2020.](https://mlanthology.org/aistats/2020/thomas2020aistats-interplay/)

BibTeX

@inproceedings{thomas2020aistats-interplay,
  title     = {{On the Interplay Between Noise and Curvature and Its Effect on Optimization and Generalization}},
  author    = {Thomas, Valentin and Pedregosa, Fabian and van Merriënboer, Bart and Manzagol, Pierre-Antoine and Bengio, Yoshua and Le Roux, Nicolas},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2020},
  pages     = {3503--3513},
  volume    = {108},
  url       = {https://mlanthology.org/aistats/2020/thomas2020aistats-interplay/}
}