Fast Curvature Matrix-Vector Products for Second-Order Gradient Descent

Abstract

We propose a generic method for iteratively approximating various second-order gradient steps (Newton, Gauss-Newton, Levenberg-Marquardt, and natural gradient) in linear time per iteration, using special curvature matrix-vector products that can be computed in O(n). Two recent acceleration techniques for on-line learning, matrix momentum and stochastic meta-descent (SMD), implement this approach. Since both were originally derived by very different routes, this offers fresh insight into their operation, resulting in further improvements to SMD.
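The O(n) claim rests on the fact that a curvature matrix-vector product can be computed by automatic differentiation at roughly the cost of a gradient, without ever forming the matrix. A minimal sketch in JAX of two such products, a Hessian-vector product (forward-over-reverse) and a Gauss-Newton product (one forward- plus one reverse-mode pass); the objective `f`, residual map `r`, and all variable names here are illustrative stand-ins, not from the paper:

```python
import jax
import jax.numpy as jnp

def hvp(f, w, v):
    # Hessian-vector product H v by forward-over-reverse differentiation:
    # push tangent v through the gradient of f. Cost is a small constant
    # multiple of one gradient evaluation, i.e. O(n) in the parameter count.
    return jax.jvp(jax.grad(f), (w,), (v,))[1]

def gauss_newton_vp(r, w, v):
    # Gauss-Newton product G v = J^T J v for a residual map r(w),
    # via one forward-mode pass (J v) and one reverse-mode pass (J^T u),
    # without ever materializing J or G.
    _, Jv = jax.jvp(r, (w,), (v,))
    _, pullback = jax.vjp(r, w)
    return pullback(Jv)[0]

# Illustrative objective and residuals (not from the paper).
def f(w):
    return jnp.sum(jnp.tanh(w) ** 2)

A = jnp.array([[1.0, 2.0, 0.0],
               [0.0, 1.0, 3.0]])

def r(w):
    return A @ w  # linear residuals, so J = A and G = A^T A

w = jnp.array([0.5, -1.0, 2.0])
v = jnp.array([1.0, 0.0, -1.0])
print(hvp(f, w, v))
print(gauss_newton_vp(r, w, v))
```

These products play the role of the paper's fast curvature matrix-vector products (in the spirit of Pearlmutter-style R-operator techniques); an iterative solver can then use them to approximate the corresponding second-order step one matrix-vector product at a time.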

Cite

Text

Schraudolph. "Fast Curvature Matrix-Vector Products for Second-Order Gradient Descent." Neural Computation, 2002. doi:10.1162/08997660260028683

Markdown

[Schraudolph. "Fast Curvature Matrix-Vector Products for Second-Order Gradient Descent." Neural Computation, 2002.](https://mlanthology.org/neco/2002/schraudolph2002neco-fast/) doi:10.1162/08997660260028683

BibTeX

@article{schraudolph2002neco-fast,
  title     = {{Fast Curvature Matrix-Vector Products for Second-Order Gradient Descent}},
  author    = {Schraudolph, Nicol N.},
  journal   = {Neural Computation},
  year      = {2002},
  pages     = {1723--1738},
  doi       = {10.1162/08997660260028683},
  volume    = {14},
  url       = {https://mlanthology.org/neco/2002/schraudolph2002neco-fast/}
}