Greedy Newton: Newton's Method with Exact Line Search

Abstract

A defining characteristic of Newton's method is local superlinear convergence within a neighborhood of a strict local minimum. However, outside this neighborhood Newton's method can converge slowly or even diverge. A common approach to dealing with non-convergence is to use a step size set by an Armijo backtracking line search. With suitable initialization the line search preserves local superlinear convergence, but it may give suboptimal progress when not near a solution. In this work we consider Newton's method under an exact line search, which we call "greedy Newton" (GN). We show that this leads to an improved global convergence rate while retaining a local superlinear convergence rate. We empirically show that GN can work better than backtracking Newton by allowing significantly larger step sizes.
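To make the scheme concrete, the following is a minimal Python sketch of greedy Newton, assuming gradient and Hessian oracles are available and that the one-dimensional subproblem is solved numerically with scipy.optimize.minimize_scalar. The function names, tolerances, and iteration limits are illustrative assumptions, not the authors' implementation.

import numpy as np
from scipy.optimize import minimize_scalar

def greedy_newton(f, grad, hess, x0, tol=1e-8, max_iter=100):
    # Sketch of Newton's method where the step size along the Newton
    # direction is chosen by an (approximately) exact line search.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Newton direction: solve H d = -g rather than forming the inverse Hessian.
        d = np.linalg.solve(hess(x), -g)
        # Exact line search: minimize the scalar function alpha -> f(x + alpha * d),
        # in contrast to Armijo backtracking, which only enforces sufficient decrease.
        alpha = minimize_scalar(lambda a: f(x + a * d)).x
        x = x + alpha * d
    return x

A backtracking variant would instead start from alpha = 1 and shrink the step until an Armijo sufficient-decrease condition holds; the exact search lets alpha grow well beyond 1 when the one-dimensional slice of f permits it, which is the larger-step behavior the abstract attributes to GN.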

Cite

Text

Shea and Schmidt. "Greedy Newton: Newton's Method with Exact Line Search." NeurIPS 2023 Workshops: OPT, 2023.

Markdown

[Shea and Schmidt. "Greedy Newton: Newton's Method with Exact Line Search." NeurIPS 2023 Workshops: OPT, 2023.](https://mlanthology.org/neuripsw/2023/shea2023neuripsw-greedy/)

BibTeX

@inproceedings{shea2023neuripsw-greedy,
  title     = {{Greedy Newton: Newton's Method with Exact Line Search}},
  author    = {Shea, Betty and Schmidt, Mark},
  booktitle = {NeurIPS 2023 Workshops: OPT},
  year      = {2023},
  url       = {https://mlanthology.org/neuripsw/2023/shea2023neuripsw-greedy/}
}