Almost Multisecant BFGS Quasi-Newton Method

Abstract

Quasi-Newton (QN) methods provide an alternative to second-order techniques for solving minimization problems by approximating curvature. This approach reduces computational complexity because it relies solely on first-order information while satisfying the secant condition. This paper focuses on multisecant (MS) extensions of QN methods for convex optimization problems, which enhance the Hessian approximation at low cost. Specifically, we use a low-rank perturbation strategy to construct an almost-multisecant QN method that maintains positive definiteness of the Hessian estimate, which in turn helps ensure descent directions and reduces the risk of divergence. Our results show that careful tuning of the updates greatly improves the stability and effectiveness of multisecant updates.
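As background for the abstract, the sketch below shows the standard (single-secant) BFGS update that the paper's multisecant variant builds on. It is not the paper's method; it only illustrates the two properties the abstract refers to: the updated matrix satisfies the secant condition B_new s = y, and it stays positive definite whenever the curvature condition s'y > 0 holds (the update is skipped otherwise, a common safeguard).

```python
import numpy as np

def bfgs_update(B, s, y):
    """One standard BFGS update of a Hessian approximation B.

    Satisfies the secant condition B_new @ s == y, and preserves
    positive definiteness of B whenever s @ y > 0. Illustrative only;
    the paper's almost-multisecant update is more involved.
    """
    Bs = B @ s
    sy = s @ y
    # Skip the update if the curvature condition fails, so B stays PD.
    if sy <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
        return B
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / sy

# Toy quadratic f(x) = 0.5 x^T A x, where gradient differences obey y = A s.
rng = np.random.default_rng(0)
A = np.array([[3.0, 1.0], [1.0, 2.0]])  # symmetric positive definite
B = np.eye(2)
s = rng.standard_normal(2)
y = A @ s
B = bfgs_update(B, s, y)
print(np.allclose(B @ s, y))                # secant condition holds
print(np.all(np.linalg.eigvalsh(B) > 0.0))  # B remains positive definite
```

A multisecant update instead asks the approximation to match several past secant pairs at once (B S ≈ Y for matrices S, Y of recent steps and gradient differences), which is what can break symmetry and positive definiteness and motivates the low-rank perturbation studied in the paper.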

Cite

Text

Lee and Sun. "Almost Multisecant BFGS Quasi-Newton Method." NeurIPS 2023 Workshops: OPT, 2023.

Markdown

[Lee and Sun. "Almost Multisecant BFGS Quasi-Newton Method." NeurIPS 2023 Workshops: OPT, 2023.](https://mlanthology.org/neuripsw/2023/lee2023neuripsw-almost/)

BibTeX

@inproceedings{lee2023neuripsw-almost,
  title     = {{Almost Multisecant BFGS Quasi-Newton Method}},
  author    = {Lee, Mokhwa and Sun, Yifan},
  booktitle = {NeurIPS 2023 Workshops: OPT},
  year      = {2023},
  url       = {https://mlanthology.org/neuripsw/2023/lee2023neuripsw-almost/}
}