Affine-Invariant Global Non-Asymptotic Convergence Analysis of BFGS Under Self-Concordance
Abstract
In this paper, we establish global non-asymptotic convergence guarantees for the BFGS quasi-Newton method without requiring strong convexity or Lipschitz continuity of the gradient or Hessian. Instead, we consider the setting where the objective function is strictly convex and strongly self-concordant. For an arbitrary initial point and any positive-definite initial Hessian approximation, we prove global linear and superlinear convergence guarantees for BFGS when the step size is determined by a line search scheme satisfying the weak Wolfe conditions. Moreover, all our global guarantees are affine-invariant, with the convergence rates depending solely on the initial error and the strong self-concordance constant. Our results extend the global non-asymptotic convergence theory of BFGS beyond traditional assumptions and, for the first time, establish affine-invariant convergence guarantees, aligning with the inherent affine invariance of the BFGS method.
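To make the setting concrete, below is a minimal sketch of BFGS paired with a bisection line search that enforces the weak Wolfe conditions (Armijo sufficient decrease plus the curvature condition), as described in the abstract. This is not the paper's implementation: the constants `c1`, `c2`, the stopping tolerance, and the strictly convex test objective are illustrative assumptions.

```python
import numpy as np

def weak_wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection search for a step size satisfying the weak Wolfe conditions:
    Armijo sufficient decrease and the curvature condition (0 < c1 < c2 < 1)."""
    t, lo, hi = 1.0, 0.0, np.inf
    fx, gx_d = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + t * d) > fx + c1 * t * gx_d:    # Armijo fails: step too long
            hi = t
        elif grad(x + t * d) @ d < c2 * gx_d:    # curvature fails: step too short
            lo = t
        else:
            return t
        t = 0.5 * (lo + hi) if hi < np.inf else 2.0 * t
    return t

def bfgs(f, grad, x0, H0=None, tol=1e-8, max_iter=200):
    """BFGS with the inverse-Hessian update; any positive-definite H0 works,
    matching the arbitrary-initialization setting of the paper."""
    x = x0.astype(float).copy()
    H = np.eye(len(x)) if H0 is None else H0.copy()
    g = grad(x)
    I = np.eye(len(x))
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                               # quasi-Newton direction
        t = weak_wolfe_line_search(f, grad, x, d)
        s = t * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        rho = 1.0 / (y @ s)                      # y @ s > 0 by the curvature condition
        H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
            + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Hypothetical strictly convex test objective (not from the paper):
f = lambda x: np.sum(np.exp(x) + np.exp(-x))
grad = lambda x: np.exp(x) - np.exp(-x)
print(bfgs(f, grad, x0=np.array([2.0, -1.5])))   # approaches the minimizer at 0
```

The curvature condition guarantees y @ s > 0 at every accepted step, which keeps the inverse-Hessian approximation positive definite; this is the mechanism that lets the analysis tolerate an arbitrary positive-definite initialization.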
Cite
Jin and Mokhtari. "Affine-Invariant Global Non-Asymptotic Convergence Analysis of BFGS Under Self-Concordance." Advances in Neural Information Processing Systems, 2025. https://mlanthology.org/neurips/2025/jin2025neurips-affineinvariant/

BibTeX
@inproceedings{jin2025neurips-affineinvariant,
title = {{Affine-Invariant Global Non-Asymptotic Convergence Analysis of BFGS Under Self-Concordance}},
author = {Jin, Qiujiang and Mokhtari, Aryan},
booktitle = {Advances in Neural Information Processing Systems},
year = {2025},
url = {https://mlanthology.org/neurips/2025/jin2025neurips-affineinvariant/}
}