Adaptive First-Order Methods Revisited: Convex Minimization Without Lipschitz Requirements

Abstract

We propose a new family of adaptive first-order methods for a class of convex minimization problems that may fail to be Lipschitz continuous or smooth in the standard sense. Specifically, motivated by a recent flurry of activity on non-Lipschitz (NoLips) optimization, we consider problems that are continuous or smooth relative to a reference Bregman function – as opposed to a global, ambient norm (Euclidean or otherwise). These conditions encompass a wide range of problems with singular objectives, such as Fisher markets, Poisson tomography, D-design, and the like. In this setting, the application of existing order-optimal adaptive methods – like UnixGrad or AcceleGrad – is not possible, especially in the presence of randomness and uncertainty. The proposed method, adaptive mirror descent (AdaMir), aims to close this gap by simultaneously achieving min-max optimal rates in problems that are relatively continuous or smooth, including stochastic ones.
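A minimal sketch of the setting referenced above (standard in the NoLips literature; the notation here is illustrative and not taken from the paper): given a reference Bregman function h with induced Bregman divergence D_h(y, x) = h(y) - h(x) - \langle \nabla h(x), y - x \rangle, a convex objective f is L-smooth relative to h whenever

  f(y) \le f(x) + \langle \nabla f(x), y - x \rangle + L \, D_h(y, x)   for all feasible x, y.

Taking h = \tfrac{1}{2}\|\cdot\|_2^2 recovers ordinary Lipschitz smoothness of the gradient, whereas a suitable non-Euclidean h can absorb the singularities arising in problems such as Poisson tomography. The underlying mirror-descent step relative to h is

  x_{t+1} = \arg\min_x \{ \gamma_t \langle \nabla f(x_t), x \rangle + D_h(x, x_t) \},

and the adaptive choice of the step size \gamma_t that attains the advertised order-optimal rates, without prior knowledge of L or any Lipschitz constant, is developed in the paper itself.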

Cite

Text

Antonakopoulos and Mertikopoulos. "Adaptive First-Order Methods Revisited: Convex Minimization Without Lipschitz Requirements." Neural Information Processing Systems, 2021.

Markdown

[Antonakopoulos and Mertikopoulos. "Adaptive First-Order Methods Revisited: Convex Minimization Without Lipschitz Requirements." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/antonakopoulos2021neurips-adaptive/)

BibTeX

@inproceedings{antonakopoulos2021neurips-adaptive,
  title     = {{Adaptive First-Order Methods Revisited: Convex Minimization Without Lipschitz Requirements}},
  author    = {Antonakopoulos, Kimon and Mertikopoulos, Panayotis},
  booktitle = {Neural Information Processing Systems},
  year      = {2021},
  url       = {https://mlanthology.org/neurips/2021/antonakopoulos2021neurips-adaptive/}
}