Global Linear and Local Superlinear Convergence of IRLS for Non-Smooth Robust Regression

Abstract

We advance both the theory and practice of robust $\ell_p$-quasinorm regression for $p \in (0,1]$ by using novel variants of iteratively reweighted least-squares (IRLS) to solve the underlying non-smooth problem. In the convex case, $p=1$, we prove that our IRLS variant converges globally at a linear rate under a mild, deterministic condition on the feature matrix called the stable range space property. In the non-convex case, $p\in(0,1)$, we prove that under a similar condition, IRLS converges locally to the global minimizer at a superlinear rate of order $2-p$; the rate approaches quadratic as $p\to 0$. We showcase the proposed methods in three applications: real phase retrieval, regression without correspondences, and robust face restoration. The results show that (1) IRLS can handle a larger number of outliers than other methods, (2) it is faster than competing methods at the same level of accuracy, and (3) it restores a sparsely corrupted face image with satisfactory visual quality.
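
For concreteness, the underlying problem is $\min_x \|Ax - b\|_p^p = \sum_i |a_i^\top x - b_i|^p$, which IRLS attacks by repeatedly solving weighted least-squares subproblems. Below is a minimal NumPy sketch of a generic smoothed IRLS iteration for this objective, using the standard weights $w_i = (r_i^2 + \epsilon^2)^{p/2-1}$; the geometric schedule for the smoothing parameter $\epsilon$ is an illustrative assumption, not the paper's specific update rule.

import numpy as np

def irls_lp(A, b, p=1.0, iters=50, eps=1.0):
    # Minimal sketch of smoothed IRLS for min_x ||Ax - b||_p^p, p in (0, 1].
    # Generic textbook variant; the paper's IRLS has its own smoothing-
    # parameter update rule (the halving schedule below is an assumption).
    x = np.linalg.lstsq(A, b, rcond=None)[0]   # unweighted LS initialization
    for _ in range(iters):
        r = A @ x - b
        # Smoothed weights w_i = (r_i^2 + eps^2)^(p/2 - 1); eps keeps the
        # weights finite where residuals vanish.
        w = (r**2 + eps**2) ** (p / 2 - 1)
        # Weighted least-squares subproblem via the normal equations:
        # (A^T W A) x = A^T W b.
        Aw = A * w[:, None]
        x = np.linalg.solve(A.T @ Aw, Aw.T @ b)
        eps = max(eps / 2, 1e-12)   # assumed geometric shrinkage of smoothing
    return x

# Illustrative usage: regression with 30% gross outliers.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 5))
x_true = rng.standard_normal(5)
b = A @ x_true
b[:60] += 10 * rng.standard_normal(60)   # sparse corruptions
x_hat = irls_lp(A, b, p=1.0)
print(np.linalg.norm(x_hat - x_true))    # small: outliers are down-weighted

Each iteration costs one weighted least-squares solve, which is why the global linear and local superlinear rates proved in the paper translate into fast convergence in practice.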

Cite

Text

Peng et al. "Global Linear and Local Superlinear Convergence of IRLS for Non-Smooth Robust Regression." Neural Information Processing Systems, 2022.

Markdown

[Peng et al. "Global Linear and Local Superlinear Convergence of IRLS for Non-Smooth Robust Regression." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/peng2022neurips-global/)

BibTeX

@inproceedings{peng2022neurips-global,
  title     = {{Global Linear and Local Superlinear Convergence of IRLS for Non-Smooth Robust Regression}},
  author    = {Peng, Liangzu and Kümmerle, Christian and Vidal, René},
  booktitle = {Neural Information Processing Systems},
  year      = {2022},
  url       = {https://mlanthology.org/neurips/2022/peng2022neurips-global/}
}