Distributionally Robust Linear Regression with Block Lewis Weights
Abstract
We present an algorithm for the empirical group distributionally robust (GDR) least squares problem. Given $m$ groups, a parameter vector in $\mathbb{R}^d$, and stacked design matrices and responses $\mathbf{A}$ and $\mathbf{b}$, our algorithm obtains a $(1+\varepsilon)$-multiplicative optimal solution using $\widetilde{O}(\min(\mathsf{rank}(\mathbf{A}),m)^{1/3}\varepsilon^{-2/3})$ linear system solves of matrices of the form $\mathbf{A}^{\top}\mathbf{B}\mathbf{A}$ for block-diagonal $\mathbf{B}$. Our approach builds on a recent technique that reduces the empirical GDR problem to a carefully chosen least squares problem, combined with an application of ball-oracle acceleration. In moderate accuracy regimes, our algorithm improves over all known interior point methods and matches the state-of-the-art guarantees for the special case of $\ell_{\infty}$ regression.
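To make the objective concrete, the sketch below evaluates the empirical GDR least squares loss, i.e. the worst squared residual norm over the $m$ groups, which (under the standard formulation of group DRO over the probability simplex) is the quantity the paper's algorithm approximately minimizes. The toy data, group sizes, and the ordinary-least-squares baseline are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def gdr_objective(x, groups):
    """Empirical group DRO least squares objective: the largest
    squared residual norm ||A_i x - b_i||^2 over the m groups."""
    return max(float(np.linalg.norm(A_i @ x - b_i) ** 2) for A_i, b_i in groups)

# Toy instance: m = 3 groups of 4 observations each in d = 2 dimensions.
rng = np.random.default_rng(0)
groups = [(rng.standard_normal((4, 2)), rng.standard_normal(4)) for _ in range(3)]

# Stacked design matrix A and stacked response b, as in the abstract.
A = np.vstack([A_i for A_i, _ in groups])
b = np.concatenate([b_i for _, b_i in groups])

# Ordinary least squares minimizes the *sum* of squared residuals rather
# than the worst group, so it serves only as a baseline; the paper's
# algorithm instead targets the minimax (GDR) objective directly.
x_ols, *_ = np.linalg.lstsq(A, b, rcond=None)
print(gdr_objective(x_ols, groups))
```

The GDR-optimal $x$ generally differs from `x_ols`: it trades a larger total residual for a smaller worst-group residual.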
Cite
Manoj and Patel. "Distributionally Robust Linear Regression with Block Lewis Weights." NeurIPS 2024 Workshops: OPT, 2024. https://mlanthology.org/neuripsw/2024/manoj2024neuripsw-distributionally/
BibTeX
@inproceedings{manoj2024neuripsw-distributionally,
title = {{Distributionally Robust Linear Regression with Block Lewis Weights}},
author = {Manoj, Naren Sarayu and Patel, Kumar Kshitij},
booktitle = {NeurIPS 2024 Workshops: OPT},
year = {2024},
url = {https://mlanthology.org/neuripsw/2024/manoj2024neuripsw-distributionally/}
}