Dynamic Sasvi: Strong Safe Screening for Norm-Regularized Least Squares

Abstract

A recently introduced technique, called "safe screening," for a sparse optimization problem allows us to identify irrelevant variables in the early stages of optimization. In this paper, we first propose a flexible framework for safe screening based on the Fenchel--Rockafellar duality and then derive a strong safe screening rule for norm-regularized least squares using the proposed framework. We refer to the proposed screening rule for norm-regularized least squares as "dynamic Sasvi" because it can be interpreted as a generalization of Sasvi. Unlike the original Sasvi, it does not require the exact solution of a more strongly regularized problem; hence, it works safely in practice. We show that our screening rule always eliminates more features compared with the existing state-of-the-art methods.
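As background on what a safe screening test looks like in practice, the sketch below implements the widely used gap-safe sphere test for the Lasso, not the paper's dynamic Sasvi rule. The function name and interface are illustrative; it assumes a design matrix X, response y, a current primal iterate beta, and regularization parameter lam.

```python
import numpy as np

def gap_safe_screen(X, y, beta, lam):
    """Generic gap-safe screening test for the Lasso (illustrative, not dynamic Sasvi).

    Returns a boolean mask of features that are provably zero at the optimum
    and can therefore be discarded from the problem.
    """
    residual = y - X @ beta
    # Dual feasible point: rescale the residual so that ||X^T theta||_inf <= 1.
    theta = residual / max(lam, np.max(np.abs(X.T @ residual)))
    # Primal and dual objectives of 0.5*||y - X beta||^2 + lam*||beta||_1.
    primal = 0.5 * residual @ residual + lam * np.abs(beta).sum()
    dual = 0.5 * y @ y - 0.5 * lam**2 * np.sum((theta - y / lam) ** 2)
    gap = max(primal - dual, 0.0)  # clamp tiny negative values from round-off
    # Safe sphere centered at theta with radius sqrt(2*gap)/lam.
    radius = np.sqrt(2.0 * gap) / lam
    # Feature j is inactive if |x_j^T theta| + radius * ||x_j|| < 1.
    scores = np.abs(X.T @ theta) + radius * np.linalg.norm(X, axis=0)
    return scores < 1.0
```

Columns flagged by the mask can be removed from X before further iterations, shrinking the problem without changing its solution; stronger rules such as dynamic Sasvi are designed to flag more such columns earlier.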

Cite

Text

Yamada and Yamada. "Dynamic Sasvi: Strong Safe Screening for Norm-Regularized Least Squares." Neural Information Processing Systems, 2021.

Markdown

[Yamada and Yamada. "Dynamic Sasvi: Strong Safe Screening for Norm-Regularized Least Squares." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/yamada2021neurips-dynamic/)

BibTeX

@inproceedings{yamada2021neurips-dynamic,
  title     = {{Dynamic Sasvi: Strong Safe Screening for Norm-Regularized Least Squares}},
  author    = {Yamada, Hiroaki and Yamada, Makoto},
  booktitle = {Neural Information Processing Systems},
  year      = {2021},
  url       = {https://mlanthology.org/neurips/2021/yamada2021neurips-dynamic/}
}