Dual Feature Reduction for the Sparse-Group Lasso and Its Adaptive Variant
Abstract
The sparse-group lasso performs variable and group selection simultaneously, combining the strengths of the lasso and group lasso. Its sparse-group penalty, which exploits grouping information, has made it a widespread tool in genetics, a field that regularly involves the analysis of high-dimensional data. However, the sparse-group lasso can be computationally expensive, due to the added shrinkage complexity and the additional hyperparameter that needs tuning. This paper presents a novel feature reduction method, Dual Feature Reduction (DFR), that uses strong screening rules for the sparse-group lasso and the adaptive sparse-group lasso to reduce their input space before optimization, without affecting solution optimality. DFR applies two layers of screening through the application of dual norms and subdifferentials. Synthetic and real data studies show that DFR drastically reduces the computational cost under many different scenarios.
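For context, the sparse-group penalty mentioned in the abstract is conventionally the convex combination of the lasso and group lasso penalties (following Simon et al., 2013): $\lambda\big(\alpha\|\beta\|_1 + (1-\alpha)\sum_g \sqrt{p_g}\,\|\beta_g\|_2\big)$, where $p_g$ is the size of group $g$ and $\alpha$ is the additional hyperparameter referred to above. The sketch below illustrates this standard definition; the function name and group encoding are illustrative, not taken from the paper.

```python
import numpy as np

def sgl_penalty(beta, groups, lam=1.0, alpha=0.5):
    """Standard sparse-group lasso penalty (Simon et al., 2013):
    lam * (alpha * ||beta||_1 + (1 - alpha) * sum_g sqrt(p_g) * ||beta_g||_2).
    `groups` maps each coefficient index to a group label."""
    beta = np.asarray(beta, dtype=float)
    groups = np.asarray(groups)
    l1 = np.abs(beta).sum()  # lasso part: elementwise sparsity
    l2 = sum(np.sqrt((groups == g).sum()) * np.linalg.norm(beta[groups == g])
             for g in np.unique(groups))  # group lasso part: groupwise sparsity
    return lam * (alpha * l1 + (1 - alpha) * l2)

# alpha = 1 recovers the lasso penalty; alpha = 0 recovers the group lasso penalty.
beta = [1.0, -2.0, 0.0, 3.0]
groups = [0, 0, 1, 1]
print(sgl_penalty(beta, groups, lam=1.0, alpha=1.0))  # pure lasso: 6.0
```

The mix of both norms is what yields selection at the variable and the group level at once, and also what makes screening rules for this penalty more involved than for the lasso alone.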
Cite
Text
Feser and Evangelou. "Dual Feature Reduction for the Sparse-Group Lasso and Its Adaptive Variant." Proceedings of the 42nd International Conference on Machine Learning, 2025.
Markdown
[Feser and Evangelou. "Dual Feature Reduction for the Sparse-Group Lasso and Its Adaptive Variant." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/feser2025icml-dual/)
BibTeX
@inproceedings{feser2025icml-dual,
title = {{Dual Feature Reduction for the Sparse-Group Lasso and Its Adaptive Variant}},
author = {Feser, Fabio and Evangelou, Marina},
booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
year = {2025},
pages = {17068--17099},
volume = {267},
url = {https://mlanthology.org/icml/2025/feser2025icml-dual/}
}