Structured Nonlinear Variable Selection
Abstract
We investigate structured sparsity methods for variable selection in regression problems where the target depends nonlinearly on the inputs. We focus on general nonlinear functions, without limiting the function space a priori to additive models. We propose two new regularizers based on partial derivatives as nonlinear equivalents of the group lasso and elastic net. We formulate the problem within the framework of learning in reproducing kernel Hilbert spaces and show how the variational problem can be reformulated into a more practical finite-dimensional equivalent. We develop a new algorithm derived from ADMM principles that relies solely on closed forms of the proximal operators. We explore the empirical properties of our new algorithm for Nonlinear Variable Selection based on Derivatives (NVSD) on a set of experiments and confirm favourable properties of our structured-sparsity models and the algorithm in terms of both prediction and variable-selection accuracy.
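The abstract notes that the ADMM-based algorithm relies on closed forms of the proximal operators. As a minimal illustrative sketch (not the paper's actual derivative-based regularizer, whose groups are defined over partial derivatives), the closed-form proximal operator of a group-lasso penalty is block soft-thresholding, shown here with a hypothetical grouping of coordinates:

```python
import numpy as np

def prox_group_lasso(v, groups, lam):
    """Block soft-thresholding: the closed-form proximal operator of the
    group-lasso penalty lam * sum_g ||v_g||_2, a standard building block
    in ADMM updates. The group structure here is purely illustrative."""
    out = np.zeros_like(v, dtype=float)
    for g in groups:
        norm_g = np.linalg.norm(v[g])
        # Groups with norm below the threshold are zeroed out entirely;
        # the rest are shrunk toward zero by a factor (1 - lam / norm).
        if norm_g > lam:
            out[g] = (1.0 - lam / norm_g) * v[g]
    return out

v = np.array([3.0, 4.0, 0.5, 0.1])
groups = [[0, 1], [2, 3]]
print(prox_group_lasso(v, groups, 1.0))  # first group shrunk, second zeroed
```

The second group is eliminated because its Euclidean norm falls below the threshold, which is exactly the mechanism that produces group-wise (variable-wise) sparsity.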
Cite
Text
Gregorova et al. "Structured Nonlinear Variable Selection." Conference on Uncertainty in Artificial Intelligence, 2018.
Markdown
[Gregorova et al. "Structured Nonlinear Variable Selection." Conference on Uncertainty in Artificial Intelligence, 2018.](https://mlanthology.org/uai/2018/gregorova2018uai-structured/)
BibTeX
@inproceedings{gregorova2018uai-structured,
title = {{Structured Nonlinear Variable Selection}},
author = {Gregorova, Magda and Kalousis, Alexandros and Marchand-Maillet, Stéphane},
booktitle = {Conference on Uncertainty in Artificial Intelligence},
year = {2018},
pages = {23--32},
url = {https://mlanthology.org/uai/2018/gregorova2018uai-structured/}
}