Post-Selection Inference with HSIC-Lasso
Abstract
Detecting influential features in non-linear and/or high-dimensional data is a challenging and increasingly important task in machine learning. Variable selection methods have thus been gaining much attention, as has post-selection inference. Indeed, inference on the selected features can be significantly flawed when the selection procedure is not accounted for. We propose a selective inference procedure using the model-free "HSIC-Lasso", based on the framework of truncated Gaussians combined with the polyhedral lemma. We then develop an algorithm that allows for low computational costs and provides a selection of the regularisation parameter. The performance of our method is illustrated by experiments on both artificial and real-world data, which emphasise a tight control of the type-I error, even for small sample sizes.
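For readers unfamiliar with the selection step the abstract refers to, the sketch below reconstructs the HSIC-Lasso objective that the paper builds on: each feature and the response are mapped to centred, unit-Frobenius-norm Gaussian-kernel Gram matrices, and a non-negative Lasso is fit on their vectorised forms. This is only an illustrative reconstruction, not the authors' implementation; the function names, the bandwidth choice and the use of scikit-learn's positive Lasso are assumptions made here.

```python
import numpy as np
from sklearn.linear_model import Lasso


def gaussian_kernel(x, sigma):
    """Gram matrix of the Gaussian kernel for a 1-d sample x."""
    d = x[:, None] - x[None, :]
    return np.exp(-d ** 2 / (2.0 * sigma ** 2))


def centred_normalised(K):
    """Centre a Gram matrix and scale it to unit Frobenius norm."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    return Kc / np.linalg.norm(Kc, "fro")


def hsic_lasso_select(X, y, lam):
    """Select features by solving the non-negative HSIC-Lasso problem
    0.5 * ||vec(L) - sum_k alpha_k vec(K_k)||^2 + lam * ||alpha||_1."""
    n, p = X.shape
    L = centred_normalised(gaussian_kernel(y, np.std(y) + 1e-12)).ravel()
    A = np.column_stack([
        centred_normalised(
            gaussian_kernel(X[:, j], np.std(X[:, j]) + 1e-12)
        ).ravel()
        for j in range(p)
    ])
    # scikit-learn's Lasso minimises 1/(2m) ||b - A a||^2 + alpha ||a||_1,
    # so lam is rescaled by the number of rows m = n**2.
    model = Lasso(alpha=lam / A.shape[0], positive=True,
                  fit_intercept=False, max_iter=50_000)
    model.fit(A, L)
    return np.flatnonzero(model.coef_ > 1e-10)
```

In practice the regularisation parameter lam has to be tuned (its selection is one of the points the paper's algorithm addresses), and the classification variant of HSIC-Lasso replaces the Gaussian kernel on y with a delta kernel.

The inference side rests on the polyhedral lemma: conditional on a selection event of the form {A y <= b}, a Gaussian linear statistic follows a truncated Gaussian law, which yields valid selective p-values. Below is a minimal sketch of that generic computation (in the style of Lee et al., 2016), assuming a known noise level sigma2 and an explicit (A, b) description of the event; the paper's actual construction for HSIC-Lasso statistics is more involved and is not reproduced here.

```python
from scipy.stats import norm


def polyhedral_pvalue(y, A, b, eta, sigma2):
    """One-sided selective p-value for H0: eta^T mu = 0 (vs. > 0), given
    y ~ N(mu, sigma2 * I) and the selection event {A y <= b}."""
    y, eta = np.asarray(y, float), np.asarray(eta, float)
    var = sigma2 * (eta @ eta)              # Var(eta^T y)
    c = sigma2 * eta / var
    z = y - c * (eta @ y)                   # part of y independent of eta^T y
    Ac, Az = A @ c, A @ z
    with np.errstate(divide="ignore", invalid="ignore"):
        bounds = (b - Az) / Ac
    v_lo = np.max(bounds[Ac < 0], initial=-np.inf)
    v_hi = np.min(bounds[Ac > 0], initial=np.inf)
    sd = np.sqrt(var)
    t, lo, hi = (eta @ y) / sd, v_lo / sd, v_hi / sd
    # Conditional on selection, eta^T y is Gaussian truncated to [v_lo, v_hi].
    return (norm.cdf(hi) - norm.cdf(t)) / (norm.cdf(hi) - norm.cdf(lo))
```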
Cite
Text
Freidling et al. "Post-Selection Inference with HSIC-Lasso." International Conference on Machine Learning, 2021.
Markdown
[Freidling et al. "Post-Selection Inference with HSIC-Lasso." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/freidling2021icml-postselection/)
BibTeX
@inproceedings{freidling2021icml-postselection,
title = {{Post-Selection Inference with HSIC-Lasso}},
author = {Freidling, Tobias and Poignard, Benjamin and Climente-González, Héctor and Yamada, Makoto},
booktitle = {International Conference on Machine Learning},
year = {2021},
pages = {3439--3448},
volume = {139},
url = {https://mlanthology.org/icml/2021/freidling2021icml-postselection/}
}