Error-Based Knockoffs Inference for Controlled Feature Selection
Abstract
The model-X knockoffs framework was recently proposed as a promising solution for controlled feature selection in high-dimensional, finite-sample settings. However, the model-X knockoffs procedure depends heavily on coefficient-based feature importance and addresses only control of the false discovery rate (FDR). To improve its adaptivity and flexibility, in this paper we propose an error-based knockoffs inference method that integrates knockoff features, error-based feature importance statistics, and a stepdown procedure. The proposed inference procedure does not require specifying a regression model and handles feature selection with theoretical guarantees on controlling the false discovery proportion (FDP), the FDR, or the k-familywise error rate (k-FWER). Empirical evaluations demonstrate the competitive performance of our approach on both simulated and real data.
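The abstract builds on the generic knockoff filter: given feature importance statistics W (positive when a feature beats its knockoff copy), select features whose statistic exceeds a data-dependent threshold chosen so the estimated false discovery proportion stays below a target level q. A minimal sketch of that thresholding step is below; the W values here are illustrative placeholders, not the paper's error-based statistics.

```python
import numpy as np

def knockoff_threshold(W, q):
    """Knockoff+ threshold: smallest t with estimated FDP
    (1 + #{W_j <= -t}) / #{W_j >= t} at most q."""
    candidates = np.sort(np.abs(W[W != 0]))
    for t in candidates:
        fdp_hat = (1 + np.sum(W <= -t)) / max(1, np.sum(W >= t))
        if fdp_hat <= q:
            return t
    return np.inf  # no threshold achieves the target level

# Placeholder statistics: the first five features look like true signals,
# the rest fluctuate symmetrically around zero as nulls should.
W = np.array([4.5, 3.8, 3.1, 2.6, 2.2, -0.4, 0.3, -1.1, 0.9, -0.2])
T = knockoff_threshold(W, q=0.25)
selected = np.flatnonzero(W >= T)
print(T, selected)  # → 2.2 [0 1 2 3 4]
```

The sign-symmetry of null statistics is what makes `#{W_j <= -t}` a usable stand-in for the number of false discoveries above t; the paper's contribution replaces the coefficient-based W with error-based statistics and adds stepdown-style control of FDP and k-FWER.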
Cite
Text
Zhao et al. "Error-Based Knockoffs Inference for Controlled Feature Selection." AAAI Conference on Artificial Intelligence, 2022. doi:10.1609/AAAI.V36I8.20905
Markdown
[Zhao et al. "Error-Based Knockoffs Inference for Controlled Feature Selection." AAAI Conference on Artificial Intelligence, 2022.](https://mlanthology.org/aaai/2022/zhao2022aaai-error/) doi:10.1609/AAAI.V36I8.20905
BibTeX
@inproceedings{zhao2022aaai-error,
title = {{Error-Based Knockoffs Inference for Controlled Feature Selection}},
author = {Zhao, Xuebin and Chen, Hong and Wang, Yingjie and Li, Weifu and Gong, Tieliang and Wang, Yulong and Zheng, Feng},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2022},
pages = {9190-9198},
doi = {10.1609/AAAI.V36I8.20905},
url = {https://mlanthology.org/aaai/2022/zhao2022aaai-error/}
}