Aggregation of Multiple Knockoffs

Abstract

We develop an extension of the knockoff inference procedure, introduced by Barber & Candès (2015). This new method, called Aggregation of Multiple Knockoffs (AKO), addresses the instability inherent in the random nature of knockoff-based inference. Specifically, AKO improves both the stability and power compared with the original knockoff algorithm while still maintaining guarantees for false discovery rate control. We provide a new inference procedure, prove its core properties, and demonstrate its benefits in a set of experiments on synthetic and real datasets.
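The method's core recipe is to draw multiple knockoff copies, convert each draw's knockoff statistics into intermediate p-values, combine them across draws with the quantile aggregation of Meinshausen, Meier & Bühlmann (2009), and then select variables with a Benjamini-Hochberg-type step. The Python sketch below illustrates only the aggregation and selection stages under that reading; it is not the authors' implementation, it assumes the per-draw p-values are already computed, and the function names, gamma, the number of draws, and the toy data are all placeholder choices.

import numpy as np

def quantile_aggregate(pvals, gamma=0.3):
    # pvals has shape (B, p): one row of intermediate p-values per knockoff draw.
    # Quantile aggregation: take the gamma-quantile across draws, inflate it
    # by 1/gamma, and cap at 1 so the result remains a valid p-value.
    return np.minimum(1.0, np.quantile(pvals, gamma, axis=0) / gamma)

def bh_selection(pvals, fdr=0.1):
    # Benjamini-Hochberg step-up procedure on the aggregated p-values;
    # returns the indices of the selected variables.
    p = pvals.shape[0]
    order = np.argsort(pvals)
    passed = np.nonzero(pvals[order] <= fdr * np.arange(1, p + 1) / p)[0]
    if passed.size == 0:
        return np.array([], dtype=int)
    return np.sort(order[:passed[-1] + 1])

# Toy usage (synthetic p-values, not real knockoff output): B = 25 draws
# over p = 50 variables, with the first five variables small in every draw.
rng = np.random.default_rng(0)
pvals = rng.uniform(size=(25, 50))
pvals[:, :5] = rng.uniform(high=1e-3, size=(25, 5))
print(bh_selection(quantile_aggregate(pvals), fdr=0.1))

On this toy input the first five variables carry uniformly small p-values across every draw, so the BH step on the aggregated p-values should recover them; aggregating over draws is what damps the run-to-run variability that a single knockoff draw exhibits.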

Cite

Text

Nguyen et al. "Aggregation of Multiple Knockoffs." International Conference on Machine Learning, 2020.

Markdown

[Nguyen et al. "Aggregation of Multiple Knockoffs." International Conference on Machine Learning, 2020.](https://mlanthology.org/icml/2020/nguyen2020icml-aggregation/)

BibTeX

@inproceedings{nguyen2020icml-aggregation,
  title     = {{Aggregation of Multiple Knockoffs}},
  author    = {Nguyen, Tuan-Binh and Chevalier, Jérôme-Alexis and Thirion, Bertrand and Arlot, Sylvain},
  booktitle = {International Conference on Machine Learning},
  year      = {2020},
  pages     = {7283--7293},
  volume    = {119},
  url       = {https://mlanthology.org/icml/2020/nguyen2020icml-aggregation/}
}