High Dimensional Inference in Partially Linear Models

Abstract

We propose two semiparametric versions of the debiased Lasso procedure for the model $Y_{i}=X_{i}\beta_{0}+g_{0}(Z_{i})+\varepsilon_{i}$, where the parameter vector of interest $\beta_{0}$ is high dimensional but sparse (exactly or approximately) and $g_{0}$ is an unknown nuisance function. Both versions are shown to have the same asymptotic normal distribution, and neither requires the minimal signal condition for statistical inference on any component of $\beta_{0}$. We further develop a simultaneous hypothesis testing procedure based on the multiplier bootstrap. Our testing method takes into account the dependence structure within the debiased estimates and allows the number of tested components to be exponentially high.
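The following is a rough, illustrative sketch of the kind of pipeline the abstract describes, not the authors' exact construction: it assumes a Robinson-type residualization of $Y$ and $X$ on $Z$ via kernel ridge regression, an initial Lasso fit on the residualized data, and a standard nodewise-Lasso debiasing correction. All tuning parameters, the smoother, and the data-generating process below are assumptions chosen only for demonstration.

```python
# Minimal sketch (assumptions throughout): semiparametric debiased Lasso for
# the partially linear model Y = X beta + g(Z) + eps.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
n, p = 200, 50
Z = rng.uniform(-1, 1, size=(n, 1))
X = rng.normal(size=(n, p)) + np.sin(np.pi * Z)      # covariates correlated with Z
beta0 = np.zeros(p); beta0[:3] = [1.0, -0.5, 0.75]   # sparse signal (illustrative)
Y = X @ beta0 + np.cos(np.pi * Z[:, 0]) + 0.5 * rng.normal(size=n)

def residualize(target, Z):
    """Remove the nonparametric effect of Z with a kernel ridge smoother (assumed choice)."""
    fit = KernelRidge(kernel="rbf", alpha=0.1, gamma=1.0).fit(Z, target)
    return target - fit.predict(Z)

Y_res = residualize(Y, Z)
X_res = np.column_stack([residualize(X[:, j], Z) for j in range(p)])

# Initial Lasso estimate on the residualized design.
beta_hat = Lasso(alpha=0.1, fit_intercept=False).fit(X_res, Y_res).coef_

# Nodewise Lasso: approximate row j of the precision matrix of the residualized design.
Sigma_hat = X_res.T @ X_res / n
def nodewise_row(j, lam_node=0.1):
    idx = np.delete(np.arange(p), j)
    gamma = Lasso(alpha=lam_node, fit_intercept=False).fit(X_res[:, idx], X_res[:, j]).coef_
    tau2 = Sigma_hat[j, j] - Sigma_hat[j, idx] @ gamma
    row = np.zeros(p); row[j] = 1.0; row[idx] = -gamma
    return row / tau2

Theta_hat = np.vstack([nodewise_row(j) for j in range(p)])

# Debiased estimator and a component-wise normal confidence interval.
resid = Y_res - X_res @ beta_hat
beta_debiased = beta_hat + Theta_hat @ X_res.T @ resid / n
sigma2_hat = resid @ resid / n
se = np.sqrt(sigma2_hat * np.einsum("jk,kl,jl->j", Theta_hat, Sigma_hat, Theta_hat) / n)
j = 0
print(f"beta_{j}: debiased = {beta_debiased[j]:.3f}, 95% CI = "
      f"[{beta_debiased[j] - 1.96 * se[j]:.3f}, {beta_debiased[j] + 1.96 * se[j]:.3f}]")
```

For the simultaneous test over a group of coordinates, a Gaussian multiplier bootstrap of the per-observation influence terms can approximate the null distribution of the maximum studentized statistic. The coordinate set `G`, the multiplier weights, and the studentization below are again illustrative assumptions rather than the paper's exact scheme.

```python
# Illustrative multiplier bootstrap for H_0: beta_{0,j} = 0 for all j in G.
G = np.arange(10)                                   # hypothetical set of tested coordinates
scores = (X_res @ Theta_hat.T) * resid[:, None]     # n x p per-observation influence terms
T_obs = np.max(np.abs(beta_debiased[G] / se[G]))    # observed max |t|-statistic over G
B = 500
T_boot = np.empty(B)
for b in range(B):
    e = rng.normal(size=n)                          # i.i.d. N(0,1) multiplier weights
    T_boot[b] = np.max(np.abs(e @ scores[:, G]) / n / se[G])
crit = np.quantile(T_boot, 0.95)
print(f"max |t| over G = {T_obs:.2f}, bootstrap 95% critical value = {crit:.2f}")
```

Because the bootstrap draws reuse the joint influence terms across coordinates, the critical value reflects the dependence among the debiased estimates, which is what allows the number of tested components to be very large relative to the sample size.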

Cite

Text

Zhu et al. "High Dimensional Inference in Partially Linear Models." Artificial Intelligence and Statistics, 2019.

Markdown

[Zhu et al. "High Dimensional Inference in Partially Linear Models." Artificial Intelligence and Statistics, 2019.](https://mlanthology.org/aistats/2019/zhu2019aistats-high/)

BibTeX

@inproceedings{zhu2019aistats-high,
  title     = {{High Dimensional Inference in Partially Linear Models}},
  author    = {Zhu, Ying and Yu, Zhuqing and Cheng, Guang},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2019},
  pages     = {2760-2769},
  volume    = {89},
  url       = {https://mlanthology.org/aistats/2019/zhu2019aistats-high/}
}