Non-Parametric Inference Adaptive to Intrinsic Dimension

Abstract

We consider non-parametric estimation and inference of conditional moment models in high dimensions. We show that even when the dimension $D$ of the conditioning variable is larger than the sample size $n$, estimation and inference are feasible as long as the distribution of the conditioning variable has small intrinsic dimension $d$, as measured by locally low doubling measures. Our estimation is based on a sub-sampled ensemble of the $k$-nearest neighbors ($k$-NN) $Z$-estimator. We show that if the intrinsic dimension of the covariate distribution is equal to $d$, then the finite sample estimation error of our estimator is of order $n^{-1/(d+2)}$ and our estimate is $n^{1/(d+2)}$-asymptotically normal, irrespective of $D$. The sub-sampling size required for achieving these results depends on the unknown intrinsic dimension $d$. We propose an adaptive data-driven approach for choosing this parameter and prove that it achieves the desired rates. We discuss extensions and applications to heterogeneous treatment effect estimation.
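To make the estimator described in the abstract concrete, below is a minimal illustrative sketch (not the authors' implementation) of a sub-sampled $k$-NN ensemble for the simplest conditional moment model, the conditional mean $E[Y \mid X = x_0]$: average $k$-NN estimates computed on many random sub-samples. The function name `estimate_knn_ensemble` and the parameters `k`, `subsample_size`, and `n_subsamples` are hypothetical choices for this example; in the paper the sub-sampling size is chosen adaptively to the unknown intrinsic dimension $d$, which this sketch does not attempt.

```python
# Illustrative sketch, assuming a conditional-mean moment E[Y | X = x0].
# Not the paper's general Z-estimator or its adaptive tuning procedure.
import numpy as np

def estimate_knn_ensemble(X, Y, x0, k=5, subsample_size=100, n_subsamples=200, rng=None):
    """Sub-sampled k-NN ensemble estimate of E[Y | X = x0] (hypothetical helper)."""
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    estimates = []
    for _ in range(n_subsamples):
        # Draw a sub-sample without replacement; per the paper, the sub-sample
        # size should adapt to the (unknown) intrinsic dimension d rather than
        # the ambient dimension D.
        idx = rng.choice(n, size=min(subsample_size, n), replace=False)
        Xs, Ys = X[idx], Y[idx]
        # k nearest neighbors of x0 within the sub-sample (Euclidean distance).
        dists = np.linalg.norm(Xs - x0, axis=1)
        nn = np.argsort(dists)[:k]
        estimates.append(Ys[nn].mean())
    # The ensemble estimate averages the per-sub-sample k-NN estimates.
    return float(np.mean(estimates))

# Synthetic usage example: ambient dimension D = 50, but X lies on a
# 2-dimensional subspace, so the intrinsic dimension is small.
rng = np.random.default_rng(0)
Z = rng.normal(size=(500, 2))        # latent low-dimensional coordinates
A = rng.normal(size=(2, 50))         # embedding into ambient dimension D = 50
X = Z @ A
Y = np.sin(Z[:, 0]) + 0.1 * rng.normal(size=500)
x0 = Z[0] @ A
print(estimate_knn_ensemble(X, Y, x0, k=5, subsample_size=100, n_subsamples=200))
```

The synthetic data illustrate the setting of the abstract: the conditioning variable has ambient dimension $D = 50$, yet the estimator's accuracy is governed by the low intrinsic dimension of the covariate distribution.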

Cite

Text

Khosravi et al. "Non-Parametric Inference Adaptive to Intrinsic Dimension." Proceedings of the First Conference on Causal Learning and Reasoning, 2022.

Markdown

[Khosravi et al. "Non-Parametric Inference Adaptive to Intrinsic Dimension." Proceedings of the First Conference on Causal Learning and Reasoning, 2022.](https://mlanthology.org/clear/2022/khosravi2022clear-nonparametric/)

BibTeX

@inproceedings{khosravi2022clear-nonparametric,
  title     = {{Non-Parametric Inference Adaptive to Intrinsic Dimension}},
  author    = {Khosravi, Khashayar and Lewis, Greg and Syrgkanis, Vasilis},
  booktitle = {Proceedings of the First Conference on Causal Learning and Reasoning},
  year      = {2022},
  pages     = {373-389},
  volume    = {177},
  url       = {https://mlanthology.org/clear/2022/khosravi2022clear-nonparametric/}
}