Variance Estimation After Kernel Ridge Regression Imputation

Abstract

Imputation is a popular technique for handling missing data, and variance estimation after imputation is an important practical problem in statistics. In this paper, we consider variance estimation of the imputed mean estimator under kernel ridge regression imputation. We adopt a linearization approach that employs the covariate balancing idea to estimate the inverse of the propensity scores. We study the statistical guarantees of the proposed variance estimator when a Sobolev space is used for the imputation, where $\sqrt{n}$-consistency can be obtained. Synthetic data experiments are presented to confirm our theory.
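The sketch below illustrates the first ingredient described in the abstract: fitting kernel ridge regression on the observed units, imputing the missing responses with the fitted regression function, and forming the imputed mean estimator. It is a minimal illustration, not the authors' implementation: it assumes a Gaussian kernel (the paper works with a Sobolev-space kernel), and the function names, toy data, and tuning constants (`krr_impute_mean`, `lam`, `bandwidth`) are all illustrative choices.

```python
import numpy as np

def gaussian_kernel(X, Z, bandwidth=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def krr_impute_mean(X, y, observed, lam=0.1, bandwidth=1.0):
    """Fit kernel ridge regression on the observed units, impute the
    missing responses with the fitted function, and return the
    imputed mean estimator together with the fitted values."""
    Xo, yo = X[observed], y[observed]
    n_obs = Xo.shape[0]
    K = gaussian_kernel(Xo, Xo, bandwidth)
    # KRR coefficients: (K + n*lam*I)^{-1} y, i.e. ridge penalty on the RKHS norm
    coef = np.linalg.solve(K + n_obs * lam * np.eye(n_obs), yo)
    # Fitted regression function evaluated at every unit (observed and missing)
    m_hat = gaussian_kernel(X, Xo, bandwidth) @ coef
    # Imputed data: observed outcomes where available, KRR predictions otherwise
    y_imp = np.where(observed, y, m_hat)
    return y_imp.mean(), m_hat

# Toy example with responses missing at random given x
rng = np.random.default_rng(0)
n = 500
X = rng.uniform(-2, 2, size=(n, 1))
y = np.sin(np.pi * X[:, 0]) + rng.normal(scale=0.3, size=n)
prop = 1.0 / (1.0 + np.exp(-X[:, 0]))      # response probability
observed = rng.uniform(size=n) < prop
theta_hat, m_hat = krr_impute_mean(X, y, observed)
print("imputed mean estimator:", theta_hat)
```

The variance estimation step in the paper then linearizes this estimator and replaces the inverse propensity scores with covariate-balancing weights; that step is not reproduced here.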

Cite

Text

Wang and Kim. "Variance Estimation After Kernel Ridge Regression Imputation." ICML 2020 Workshops: Artemiss, 2020.

Markdown

[Wang and Kim. "Variance Estimation After Kernel Ridge Regression Imputation." ICML 2020 Workshops: Artemiss, 2020.](https://mlanthology.org/icmlw/2020/wang2020icmlw-variance/)

BibTeX

@inproceedings{wang2020icmlw-variance,
  title     = {{Variance Estimation After Kernel Ridge Regression Imputation}},
  author    = {Wang, Hengfang and Kim, Jae Kwang},
  booktitle = {ICML 2020 Workshops: Artemiss},
  year      = {2020},
  url       = {https://mlanthology.org/icmlw/2020/wang2020icmlw-variance/}
}