Fast Instrument Learning with Faster Rates
Abstract
We investigate nonlinear instrumental variable (IV) regression given high-dimensional instruments. We propose a simple algorithm that combines kernelized IV methods with an arbitrary, adaptive regression algorithm, accessed as a black box. Our algorithm enjoys a faster convergence rate and adapts to the dimensionality of informative latent features, while avoiding the expensive minimax optimization procedure that has previously been necessary to establish similar guarantees. It further brings the benefit of flexible machine learning models to quasi-Bayesian uncertainty quantification, likelihood-based model selection, and model averaging. Simulation studies demonstrate the competitive performance of our method.
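To make the high-level description concrete, below is a minimal, hypothetical sketch (not the paper's algorithm) of the general recipe the abstract describes: a kernelized first stage mapping instruments to the treatment, followed by an adaptive regressor accessed only through its fit/predict interface. It is a naive plug-in two-stage procedure on simulated data; the kernel choice, hyperparameters, and the |x| structural function are illustrative assumptions, and the sketch carries none of the paper's faster-rate guarantees.

# Illustrative sketch only: kernel first stage + black-box second stage,
# not the estimator proposed in the paper.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Simulated data with endogeneity: instrument z, treatment x, outcome y.
n = 2000
z = rng.normal(size=(n, 1))                                   # instrument
u = rng.normal(size=(n, 1))                                   # unobserved confounder
x = z + u + 0.1 * rng.normal(size=(n, 1))                     # endogenous treatment
y = np.abs(x).ravel() + u.ravel() + 0.1 * rng.normal(size=n)  # outcome, f(x) = |x| (assumed)

# Stage 1: kernelized regression of the treatment on the instruments,
# estimating the conditional expectation E[X | Z].
stage1 = KernelRidge(kernel="rbf", alpha=1e-2, gamma=1.0)
stage1.fit(z, x.ravel())
x_hat = stage1.predict(z).reshape(-1, 1)

# Stage 2: an arbitrary black-box regressor of the outcome on the fitted treatment;
# any fit/predict learner could be swapped in here.
stage2 = GradientBoostingRegressor(n_estimators=200, max_depth=3)
stage2.fit(x_hat, y)

# Evaluate the learned structural function on a grid of treatment values.
x_grid = np.linspace(-3, 3, 50).reshape(-1, 1)
f_hat = stage2.predict(x_grid)
print(f_hat[:5])

The point of the interface, as in the abstract, is that the second-stage learner is treated purely as a black box; this sketch mirrors that interface but makes no claim to the consistency or rate results established in the paper.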
Cite
Text
Wang et al. "Fast Instrument Learning with Faster Rates." Neural Information Processing Systems, 2022.
Markdown
[Wang et al. "Fast Instrument Learning with Faster Rates." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/wang2022neurips-fast/)
BibTeX
@inproceedings{wang2022neurips-fast,
title = {{Fast Instrument Learning with Faster Rates}},
author = {Wang, Ziyu and Zhou, Yuhao and Zhu, Jun},
booktitle = {Neural Information Processing Systems},
year = {2022},
url = {https://mlanthology.org/neurips/2022/wang2022neurips-fast/}
}