Automated Efficient Estimation Using Monte Carlo Efficient Influence Functions

Abstract

Many practical problems involve estimating low-dimensional statistical quantities with high-dimensional models and datasets. Several approaches address these estimation tasks based on the theory of influence functions, such as debiased/double ML or targeted minimum loss estimation. We introduce \textit{Monte Carlo Efficient Influence Functions} (MC-EIF), a fully automated technique for approximating efficient influence functions that integrates seamlessly with existing differentiable probabilistic programming systems. MC-EIF automates efficient statistical estimation for a broad class of models and functionals that previously required rigorous custom analysis. We prove that MC-EIF is consistent, and that estimators using MC-EIF achieve optimal $\sqrt{N}$ convergence rates. We show empirically that estimators using MC-EIF are at parity with estimators using analytic EIFs. Finally, we present a novel capstone example using MC-EIF for optimal portfolio selection.
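As a minimal illustration of the kind of influence-function-based estimation the abstract refers to (not the paper's MC-EIF method itself, which approximates the EIF automatically), the sketch below shows the classic one-step (debiased) correction for the simplest functional, the mean $\psi(P) = \mathbb{E}[X]$, whose efficient influence function is known analytically to be $\varphi(x;\theta) = x - \theta$. All variable names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=500)

# Crude initial (plug-in) estimate of the functional psi(P) = E[X].
theta_init = np.median(data)

# For the mean functional, the efficient influence function is
# phi(x; theta) = x - theta (known in closed form for this toy case;
# MC-EIF would approximate such a function automatically for general models).
def eif(x, theta):
    return x - theta

# One-step estimator: plug-in estimate plus the empirical mean of the EIF.
theta_onestep = theta_init + eif(data, theta_init).mean()
```

For the mean functional, the one-step correction recovers the sample mean exactly: `theta_init + mean(x - theta_init) = mean(x)`. For nonlinear functionals of high-dimensional models, the EIF is rarely available in closed form, which is the gap MC-EIF addresses.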

Cite

Text

Agrawal et al. "Automated Efficient Estimation Using Monte Carlo Efficient Influence Functions." Neural Information Processing Systems, 2024. doi:10.52202/079017-0513

Markdown

[Agrawal et al. "Automated Efficient Estimation Using Monte Carlo Efficient Influence Functions." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/agrawal2024neurips-automated/) doi:10.52202/079017-0513

BibTeX

@inproceedings{agrawal2024neurips-automated,
  title     = {{Automated Efficient Estimation Using Monte Carlo Efficient Influence Functions}},
  author    = {Agrawal, Raj and Witty, Sam and Zane, Andy and Bingham, Eli},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-0513},
  url       = {https://mlanthology.org/neurips/2024/agrawal2024neurips-automated/}
}