High Dimensional EM Algorithm: Statistical Optimization and Asymptotic Normality

Abstract

We provide a general theory of the expectation-maximization (EM) algorithm for inferring high dimensional latent variable models. In particular, we make two contributions: (i) For parameter estimation, we propose a novel high dimensional EM algorithm which naturally incorporates sparsity structure into parameter estimation. With an appropriate initialization, this algorithm converges at a geometric rate and attains an estimator with the (near-)optimal statistical rate of convergence. (ii) Based on the obtained estimator, we propose a new inferential procedure for testing hypotheses about low dimensional components of high dimensional parameters. For a broad family of statistical models, our framework establishes the first computationally feasible approach for optimal estimation and asymptotic inference in high dimensions.
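The abstract does not spell out the update rule, but one standard way to incorporate sparsity into an EM iteration is to follow each M-step with a truncation (hard-thresholding) step that keeps only the s largest-magnitude coordinates. The Python sketch below illustrates this idea for a symmetric two-component Gaussian mixture y_i = z_i * beta + noise with z_i in {-1, +1}; the model choice, the function names (trunc, sparse_em_gmm), and the sparsity level s are illustrative assumptions, not the paper's exact procedure.

import numpy as np

def trunc(beta, s):
    # Hard thresholding: keep the s largest-magnitude entries, zero out the rest.
    out = np.zeros_like(beta)
    keep = np.argsort(np.abs(beta))[-s:]
    out[keep] = beta[keep]
    return out

def sparse_em_gmm(Y, beta0, s, sigma=1.0, n_iters=50):
    # Illustrative sparsity-constrained EM for the mixture y_i = z_i * beta + N(0, sigma^2 I).
    # Each iteration: E-step (posterior weights), M-step (closed-form update),
    # then a truncation step enforcing s-sparsity. This is a sketch, not the paper's algorithm.
    beta = trunc(beta0, s)
    for _ in range(n_iters):
        # E-step: posterior probability that z_i = +1 given y_i and the current beta
        w = 1.0 / (1.0 + np.exp(-np.clip(2.0 * (Y @ beta) / sigma**2, -30.0, 30.0)))
        # M-step: closed-form maximizer of the expected complete-data log-likelihood
        beta_full = ((2.0 * w - 1.0) @ Y) / Y.shape[0]
        # Truncation step: project onto the set of s-sparse vectors
        beta = trunc(beta_full, s)
    return beta

# Toy usage: d = 200 dimensions, true parameter supported on 5 coordinates
rng = np.random.default_rng(0)
d, n, s = 200, 1000, 5
beta_star = np.zeros(d)
beta_star[:s] = 2.0
z = rng.choice([-1.0, 1.0], size=n)
Y = z[:, None] * beta_star + rng.normal(size=(n, d))
beta_hat = sparse_em_gmm(Y, beta0=beta_star + 0.5 * rng.normal(size=d), s=s)
print(np.linalg.norm(beta_hat - beta_star))

In the toy usage, the initialization is taken close to the true parameter, mirroring the abstract's requirement of an "appropriate initialization" for geometric convergence.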

Cite

Text

Wang et al. "High Dimensional EM Algorithm: Statistical Optimization and Asymptotic Normality." Neural Information Processing Systems, 2015.

Markdown

[Wang et al. "High Dimensional EM Algorithm: Statistical Optimization and Asymptotic Normality." Neural Information Processing Systems, 2015.](https://mlanthology.org/neurips/2015/wang2015neurips-high/)

BibTeX

@inproceedings{wang2015neurips-high,
  title     = {{High Dimensional EM Algorithm: Statistical Optimization and Asymptotic Normality}},
  author    = {Wang, Zhaoran and Gu, Quanquan and Ning, Yang and Liu, Han},
  booktitle = {Neural Information Processing Systems},
  year      = {2015},
  pages     = {2521--2529},
  url       = {https://mlanthology.org/neurips/2015/wang2015neurips-high/}
}