Minimax Estimation of Laplacian Constrained Precision Matrices

Abstract

This paper considers the problem of high-dimensional sparse precision matrix estimation under Laplacian constraints. We prove that the Laplacian constraints bring favorable properties for estimation: the Gaussian maximum likelihood estimator exists and is unique almost surely on the basis of a single observation, irrespective of the dimension. We establish the optimal rate of convergence under the Frobenius norm by deriving the minimax lower and upper bounds. The minimax lower bound is obtained by applying Le Cam-Assouad's method with a novel construction of a subparameter space of multivariate normal distributions. The minimax upper bound is established by designing an adaptive $\ell_1$-norm regularized maximum likelihood estimation method and quantifying its rate of convergence. We prove that the proposed estimator attains the optimal rate of convergence with overwhelming probability. Numerical experiments demonstrate the effectiveness of the proposed estimator.
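To make the estimation problem concrete, below is a minimal sketch (not the authors' implementation) of $\ell_1$-regularized Gaussian maximum likelihood estimation under Laplacian constraints, solved by projected gradient descent on the nonnegative edge weights. The helper names (laplacian_op, laplacian_adjoint, fit_laplacian_precision), the step size, and the iteration count are illustrative assumptions; the adaptive reweighting of the $\ell_1$ penalty proposed in the paper is omitted.

import numpy as np

def laplacian_op(w, p):
    """Map edge weights w (length p*(p-1)/2) to a combinatorial Laplacian."""
    L = np.zeros((p, p))
    iu = np.triu_indices(p, k=1)
    L[iu] = -w
    L = L + L.T
    np.fill_diagonal(L, -L.sum(axis=1))
    return L

def laplacian_adjoint(M):
    """Adjoint of laplacian_op: entry for edge (i, j) is M_ii + M_jj - M_ij - M_ji."""
    d = np.diag(M)
    iu = np.triu_indices(M.shape[0], k=1)
    return d[iu[0]] + d[iu[1]] - M[iu] - M.T[iu]

def fit_laplacian_precision(S, lam=0.1, step=1e-2, n_iter=5000):
    """Minimize  tr(S L(w)) - log det(L(w) + J) + lam * sum(w)  over w >= 0,
    where J = (1/p) 11^T accounts for the rank deficiency of the Laplacian
    and the l1 penalty reduces to lam * sum(w) because w is nonnegative."""
    p = S.shape[0]
    J = np.ones((p, p)) / p
    w = np.full(p * (p - 1) // 2, 1.0 / p)       # strictly positive start (connected graph)
    for _ in range(n_iter):
        L = laplacian_op(w, p)
        inv = np.linalg.inv(L + J)
        grad = laplacian_adjoint(S - inv) + lam  # gradient of the penalized negative log-likelihood
        w = np.maximum(w - step * grad, 0.0)     # projection onto the nonnegative orthant
    return laplacian_op(w, p)

# Usage on a small synthetic problem (illustrative only)
rng = np.random.default_rng(0)
p, n = 10, 30
X = rng.standard_normal((n, p))
S = X.T @ X / n                                  # sample covariance
Theta_hat = fit_laplacian_precision(S, lam=0.2)

In practice, the paper's adaptive penalty reweights each edge using an initial estimate, and a backtracking line search would keep L(w) + J nonsingular along the iterates; both refinements are left out of this sketch for brevity.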

Cite

Text

Ying et al. "Minimax Estimation of Laplacian Constrained Precision Matrices." Artificial Intelligence and Statistics, 2021.

Markdown

[Ying et al. "Minimax Estimation of Laplacian Constrained Precision Matrices." Artificial Intelligence and Statistics, 2021.](https://mlanthology.org/aistats/2021/ying2021aistats-minimax/)

BibTeX

@inproceedings{ying2021aistats-minimax,
  title     = {{Minimax Estimation of Laplacian Constrained Precision Matrices}},
  author    = {Ying, Jiaxi and Miranda Cardoso, José and Palomar, Daniel},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2021},
  pages     = {3736--3744},
  volume    = {130},
  url       = {https://mlanthology.org/aistats/2021/ying2021aistats-minimax/}
}