High-Dimensional Location Estimation via Norm Concentration for Subgamma Vectors

Abstract

In location estimation, we are given $n$ samples from a known distribution $f$ shifted by an unknown translation $\lambda$, and want to estimate $\lambda$ as precisely as possible. Asymptotically, the maximum likelihood estimate achieves the Cramér-Rao bound of error $\mathcal N(0, \frac{1}{n\mathcal I})$, where $\mathcal I$ is the Fisher information of $f$. However, the $n$ required for convergence depends on $f$, and may be arbitrarily large. We build on the theory using smoothed estimators to bound the error for finite $n$ in terms of $\mathcal I_r$, the Fisher information of the $r$-smoothed distribution. As $n \to \infty$, $r \to 0$ at an explicit rate and this converges to the Cramér-Rao bound. We (1) improve the prior work for 1-dimensional $f$ to converge for constant failure probability in addition to high probability, and (2) extend the theory to high-dimensional distributions. In the process, we prove a new bound on the norm of a high-dimensional random variable whose 1-dimensional projections are subgamma, which may be of independent interest.
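To make the setup concrete, here is a minimal 1-dimensional sketch of the idea behind smoothed maximum likelihood location estimation: draw samples from a known base density (a Laplace here, chosen only for illustration) shifted by an unknown $\lambda$, form the $r$-smoothed density $f_r = f * \mathcal N(0, r^2)$ by Monte Carlo, and maximize the smoothed log-likelihood over a grid of candidate shifts. The constants, grid, and smoothing level $r$ are illustrative assumptions, not the paper's algorithm or rates.

```python
import numpy as np

rng = np.random.default_rng(0)

def laplace_pdf(x):
    # base density f: standard Laplace, location 0, scale 1
    return 0.5 * np.exp(-np.abs(x))

def smoothed_pdf(x, r, z):
    # f_r(x) = E_z[f(x - r*z)] with z ~ N(0, 1), approximated
    # by averaging over a fixed set of Monte Carlo draws z
    return laplace_pdf(x[..., None] - r * z).mean(axis=-1)

def smoothed_mle(samples, r, grid, z):
    # maximize the r-smoothed log-likelihood over candidate shifts
    ll = np.array([np.log(smoothed_pdf(samples - lam, r, z)).sum()
                   for lam in grid])
    return grid[np.argmax(ll)]

true_lambda = 1.7
n = 500
samples = true_lambda + rng.laplace(size=n)   # f shifted by lambda
z = rng.standard_normal(200)                  # shared smoothing draws
grid = np.linspace(0.5, 3.0, 301)
est = smoothed_mle(samples, r=0.3, grid=grid, z=z)
print(abs(est - true_lambda))  # error should be small for this n
```

In the theory above, the error of such an estimator is controlled by $\mathcal I_r$, the Fisher information of the smoothed distribution, and letting $r \to 0$ as $n \to \infty$ recovers the Cramér-Rao bound; this sketch fixes $r$ for simplicity.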

Cite

Text

Gupta et al. "High-Dimensional Location Estimation via Norm Concentration for Subgamma Vectors." International Conference on Machine Learning, 2023.

Markdown

[Gupta et al. "High-Dimensional Location Estimation via Norm Concentration for Subgamma Vectors." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/gupta2023icml-highdimensional/)

BibTeX

@inproceedings{gupta2023icml-highdimensional,
  title     = {{High-Dimensional Location Estimation via Norm Concentration for Subgamma Vectors}},
  author    = {Gupta, Shivam and Lee, Jasper C.H. and Price, Eric},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {12132--12164},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/gupta2023icml-highdimensional/}
}