Sharp Optimality of Simple, Plug-in Estimation of the Fisher Information of a Smoothed Density
Abstract
Given independent and identically distributed data from a compactly supported, $\alpha$-Hölder density $f$, we study estimation of the Fisher information of the Gaussian-smoothed density $f*\varphi_t$, where $\varphi_t$ is the density of $N(0, t)$. We derive the minimax rate, including the sharp dependence on $t$, and show that some simple, plug-in-type estimators are optimal for every $t > 0$, even though extra debiasing steps are widely employed in the literature to achieve the sharp rate in the unsmoothed ($t = 0$) case. Because our result sharply characterizes the scaling in $t$, plug-in estimators of the mutual information and entropy are shown to achieve the parametric rate by way of the I-MMSE and de Bruijn identities.
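The paper's exact estimator is not reproduced here, but the plug-in idea can be illustrated with a minimal sketch: form the empirical convolution $\hat g(x) = \frac{1}{n}\sum_i \varphi_t(x - X_i)$, differentiate it in closed form, and numerically integrate $(\hat g')^2/\hat g$ to estimate the Fisher information $I(f*\varphi_t)$. All function names, the grid width, and the floor on $\hat g$ below are hypothetical numerical choices, not the paper's construction.

```python
import numpy as np

def smoothed_density_and_derivative(x, samples, t):
    # Empirical convolution g_hat(x) = (1/n) sum_i phi_t(x - X_i),
    # where phi_t is the N(0, t) density; its derivative in x is
    # obtained by differentiating phi_t directly.
    d = x[:, None] - samples[None, :]
    phi = np.exp(-d**2 / (2.0 * t)) / np.sqrt(2.0 * np.pi * t)
    g = phi.mean(axis=1)
    gp = (-d / t * phi).mean(axis=1)
    return g, gp

def plugin_fisher_information(samples, t, grid_size=2000, pad=5.0):
    # Plug-in estimate of I(f * phi_t) = int (g')^2 / g dx via
    # trapezoidal quadrature on a grid covering the effective support.
    lo = samples.min() - pad * np.sqrt(t)
    hi = samples.max() + pad * np.sqrt(t)
    x = np.linspace(lo, hi, grid_size)
    g, gp = smoothed_density_and_derivative(x, samples, t)
    # Floor g away from zero to keep the integrand numerically stable
    # in the far tails (an ad hoc choice for this sketch).
    integrand = np.where(g > 1e-12, gp**2 / np.maximum(g, 1e-12), 0.0)
    dx = x[1] - x[0]
    return (integrand[:-1] + integrand[1:]).sum() * dx / 2.0
```

As a sanity check, if $f = N(0, 1)$ then $f*\varphi_t = N(0, 1 + t)$, whose Fisher information is $1/(1+t)$, so with $t = 1$ the estimate should be near $1/2$ for moderate sample sizes.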
Cite
Kotekal. "Sharp Optimality of Simple, Plug-in Estimation of the Fisher Information of a Smoothed Density." Proceedings of the 42nd International Conference on Machine Learning, 2025. https://mlanthology.org/icml/2025/kotekal2025icml-sharp/
@inproceedings{kotekal2025icml-sharp,
title = {{Sharp Optimality of Simple, Plug-in Estimation of the Fisher Information of a Smoothed Density}},
author = {Kotekal, Subhodh},
booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
year = {2025},
pages = {31590--31618},
volume = {267},
url = {https://mlanthology.org/icml/2025/kotekal2025icml-sharp/}
}