Lower Bounds and Aggregation in Density Estimation

Abstract

In this paper we prove the optimality of an aggregation procedure. We establish lower bounds for model selection type aggregation of M density estimators with respect to the Kullback-Leibler divergence (KL), the Hellinger distance and the L1-distance. The lower bound for the KL divergence can be achieved by the on-line type estimate suggested, among others, by Yang (2000a). Combining these results, we conclude that (log M)/n is an optimal rate of aggregation in the sense of Tsybakov (2003), where n is the sample size.
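For context, here is a minimal sketch of what model selection type aggregation asks for; the notation (true density f, candidate estimators f_1, ..., f_M, aggregate \tilde{f}_n, constant C) is assumed for illustration rather than quoted from the paper:

\[
  \mathbb{E}\,\mathrm{KL}\bigl(f, \tilde{f}_n\bigr)
  \;\le\; \min_{1 \le j \le M} \mathrm{KL}\bigl(f, f_j\bigr)
  \;+\; C\,\frac{\log M}{n}.
\]

The aggregate is required to mimic the best of the M candidates up to a residual term, and the lower bounds of the paper show that this residual term cannot be made smaller than order (log M)/n uniformly over the densities considered, which is the sense in which (log M)/n is an optimal rate of aggregation.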

Cite

Text

Lecué. "Lower Bounds and Aggregation in Density Estimation." Journal of Machine Learning Research, 2006.

Markdown

[Lecué. "Lower Bounds and Aggregation in Density Estimation." Journal of Machine Learning Research, 2006.](https://mlanthology.org/jmlr/2006/lecue2006jmlr-lower/)

BibTeX

@article{lecue2006jmlr-lower,
  title     = {{Lower Bounds and Aggregation in Density Estimation}},
  author    = {Lecué, Guillaume},
  journal   = {Journal of Machine Learning Research},
  year      = {2006},
  pages     = {971-981},
  volume    = {7},
  url       = {https://mlanthology.org/jmlr/2006/lecue2006jmlr-lower/}
}