Risk Bounds for Mixture Density Estimation on Compact Domains via the H-Lifted Kullback–Leibler Divergence

Abstract

We consider the problem of estimating probability density functions from sample data using a finite mixture of densities from some component class. To this end, we introduce the $h$-lifted Kullback–Leibler (KL) divergence as a generalization of the standard KL divergence and a criterion for conducting risk minimization. Under a compact support assumption, we prove an $\mathcal{O}(1/\sqrt{n})$ bound on the expected estimation error when using the $h$-lifted KL divergence, which extends the results of Rakhlin et al. (2005, ESAIM: Probability and Statistics, Vol. 9) and Li & Barron (1999, Advances in Neural Information Processing Systems, Vol. 12) to permit the risk bounding of density functions that are not strictly positive. We develop a procedure for computing the corresponding maximum $h$-lifted likelihood estimators ($h$-MLLEs) using the Majorization-Maximization framework and provide experimental results in support of our theoretical bounds.
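As a rough sketch (an assumption based on the abstract's description, not a statement of the paper's exact definition), the "lifting" is typically realized by adding a fixed, strictly positive function $h$ to both densities before forming the usual KL integrand:

$$ D_h(f \,\|\, g) \;=\; \int (f + h) \,\log\frac{f + h}{g + h} \, d\mu. $$

In this form, taking $h \equiv 0$ formally recovers the standard KL divergence, while a strictly positive $h$ keeps the logarithm's argument bounded away from zero even when $f$ or $g$ vanishes on parts of the compact support; the precise definition and conditions on $h$ used by the authors may differ.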

Cite

Text

Chong et al. "Risk Bounds for Mixture Density Estimation on Compact Domains via the H-Lifted Kullback–Leibler Divergence." Transactions on Machine Learning Research, 2024.

Markdown

[Chong et al. "Risk Bounds for Mixture Density Estimation on Compact Domains via the H-Lifted Kullback–Leibler Divergence." Transactions on Machine Learning Research, 2024.](https://mlanthology.org/tmlr/2024/chong2024tmlr-risk/)

BibTeX

@article{chong2024tmlr-risk,
  title     = {{Risk Bounds for Mixture Density Estimation on Compact Domains via the H-Lifted Kullback--Leibler Divergence}},
  author    = {Chong, Mark Chiu and Nguyen, Hien Duy and Nguyen, TrungTin},
  journal   = {Transactions on Machine Learning Research},
  year      = {2024},
  url       = {https://mlanthology.org/tmlr/2024/chong2024tmlr-risk/}
}