Bounds on $L_p$ Errors in Density Ratio Estimation via $f$-Divergence Loss Functions

Abstract

Density ratio estimation (DRE) is a fundamental machine learning technique for capturing relationships between two probability distributions. $f$-divergence loss functions, derived from variational representations of $f$-divergences, have become a standard choice in DRE and achieve state-of-the-art performance. This study provides novel theoretical insights into DRE by deriving upper and lower bounds on the $L_p$ errors obtained through $f$-divergence loss functions. These bounds apply to any estimator in a class of Lipschitz continuous estimators, irrespective of the specific $f$-divergence loss function employed, and are expressed as a product involving the data dimensionality and the expected value of the density ratio raised to the $p$-th power. Notably, the lower bound contains an exponential term that depends on the Kullback–Leibler (KL) divergence, revealing that, for $p > 1$, the $L_p$ error grows substantially as the KL divergence increases, and that this growth becomes more pronounced for larger $p$. The theoretical insights are validated through numerical experiments.
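
For context, here is a minimal background sketch of the quantities referenced above, written in standard notation that is chosen for illustration and need not match the paper's (the symbols $p_{\mathrm{nu}}$, $p_{\mathrm{de}}$, $T$, and $f^{*}$ are assumptions). An $f$-divergence loss is typically built from the variational representation of the $f$-divergence, and the $L_p$ error measures the discrepancy between the estimated and true density ratios:

$$
r(x) = \frac{p_{\mathrm{nu}}(x)}{p_{\mathrm{de}}(x)}, \qquad
D_f(P \,\|\, Q) = \mathbb{E}_{Q}\!\big[ f(r(x)) \big]
= \sup_{T} \Big\{ \mathbb{E}_{P}[T(x)] - \mathbb{E}_{Q}\big[ f^{*}(T(x)) \big] \Big\},
$$

where $P$ and $Q$ have densities $p_{\mathrm{nu}}$ and $p_{\mathrm{de}}$, $f^{*}$ is the convex conjugate of $f$, and the supremum is attained at $T^{*} = f'(r)$. Minimizing the negated bracketed objective over a model class yields an $f$-divergence loss whose optimizer recovers the density ratio. The $L_p$ error of an estimator $\hat{r}$ (here taken with respect to $Q$) is

$$
\|\hat{r} - r\|_{L_p(Q)}^{p} = \mathbb{E}_{Q}\big[ \,|\hat{r}(x) - r(x)|^{p}\, \big].
$$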

Cite

Text

Kitazawa. "Bounds on $L_p$ Errors in Density Ratio Estimation via $f$-Divergence Loss Functions." International Conference on Learning Representations, 2025.

Markdown

[Kitazawa. "Bounds on $L_p$ Errors in Density Ratio Estimation via $f$-Divergence Loss Functions." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/kitazawa2025iclr-bounds/)

BibTeX

@inproceedings{kitazawa2025iclr-bounds,
  title     = {{Bounds on $L_p$ Errors in Density Ratio Estimation via $f$-Divergence Loss Functions}},
  author    = {Kitazawa, Yoshiaki},
  booktitle = {International Conference on Learning Representations},
  year      = {2025},
  url       = {https://mlanthology.org/iclr/2025/kitazawa2025iclr-bounds/}
}