Generalized Multi-View Model: Adaptive Density Estimation Under Low-Rank Constraints

Abstract

We study the problem of estimating a bivariate probability distribution, either discrete or continuous, under low-rank constraints. For discrete distributions, we assume that the two-dimensional array to estimate is a low-rank probability matrix. In the continuous case, we assume that the density with respect to the Lebesgue measure satisfies a generalized multi-view model, meaning that it is $\beta$-Hölder and can be decomposed as a sum of $K$ components, each of which is a product of one-dimensional functions. In both settings, we propose estimators that achieve, up to logarithmic factors, the minimax optimal convergence rates under such low-rank constraints. In the discrete case, the proposed estimator is adaptive to the rank $K$. In the continuous case, our estimator converges with the $L_1$ rate $\min((K/n)^{\beta/(2\beta+1)}, n^{-\beta/(2\beta+2)})$ up to logarithmic factors, and it is adaptive to the unknown support as well as to the smoothness $\beta$ and to the unknown number of separable components $K$. We present efficient algorithms to compute our estimators.
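To make the discrete setting concrete, the following sketch builds a rank-$K$ probability matrix $P = \sum_k w_k\, u_k v_k^\top$, samples $n$ i.i.d. pairs from it, and smooths the empirical frequency matrix by a rank-$K$ truncated SVD. This is only an illustration of the low-rank model; the dimensions, sample size, and the plain SVD truncation are assumptions of this example, not the paper's adaptive estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not from the paper): d x d probability matrix of rank K.
d, K, n = 20, 3, 5000

# Rank-K probability matrix P = sum_k w_k * outer(u_k, v_k),
# with mixture weights w and probability vectors u_k, v_k on {1,...,d}.
w = rng.dirichlet(np.ones(K))
U = rng.dirichlet(np.ones(d), size=K)  # rows are probability vectors
V = rng.dirichlet(np.ones(d), size=K)
P = sum(w[k] * np.outer(U[k], V[k]) for k in range(K))

# Draw n i.i.d. pairs (X_i, Y_i) from P and form the empirical matrix.
idx = rng.choice(d * d, size=n, p=P.ravel())
emp = np.bincount(idx, minlength=d * d).reshape(d, d) / n

# Generic low-rank smoothing: keep the top-K singular directions of the
# empirical matrix (a simple stand-in for the adaptive estimator).
Us, s, Vt = np.linalg.svd(emp)
P_hat = Us[:, :K] @ np.diag(s[:K]) @ Vt[:K]

# Compare L1 errors of the raw counts and the rank-K projection.
err_emp = np.abs(emp - P).sum()
err_hat = np.abs(P_hat - P).sum()
```

Exploiting the rank-$K$ structure is what allows the error to scale with $K/n$ rather than with the full $d^2$ entries, mirroring the $(K/n)^{\beta/(2\beta+1)}$ rate in the continuous case.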

Cite

Text

Chhor et al. "Generalized Multi-View Model: Adaptive Density Estimation Under Low-Rank Constraints." Journal of Machine Learning Research, 2025.

Markdown

[Chhor et al. "Generalized Multi-View Model: Adaptive Density Estimation Under Low-Rank Constraints." Journal of Machine Learning Research, 2025.](https://mlanthology.org/jmlr/2025/chhor2025jmlr-generalized/)

BibTeX

@article{chhor2025jmlr-generalized,
  title     = {{Generalized Multi-View Model: Adaptive Density Estimation Under Low-Rank Constraints}},
  author    = {Chhor, Julien and Klopp, Olga and Tsybakov, Alexandre B.},
  journal   = {Journal of Machine Learning Research},
  year      = {2025},
  pages     = {1--52},
  volume    = {26},
  url       = {https://mlanthology.org/jmlr/2025/chhor2025jmlr-generalized/}
}