Near-Minimax Recursive Density Estimation on the Binary Hypercube
Abstract
This paper describes a recursive estimation procedure for multivariate binary densities using orthogonal expansions. For $d$ covariates, there are $2^d$ basis coefficients to estimate, which renders conventional approaches computationally prohibitive when $d$ is large. However, for a wide class of densities that satisfy a certain sparsity condition, our estimator runs in probabilistic polynomial time and adapts to the unknown sparsity of the underlying density in two key ways: (1) it attains near-minimax mean-squared error, and (2) the computational complexity is lower for sparser densities. Our method also allows for flexible control of the trade-off between mean-squared error and computational complexity.
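To make the setup concrete, below is a minimal brute-force sketch of orthogonal-expansion density estimation on $\{0,1\}^d$, assuming the Walsh basis and a simple hard-thresholding rule. This is not the paper's recursive, sparsity-adaptive procedure: it enumerates all $2^d$ coefficients, which is exactly the cost the paper's method avoids for sparse densities. The function names (`walsh_coefficients`, `thresholded_density`) and the threshold `tau` are illustrative choices, not the authors' notation.

```python
# Brute-force Walsh-expansion density estimate on {0,1}^d with hard
# thresholding of the empirical coefficients. Enumerating all 2^d subsets
# is only feasible for small d; avoiding that enumeration is the point of
# the paper's recursive, sparsity-adaptive procedure (not reproduced here).
import itertools

import numpy as np


def walsh_coefficients(samples, d):
    """Empirical coefficients theta_S = (1/n) sum_j prod_{i in S} (-1)^{x_ji}."""
    signs = 1 - 2 * samples  # map {0,1} -> {+1,-1}; shape (n, d)
    return {
        S: np.prod(signs[:, list(S)], axis=1).mean()  # equals 1.0 for S = ()
        for r in range(d + 1)
        for S in itertools.combinations(range(d), r)
    }


def thresholded_density(samples, d, tau):
    """Keep coefficients with |theta_S| >= tau (always keeping S = ()) and
    return the signed estimate f(x) = 2^{-d} * sum_S theta_S * chi_S(x)."""
    kept = {
        S: t for S, t in walsh_coefficients(samples, d).items()
        if not S or abs(t) >= tau
    }
    pmf = {}
    for x in itertools.product((0, 1), repeat=d):
        sx = 1 - 2 * np.array(x)
        pmf[x] = sum(t * np.prod(sx[list(S)]) for S, t in kept.items()) / 2 ** d
    return pmf


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 2000, 4
    # A sparse ground truth: independent biased bits, so low-order
    # coefficients dominate the expansion.
    samples = (rng.random((n, d)) < 0.8).astype(int)
    est = thresholded_density(samples, d, tau=0.05)
    print(sum(est.values()))  # total mass equals theta_() = 1.0
```

Note that the thresholded estimate is a signed function and may take negative values; this is harmless for mean-squared-error comparisons, though a further projection step would be needed to obtain a proper probability mass function.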
Cite
Text
Raginsky et al. "Near-Minimax Recursive Density Estimation on the Binary Hypercube." Neural Information Processing Systems, 2008.
Markdown
[Raginsky et al. "Near-Minimax Recursive Density Estimation on the Binary Hypercube." Neural Information Processing Systems, 2008.](https://mlanthology.org/neurips/2008/raginsky2008neurips-nearminimax/)
BibTeX
@inproceedings{raginsky2008neurips-nearminimax,
  title = {{Near-Minimax Recursive Density Estimation on the Binary Hypercube}},
  author = {Raginsky, Maxim and Lazebnik, Svetlana and Willett, Rebecca and Silva, Jorge},
  booktitle = {Neural Information Processing Systems},
  year = {2008},
  pages = {1305-1312},
  url = {https://mlanthology.org/neurips/2008/raginsky2008neurips-nearminimax/}
}