Computationally Efficient Reductions Between Some Statistical Models
Abstract
We study the problem of approximately transforming a sample from a source statistical model to a sample from a target statistical model without knowing the parameters of the source model, and construct several such computationally efficient reductions between canonical statistical experiments. In particular, we provide computationally efficient procedures that approximately reduce uniform, Erlang, and Laplace location models to general target families. We illustrate our methodology by establishing nonasymptotic reductions between some canonical high-dimensional problems, spanning mixtures of experts, phase retrieval, and signal denoising. Notably, the reductions are structure-preserving and can accommodate missing data. We also point to a possible application in transforming one differentially private mechanism into another.
Cite
Text
Lou et al. "Computationally Efficient Reductions Between Some Statistical Models." Proceedings of The 36th International Conference on Algorithmic Learning Theory, 2025.
Markdown
[Lou et al. "Computationally Efficient Reductions Between Some Statistical Models." Proceedings of The 36th International Conference on Algorithmic Learning Theory, 2025.](https://mlanthology.org/alt/2025/lou2025alt-computationally/)
BibTeX
@inproceedings{lou2025alt-computationally,
  title     = {{Computationally Efficient Reductions Between Some Statistical Models}},
  author    = {Lou, Mengqi and Bresler, Guy and Pananjady, Ashwin},
  booktitle = {Proceedings of The 36th International Conference on Algorithmic Learning Theory},
  year      = {2025},
  pages     = {771-771},
  volume    = {272},
  url       = {https://mlanthology.org/alt/2025/lou2025alt-computationally/}
}