Algorithmic Aspects of the Log-Laplace Transform and a Non-Euclidean Proximal Sampler
Abstract
The development of efficient sampling algorithms catering to non-Euclidean geometries has been a challenging endeavor, as discretization techniques that succeed in the Euclidean setting do not readily carry over to more general settings. We develop a non-Euclidean analog of the recent proximal sampler of [LST21], which naturally induces regularization by an object known as the log-Laplace transform (LLT) of a density. We prove new mathematical properties (with an algorithmic flavor) of the LLT, such as strong convexity-smoothness duality and an isoperimetric inequality, which we use to establish a mixing time bound for our proximal sampler matching that of [LST21] under a warm start. As our main application, we show that our warm-started sampler improves the value oracle complexity of differentially private convex optimization in $\ell_p$ and Schatten-$p$ norms for $p \in [1, 2]$ to match the Euclidean setting [GLL22], while retaining state-of-the-art excess risk bounds [GLLST23]. We view our investigation of the LLT as a promising proof of concept of its utility as a tool for designing samplers, and outline directions for future exploration.
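For orientation, the following is a minimal sketch of the central object, assuming the standard log-partition-function definition of the LLT; the notation below is ours and may differ from the paper's precise setup.

% A minimal sketch, assuming the standard definition: given a potential
% f : R^d -> R with associated density pi(x) proportional to exp(-f(x)),
% the log-Laplace transform of f is the log-partition function of the
% exponentially tilted density.
\[
  \mathcal{L}_f(\theta) \;:=\; \log \int_{\mathbb{R}^d}
    \exp\bigl(\langle \theta, x \rangle - f(x)\bigr)\,\mathrm{d}x .
\]
% The duality referenced in the abstract is, informally: if f is strongly
% convex with respect to a norm \|\cdot\|, then \mathcal{L}_f is smooth
% with respect to the dual norm \|\cdot\|_*, and conversely (constants
% omitted here; see the paper for the precise statement).

As a sanity check on this definition, taking $f$ quadratic yields a quadratic $\mathcal{L}_f$ (up to an additive constant), consistent with the Gaussian regularization underlying the Euclidean proximal sampler of [LST21].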
Cite

Text

Gopi et al. "Algorithmic Aspects of the Log-Laplace Transform and a Non-Euclidean Proximal Sampler." Conference on Learning Theory, 2023.

Markdown

[Gopi et al. "Algorithmic Aspects of the Log-Laplace Transform and a Non-Euclidean Proximal Sampler." Conference on Learning Theory, 2023.](https://mlanthology.org/colt/2023/gopi2023colt-algorithmic/)

BibTeX
@inproceedings{gopi2023colt-algorithmic,
  title = {{Algorithmic Aspects of the Log-Laplace Transform and a Non-Euclidean Proximal Sampler}},
  author = {Gopi, Sivakanth and Lee, Yin Tat and Liu, Daogao and Shen, Ruoqi and Tian, Kevin},
  booktitle = {Conference on Learning Theory},
  year = {2023},
  pages = {2399--2439},
  volume = {195},
  url = {https://mlanthology.org/colt/2023/gopi2023colt-algorithmic/}
}