General Staircase Mechanisms for Optimal Differential Privacy
Abstract
We derive the optimal differentially private additive noise mechanism for queries in $\mathbb{R}^d$ when sensitivity and error are defined by an arbitrary norm $||\cdot||_K$. The optimal mechanism is a generalization of the staircase mechanism, which is known to be optimal under the $\ell_1$ norm when $d \leq 2$; we extend the mechanism and its guarantee to arbitrary norms and dimensions, proving a conjecture of Geng et al. [2015] along the way. The generalized staircase mechanism we derive can be viewed as a refinement of the $K$-norm mechanism of Hardt and Talwar [2010], with improvements particularly evident in the low-privacy regime as $\epsilon \to \infty$. We show how to implement the generalized staircase mechanism efficiently, given an efficient algorithm for sampling the unit $K$-norm ball, and demonstrate that it significantly reduces error in realistic settings, including under non-standard norms that are common in practice, and across a range of error metrics.
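For intuition, the one-dimensional staircase mechanism that this work generalizes can be sampled exactly by combining a random sign, a geometric "rung" index, and a uniform offset within the rung, following the sampling procedure described by Geng and Viswanath. The sketch below is illustrative, not code from the paper; the parameter names and the default choice of the offset parameter `gamma` are assumptions.

```python
import math
import random


def staircase_sample(eps, delta, gamma=None, rng=random):
    """One draw of 1-D staircase noise (sketch after Geng & Viswanath).

    eps:   privacy parameter epsilon
    delta: query sensitivity Delta
    gamma: staircase offset in [0, 1]; defaults to 1 / (1 + e^{eps/2}),
           a commonly cited optimal choice (assumption in this sketch)
    """
    b = math.exp(-eps)
    if gamma is None:
        gamma = 1.0 / (1.0 + math.exp(eps / 2.0))
    # Random sign: the staircase density is symmetric about 0.
    s = 1 if rng.random() < 0.5 else -1
    # Geometric rung index with P(G = i) = (1 - b) * b**i via inverse CDF.
    u = 1.0 - rng.random()  # in (0, 1], avoids log(0)
    g = int(math.log(u) / math.log(b))
    # Pick the inner (width gamma*Delta) or outer part of the rung.
    p_inner = gamma / (gamma + (1.0 - gamma) * b)
    v = rng.random()
    if rng.random() < p_inner:
        x = (g + gamma * v) * delta
    else:
        x = (g + gamma + (1.0 - gamma) * v) * delta
    return s * x
```

The generalized mechanism of the paper replaces the sign and scalar magnitude with a direction drawn from the unit $K$-norm ball, which is why an efficient ball sampler suffices for an efficient implementation.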
Cite

Text:
Kulesza et al. "General Staircase Mechanisms for Optimal Differential Privacy." Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, 2025.
BibTeX:
@inproceedings{kulesza2025aistats-general,
title = {{General Staircase Mechanisms for Optimal Differential Privacy}},
author = {Kulesza, Alex and Suresh, Ananda Theertha and Wang, Yuyan},
booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
year = {2025},
  pages = {4564--4572},
volume = {258},
url = {https://mlanthology.org/aistats/2025/kulesza2025aistats-general/}
}