Differentially Private Approximate Near Neighbor Counting in High Dimensions
Abstract
Range counting (e.g., counting the number of data points falling into a given query ball) under differential privacy has been studied extensively. However, the current algorithms for this problem are subject to the following dichotomy. One class of algorithms suffers from an additive error that is a fixed polynomial in the number of points. Another class of algorithms allows for polylogarithmic additive error, but the error grows exponentially in the dimension. To achieve the latter, the problem is relaxed to allow a “fuzzy” definition of the range boundary, e.g., a count of the points in a ball of radius $r$ might also include points in a ball of radius $cr$ for some $c>1$. In this paper we present an efficient algorithm that offers a sweet spot between these two classes. The algorithm has an additive error that is an arbitrarily small power of the data set size, depending on how fuzzy the range boundary is, as well as a small ($1+o(1)$) multiplicative error. Crucially, the amount of noise added has no dependence on the dimension. Our algorithm introduces a variant of Locality-Sensitive Hashing, utilizing it in a novel manner.
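To make the problem setup concrete, below is a minimal Python sketch (not the paper's LSH-based algorithm) contrasting the textbook Laplace-mechanism baseline for privately counting the points within distance r of a query with the "fuzzy" (approximate) relaxation, in which any answer between the count at radius r and the count at radius cr is acceptable. The function names, the privacy parameter eps, and the use of Euclidean distance are illustrative assumptions.

import numpy as np

def dp_ball_count(data, query, r, eps, rng=None):
    # Baseline eps-differentially private count of points within distance r
    # of `query`, via the Laplace mechanism. A single count query has
    # sensitivity 1, so Laplace noise with scale 1/eps suffices. This is NOT
    # the paper's algorithm, only the standard baseline it improves upon.
    rng = np.random.default_rng() if rng is None else rng
    dists = np.linalg.norm(np.asarray(data) - np.asarray(query), axis=1)
    exact = float(np.sum(dists <= r))
    return exact + rng.laplace(scale=1.0 / eps)

def approximate_count_range(data, query, r, c):
    # The "fuzzy" relaxation: any value between the number of points within
    # radius r and the number within radius c*r (for some c > 1) is a valid
    # answer, which is what allows dimension-independent error in the paper.
    dists = np.linalg.norm(np.asarray(data) - np.asarray(query), axis=1)
    return int(np.sum(dists <= r)), int(np.sum(dists <= c * r))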
Cite

Text:
Andoni et al. "Differentially Private Approximate Near Neighbor Counting in High Dimensions." Neural Information Processing Systems, 2023. https://mlanthology.org/neurips/2023/andoni2023neurips-differentially/

BibTeX:
@inproceedings{andoni2023neurips-differentially,
title = {{Differentially Private Approximate Near Neighbor Counting in High Dimensions}},
author = {Andoni, Alexandr and Indyk, Piotr and Mahabadi, Sepideh and Narayanan, Shyam},
booktitle = {Neural Information Processing Systems},
year = {2023},
url = {https://mlanthology.org/neurips/2023/andoni2023neurips-differentially/}
}