When Privacy Meets Partial Information: A Refined Analysis of Differentially Private Bandits

Abstract

We study the problem of multi-armed bandits with ε-global Differential Privacy (DP). First, we prove minimax and problem-dependent regret lower bounds for stochastic and linear bandits that quantify the hardness of bandits with ε-global DP. These bounds suggest the existence of two hardness regimes depending on the privacy budget ε. In the high-privacy regime (small ε), the hardness depends on a coupled effect of privacy and partial information about the reward distributions. In the low-privacy regime (large ε), bandits with ε-global DP are no harder than bandits without privacy. For stochastic bandits, we further propose a generic framework to design a near-optimal ε-global DP extension of an index-based optimistic bandit algorithm. The framework consists of three ingredients: the Laplace mechanism, arm-dependent adaptive episodes, and the use of only the rewards collected in the last episode for computing private statistics. Specifically, we instantiate ε-global DP extensions of the UCB and KL-UCB algorithms, namely AdaP-UCB and AdaP-KLUCB. AdaP-KLUCB is the first algorithm that both satisfies ε-global DP and yields a regret upper bound that matches the problem-dependent lower bound up to multiplicative constants.
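To make the three ingredients concrete, here is a minimal sketch of an AdaP-UCB-style loop, written from the abstract's description rather than the paper's pseudocode: the confidence and privacy bonus constants, the `bandit` list of reward samplers, and the `epsilon`/`horizon` parameters are all illustrative assumptions, and rewards are assumed to lie in [0, 1].

```python
import numpy as np

def adap_ucb_sketch(bandit, epsilon, horizon, rng=None):
    """Illustrative AdaP-UCB-style sketch (placeholder constants, not the paper's exact algorithm).

    Ingredients from the abstract:
      (1) Laplace mechanism on per-episode empirical means,
      (2) arm-dependent adaptive (doubling) episodes,
      (3) private statistics computed from the last episode's rewards only.
    """
    rng = rng or np.random.default_rng()
    K = len(bandit)                          # bandit: list of callables returning rewards in [0, 1]
    t = 0
    private_mean = np.zeros(K)               # Laplace-noised mean of each arm's last episode
    episode_len = np.ones(K, dtype=int)      # number of samples in each arm's last episode

    # Initialization: one episode of length 1 per arm.
    for a in range(K):
        r = bandit[a](); t += 1
        private_mean[a] = r + rng.laplace(scale=1.0 / epsilon)

    while t < horizon:
        # Optimistic index: private mean + confidence width + extra width for the Laplace noise.
        bonus = np.sqrt(np.log(t + 1) / episode_len)            # placeholder confidence term
        priv_bonus = np.log(t + 1) / (epsilon * episode_len)    # placeholder privacy term
        a = int(np.argmax(private_mean + bonus + priv_bonus))

        # New episode for arm a: double its episode length (adaptive, arm-dependent episodes).
        new_len = min(2 * episode_len[a], horizon - t)
        rewards = [bandit[a]() for _ in range(new_len)]
        t += new_len

        # Forget older samples: privatize only the last episode's mean.
        # The mean of m rewards in [0, 1] has sensitivity 1/m, so Laplace scale 1/(eps*m) gives eps-DP.
        private_mean[a] = np.mean(rewards) + rng.laplace(scale=1.0 / (epsilon * new_len))
        episode_len[a] = new_len

    return private_mean
```

Because each reward is used in exactly one private statistic, the per-round privacy losses do not accumulate across episodes, which is what allows the noise scale to shrink as episodes grow.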

Cite

Text

Azize and Basu. "When Privacy Meets Partial Information: A Refined Analysis of Differentially Private Bandits." Neural Information Processing Systems, 2022.

Markdown

[Azize and Basu. "When Privacy Meets Partial Information: A Refined Analysis of Differentially Private Bandits." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/azize2022neurips-privacy/)

BibTeX

@inproceedings{azize2022neurips-privacy,
  title     = {{When Privacy Meets Partial Information: A Refined Analysis of Differentially Private Bandits}},
  author    = {Azize, Achraf and Basu, Debabrota},
  booktitle = {Neural Information Processing Systems},
  year      = {2022},
  url       = {https://mlanthology.org/neurips/2022/azize2022neurips-privacy/}
}