Locally Private Estimation with Public Features

Abstract

We initiate the study of locally differentially private (LDP) learning with public features. We define semi-feature LDP, where some features are publicly available while the remaining ones, together with the label, require protection under local differential privacy. Under semi-feature LDP, we show that the minimax rate for non-parametric regression is significantly improved over that of classical LDP. We then propose HistOfTree, an estimator that fully leverages the information contained in both public and private features. Theoretically, HistOfTree attains the minimax optimal convergence rate; empirically, it achieves superior performance on both synthetic and real data. We also explore scenarios where users can manually select which features to protect. For this setting, we propose an estimator and a data-driven parameter tuning strategy, yielding analogous theoretical and empirical results.
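
To make the setting concrete, the sketch below illustrates a simplified semi-feature LDP regression scheme, not the paper's HistOfTree estimator: each user releases the public feature in the clear and perturbs only the private label with Laplace noise calibrated to a local privacy budget, and the server then partitions on the public feature and averages the privatized labels within each cell. The synthetic data, the budget epsilon, the bin count, and all variable names are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: x_pub is a public feature in [0, 1]; the label y in [0, 1] is private.
n = 10_000
x_pub = rng.uniform(0.0, 1.0, size=n)
y = np.clip(0.25 * np.sin(2 * np.pi * x_pub) + 0.5 + rng.normal(0.0, 0.05, size=n), 0.0, 1.0)

epsilon = 1.0   # local privacy budget per user (illustrative choice)
n_bins = 16     # histogram cells on the public feature (illustrative choice)

# User side: release y + Laplace(1/epsilon) noise. Since y lies in [0, 1],
# the sensitivity is 1, so this satisfies epsilon-LDP for the label.
# The public feature x_pub is released without perturbation.
y_priv = y + rng.laplace(scale=1.0 / epsilon, size=n)

# Server side: partition on the public feature and average privatized labels per cell.
edges = np.linspace(0.0, 1.0, n_bins + 1)
cell = np.clip(np.digitize(x_pub, edges) - 1, 0, n_bins - 1)
est = np.array([
    y_priv[cell == b].mean() if np.any(cell == b) else 0.5
    for b in range(n_bins)
])

def predict(x_new):
    # Piecewise-constant prediction: look up the cell of the public feature.
    b = np.clip(np.digitize(x_new, edges) - 1, 0, n_bins - 1)
    return est[b]

print(predict(np.array([0.1, 0.5, 0.9])))

Because the noise is zero-mean and independent across users, the per-cell averages remain consistent estimates of the regression function; partitioning on the public feature costs no privacy budget, which is the intuition behind the improved rate.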

Cite

Text

Ma et al. "Locally Private Estimation with Public Features." Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, 2025.

Markdown

[Ma et al. "Locally Private Estimation with Public Features." Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, 2025.](https://mlanthology.org/aistats/2025/ma2025aistats-locally/)

BibTeX

@inproceedings{ma2025aistats-locally,
  title     = {{Locally Private Estimation with Public Features}},
  author    = {Ma, Yuheng and Jia, Ke and Yang, Hanfang},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  year      = {2025},
  pages     = {55-63},
  volume    = {258},
  url       = {https://mlanthology.org/aistats/2025/ma2025aistats-locally/}
}