Privacy, Utility and Fairness: Navigating Trade-Offs in Differentially Private Machine Learning

Abstract

Developing trustworthy AI requires advancing methods that meet key requirements such as privacy or fairness while maintaining strong utility, as well as understanding the intricate interdependencies between these dimensions, which often manifest as trade-offs. My PhD research focuses on differential privacy, which is widely regarded as the state-of-the-art for protecting privacy in data analysis and machine learning. I investigate the relationships between differential privacy, utility and fairness, with the goal of advancing the adoption of differentially private machine learning in real-world settings.
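As background (not part of the abstract itself), the standard (ε, δ)-differential privacy guarantee underlying this line of work can be stated as follows: a randomized mechanism \(\mathcal{M}\) satisfies (ε, δ)-differential privacy if, for all pairs of datasets \(D, D'\) differing in a single record and all sets of outcomes \(S\),

\[
\Pr[\mathcal{M}(D) \in S] \le e^{\varepsilon}\, \Pr[\mathcal{M}(D') \in S] + \delta ,
\]

where smaller ε and δ correspond to stronger privacy guarantees, typically at some cost in utility.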

Cite

Text

Demelius. "Privacy, Utility and Fairness: Navigating Trade-Offs in Differentially Private Machine Learning." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I28.35205

Markdown

[Demelius. "Privacy, Utility and Fairness: Navigating Trade-Offs in Differentially Private Machine Learning." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/demelius2025aaai-privacy/) doi:10.1609/AAAI.V39I28.35205

BibTeX

@inproceedings{demelius2025aaai-privacy,
  title     = {{Privacy, Utility and Fairness: Navigating Trade-Offs in Differentially Private Machine Learning}},
  author    = {Demelius, Lea},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2025},
  pages     = {29255--29256},
  doi       = {10.1609/AAAI.V39I28.35205},
  url       = {https://mlanthology.org/aaai/2025/demelius2025aaai-privacy/}
}