Metric-Fair Classifier Derandomization

Abstract

We study the problem of classifier derandomization in machine learning: given a stochastic binary classifier $f: X \to [0,1]$, sample a deterministic classifier $\hat{f}: X \to \{0,1\}$ that approximates the output of $f$ in aggregate over any data distribution. Recent work revealed how to efficiently derandomize a stochastic classifier with strong output approximation guarantees, but at the cost of individual fairness — that is, if $f$ treated similar inputs similarly, $\hat{f}$ did not. In this paper, we initiate a systematic study of classifier derandomization with metric fairness guarantees. We show that the prior derandomization approach is almost maximally metric-unfair, and that a simple “random threshold” derandomization achieves optimal fairness preservation but with weaker output approximation. We then devise a derandomization procedure that provides an appealing tradeoff between these two: if $f$ is $\alpha$-metric fair according to a metric $d$ with a locality-sensitive hash (LSH) family, then our derandomized $\hat{f}$ is, with high probability, $O(\alpha)$-metric fair and a close approximation of $f$. We also prove generic results applicable to all (fair and unfair) classifier derandomization procedures, including a bias-variance decomposition and reductions between various notions of metric fairness.
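For intuition, here is a minimal Python sketch of the two building blocks the abstract contrasts. This is an illustration under simplifying assumptions, not the paper's exact construction: `f`, `lsh`, and `num_buckets` are hypothetical placeholders, and the LSH variant only gestures at how bucket-shared randomness trades fairness for aggregate accuracy.

```python
import numpy as np

def random_threshold_derandomize(f, seed=0):
    """Derandomize a stochastic classifier f: X -> [0,1] by drawing a
    single uniform threshold shared across all inputs.

    For any x, y: Pr_t[f_hat(x) != f_hat(y)] = |f(x) - f(y)|, since the
    two decisions differ exactly when t falls between f(x) and f(y).
    So metric fairness of f is preserved in expectation, but for a fixed
    draw of t the aggregate output rate of f_hat can be far from f's.
    """
    rng = np.random.default_rng(seed)
    t = rng.uniform()  # one threshold for the whole input space
    return lambda x: int(f(x) >= t)

def lsh_bucket_derandomize(f, lsh, num_buckets=1024, seed=0):
    """Illustrative LSH-based variant (hypothetical interface): inputs
    that collide under the locality-sensitive hash `lsh` share a
    pseudo-random threshold, so nearby points, which collide with high
    probability, usually receive the same deterministic decision, while
    independent per-bucket thresholds let f_hat track f's output rate
    in aggregate. `lsh` maps an input to an integer bucket id.
    """
    rng = np.random.default_rng(seed)
    thresholds = rng.uniform(size=num_buckets)  # one threshold per bucket
    return lambda x: int(f(x) >= thresholds[lsh(x) % num_buckets])
```

The sketch mirrors the tradeoff described above: a single shared threshold is maximally fairness-preserving but a weak aggregate approximation, whereas hashing the threshold by LSH bucket recovers aggregate accuracy while only degrading metric fairness by the (bounded) probability that similar inputs land in different buckets.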

Cite

Text

Wu et al. "Metric-Fair Classifier Derandomization." International Conference on Machine Learning, 2022.

Markdown

[Wu et al. "Metric-Fair Classifier Derandomization." International Conference on Machine Learning, 2022.](https://mlanthology.org/icml/2022/wu2022icml-metricfair/)

BibTeX

@inproceedings{wu2022icml-metricfair,
  title     = {{Metric-Fair Classifier Derandomization}},
  author    = {Wu, Jimmy and Chen, Yatong and Liu, Yang},
  booktitle = {International Conference on Machine Learning},
  year      = {2022},
  pages     = {23999--24016},
  volume    = {162},
  url       = {https://mlanthology.org/icml/2022/wu2022icml-metricfair/}
}