Nonconvex Optimization for Regression with Fairness Constraints

Abstract

The unfairness of a regressor is evaluated by measuring the correlation between the estimator and the sensitive attribute (e.g., race, gender, age), and the coefficient of determination (CoD) is a natural extension of the correlation coefficient when more than one sensitive attribute exists. As is well known, there is a trade-off between the fairness and accuracy of a regressor, which implies that a perfectly fair regressor does not always yield a useful prediction. Taking this into consideration, we optimize the accuracy of the estimator subject to a user-defined level of fairness. However, imposing the fairness level as a constraint makes the feasible region nonconvex, which precludes the use of off-the-shelf convex optimizers. Despite this nonconvexity, we show that an exact solution can be obtained using tools from global optimization theory. Furthermore, we propose a nonlinear extension of the method via a kernel representation. Unlike most existing fairness-aware machine learning methods, our method can handle numeric and multiple sensitive attributes.
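
The constrained problem described in the abstract can be sketched as follows. This is a minimal sketch in our own notation, not necessarily the paper's exact formulation: the prediction error is minimized subject to the CoD between the prediction and the sensitive attributes being at most a user-defined fairness level ε.

```latex
% Minimal sketch of the fairness-constrained regression described above
% (our notation; X = features, S = sensitive attributes, y = targets,
%  \epsilon = user-defined fairness level -- an assumed symbol).
\[
  \min_{\beta}\; \frac{1}{n}\,\lVert y - X\beta \rVert_2^2
  \quad\text{subject to}\quad
  \mathrm{CoD}\bigl(S,\, X\beta\bigr) \le \epsilon,
\]
% where \mathrm{CoD}(S, \hat{y}) denotes the coefficient of determination of
% the regression of the prediction \hat{y} on the sensitive attributes S.
% The constraint set is generally nonconvex in \beta, which is why an
% off-the-shelf convex optimizer cannot be applied directly.
```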

Cite

Text

Komiyama et al. "Nonconvex Optimization for Regression with Fairness Constraints." International Conference on Machine Learning, 2018.

Markdown

[Komiyama et al. "Nonconvex Optimization for Regression with Fairness Constraints." International Conference on Machine Learning, 2018.](https://mlanthology.org/icml/2018/komiyama2018icml-nonconvex/)

BibTeX

@inproceedings{komiyama2018icml-nonconvex,
  title     = {{Nonconvex Optimization for Regression with Fairness Constraints}},
  author    = {Komiyama, Junpei and Takeda, Akiko and Honda, Junya and Shimao, Hajime},
  booktitle = {International Conference on Machine Learning},
  year      = {2018},
  pages     = {2737--2746},
  volume    = {80},
  url       = {https://mlanthology.org/icml/2018/komiyama2018icml-nonconvex/}
}