Incentivizing Recourse Through Auditing in Strategic Classification

Abstract

The increasing automation of high-stakes decisions with direct impact on the lives and well-being of individuals raises a number of important considerations. Prominent among these is strategic behavior by individuals hoping to achieve a more desirable outcome. Two forms of such behavior are commonly studied: 1) misreporting of individual attributes, and 2) recourse, or actions that truly change such attributes. The former involves deception, and is inherently undesirable, whereas the latter may well be a desirable goal insofar as it changes true individual qualification. We study misreporting and recourse as strategic choices by individuals within a unified framework. In particular, we propose auditing as a means to incentivize recourse actions over attribute manipulation, and characterize optimal audit policies for two types of principals, utility-maximizing and recourse-maximizing. Additionally, we consider subsidies as an incentive for recourse over manipulation, and show that even a utility-maximizing principal would be willing to devote a considerable amount of audit budget to providing such subsidies. Finally, we consider the problem of optimizing fines for failed audits, and bound the total cost incurred by the population as a result of audits.
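The abstract's central incentive can be illustrated with a toy model. This is a hedged sketch, not the paper's actual formulation: an individual weighs recourse (truly changing their attributes, at a higher cost) against manipulation (misreporting, cheaper but punished by a fine if audited). All function names, parameters, and numbers below are illustrative assumptions.

```python
# Toy sketch (illustrative only, not the paper's model): an individual chooses
# between recourse, manipulation, and inaction, given the principal's audit
# probability p_audit and the fine charged on a failed audit.

def best_response(benefit, cost_recourse, cost_manip, p_audit, fine):
    """Return the action maximizing the individual's expected utility."""
    u_recourse = benefit - cost_recourse  # genuine qualification, never fined
    # Manipulation passes only if not audited; an audit forfeits the benefit
    # and incurs the fine.
    u_manip = (1 - p_audit) * benefit - p_audit * fine - cost_manip
    u_nothing = 0.0
    options = {"recourse": u_recourse, "manipulate": u_manip, "nothing": u_nothing}
    return max(options, key=options.get)

def min_audit_rate(benefit, cost_recourse, cost_manip, fine):
    """Smallest audit probability at which recourse weakly beats manipulation:
    solve benefit - c_r >= (1 - p) * benefit - p * fine - c_m for p."""
    return (cost_recourse - cost_manip) / (benefit + fine)
```

For example, with `benefit=10`, `cost_recourse=6`, `cost_manip=1`, and `fine=5`, the threshold audit rate is `(6 - 1) / (10 + 5) ≈ 0.33`: below it the individual manipulates, above it recourse becomes the best response. Raising the fine lowers the audit budget needed, which is the kind of trade-off the paper's optimal audit and fine policies formalize.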

Cite

Text

Estornell et al. "Incentivizing Recourse Through Auditing in Strategic Classification." International Joint Conference on Artificial Intelligence, 2023. doi:10.24963/IJCAI.2023/45

Markdown

[Estornell et al. "Incentivizing Recourse Through Auditing in Strategic Classification." International Joint Conference on Artificial Intelligence, 2023.](https://mlanthology.org/ijcai/2023/estornell2023ijcai-incentivizing/) doi:10.24963/IJCAI.2023/45

BibTeX

@inproceedings{estornell2023ijcai-incentivizing,
  title     = {{Incentivizing Recourse Through Auditing in Strategic Classification}},
  author    = {Estornell, Andrew and Chen, Yatong and Das, Sanmay and Liu, Yang and Vorobeychik, Yevgeniy},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2023},
  pages     = {400--408},
  doi       = {10.24963/IJCAI.2023/45},
  url       = {https://mlanthology.org/ijcai/2023/estornell2023ijcai-incentivizing/}
}