A New Paradigm for Counterfactual Reasoning in Fairness and Recourse

Abstract

Federated stochastic bilevel optimization has been actively studied in recent years due to its widespread applications in machine learning. However, most existing federated stochastic bilevel optimization algorithms require the computation of second-order Hessian and Jacobian matrices, which leads to long running times in practice. To address this challenge, we propose a novel federated stochastic variance-reduced bilevel gradient descent algorithm that relies solely on first-order oracles; by avoiding second-order matrix computations, it significantly reduces running time. Furthermore, we introduce a novel learning-rate mechanism, a constant single-time-scale learning rate, to coordinate the updates of the different variables, and we present a new strategy for establishing the convergence rate of our algorithm. Finally, extensive experimental results confirm the efficacy of the proposed algorithm.

Cite

Text

Bynum et al. "A New Paradigm for Counterfactual Reasoning in Fairness and Recourse." International Joint Conference on Artificial Intelligence, 2024. doi:10.24963/ijcai.2024/784

Markdown

[Bynum et al. "A New Paradigm for Counterfactual Reasoning in Fairness and Recourse." International Joint Conference on Artificial Intelligence, 2024.](https://mlanthology.org/ijcai/2024/bynum2024ijcai-new/) doi:10.24963/ijcai.2024/784

BibTeX

@inproceedings{bynum2024ijcai-new,
  title     = {{A New Paradigm for Counterfactual Reasoning in Fairness and Recourse}},
  author    = {Bynum, Lucius E. J. and Loftus, Joshua R. and Stoyanovich, Julia},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2024},
  pages     = {7092--7100},
  doi       = {10.24963/ijcai.2024/784},
  url       = {https://mlanthology.org/ijcai/2024/bynum2024ijcai-new/}
}