Simple and Near-Optimal Algorithms for Hidden Stratification and Multi-Group Learning
Abstract
Multi-group agnostic learning is a formal learning criterion that is concerned with the conditional risks of predictors within subgroups of a population. The criterion addresses recent practical concerns such as subgroup fairness and hidden stratification. This paper studies the structure of solutions to the multi-group learning problem, and provides simple and near-optimal algorithms for the learning problem.
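To make the group-conditional risk notion concrete, here is a minimal sketch of computing a predictor's empirical risk within each (possibly overlapping) subgroup. All names and the zero-one loss are illustrative assumptions, not the paper's notation or algorithm.

```python
import numpy as np

def group_conditional_risks(predictions, labels, group_masks, loss):
    """Empirical risk of a fixed predictor restricted to each subgroup.

    predictions, labels : arrays of shape (n,)
    group_masks         : dict mapping group name -> boolean mask of shape (n,)
    loss                : vectorized per-example loss, e.g. zero-one loss
    """
    per_example_loss = loss(predictions, labels)
    risks = {}
    for name, mask in group_masks.items():
        if mask.sum() == 0:
            continue  # no examples observed from this subgroup
        risks[name] = per_example_loss[mask].mean()
    return risks

# Example with zero-one loss and two overlapping subgroups (hypothetical data).
zero_one = lambda yhat, y: (yhat != y).astype(float)
preds  = np.array([1, 0, 1, 1, 0])
labels = np.array([1, 1, 1, 0, 0])
masks  = {"group_A": np.array([True, True, False, True, False]),
          "group_B": np.array([False, True, True, True, True])}
print(group_conditional_risks(preds, labels, masks, zero_one))
```

The multi-group criterion asks that each of these group-conditional risks be small relative to the best achievable within that group, rather than only controlling the overall average risk.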
Cite
Text
Tosh and Hsu. "Simple and Near-Optimal Algorithms for Hidden Stratification and Multi-Group Learning." International Conference on Machine Learning, 2022.
Markdown
[Tosh and Hsu. "Simple and Near-Optimal Algorithms for Hidden Stratification and Multi-Group Learning." International Conference on Machine Learning, 2022.](https://mlanthology.org/icml/2022/tosh2022icml-simple/)
BibTeX
@inproceedings{tosh2022icml-simple,
title = {{Simple and Near-Optimal Algorithms for Hidden Stratification and Multi-Group Learning}},
author = {Tosh, Christopher J and Hsu, Daniel},
booktitle = {International Conference on Machine Learning},
year = {2022},
pages = {21633--21657},
volume = {162},
url = {https://mlanthology.org/icml/2022/tosh2022icml-simple/}
}