Dropout-Based Rashomon Set Exploration for Efficient Predictive Multiplicity Estimation
Abstract
Predictive multiplicity refers to the phenomenon in which classification tasks may admit multiple competing models that achieve almost-equally-optimal performance, yet generate conflicting outputs for individual samples. This presents significant concerns, as it can result in systemic exclusion, inexplicable discrimination, and unfairness in practical applications. Measuring and mitigating predictive multiplicity, however, is computationally challenging due to the need to explore all such almost-equally-optimal models, known as the Rashomon set, in potentially huge hypothesis spaces. To address this challenge, we propose a novel framework that utilizes dropout techniques for exploring models in the Rashomon set. We provide rigorous theoretical derivations to connect the dropout parameters to properties of the Rashomon set, and empirically evaluate our framework through extensive experimentation. Numerical results show that our technique consistently outperforms baselines in terms of the effectiveness of predictive multiplicity metric estimation, with runtime speedups of $20\times$ to $5000\times$. With efficient Rashomon set exploration and metric estimation, mitigation of predictive multiplicity is then achieved through dropout ensemble and model selection.
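The sketch below illustrates the general idea described in the abstract, not the authors' released implementation: dropout is applied at inference time to a trained classifier to cheaply sample perturbed models, samples whose loss exceeds the base loss by more than a Rashomon parameter $\epsilon$ are discarded, and an ambiguity-style multiplicity score (the fraction of inputs whose predicted label flips under some retained sample) is estimated. The network `DropoutMLP`, the helper `estimate_ambiguity`, and the values of the dropout rate `p`, `epsilon`, and `n_samples` are illustrative assumptions.

```python
# Minimal sketch (assumed PyTorch setup, not the paper's official code):
# Monte Carlo dropout samples near-optimal models and estimates an
# ambiguity-style predictive multiplicity metric.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DropoutMLP(nn.Module):
    """Illustrative classifier with a dropout layer used to perturb the trained model."""

    def __init__(self, d_in, d_hidden, n_classes, p=0.1):
        super().__init__()
        self.fc1 = nn.Linear(d_in, d_hidden)
        self.drop = nn.Dropout(p)
        self.fc2 = nn.Linear(d_hidden, n_classes)

    def forward(self, x):
        return self.fc2(self.drop(F.relu(self.fc1(x))))


@torch.no_grad()
def estimate_ambiguity(model, X, y, n_samples=100, epsilon=0.01):
    """Fraction of points whose prediction flips under some dropout-sampled
    model whose loss stays within `epsilon` of the base model's loss."""
    model.eval()                                   # dropout off -> deterministic base model
    base_logits = model(X)
    base_pred = base_logits.argmax(dim=1)
    base_loss = F.cross_entropy(base_logits, y)

    flipped = torch.zeros(len(X), dtype=torch.bool)
    model.train()                                  # dropout on -> stochastic model samples
    for _ in range(n_samples):
        logits = model(X)
        loss = F.cross_entropy(logits, y)
        if loss - base_loss > epsilon:             # keep only samples inside the empirical Rashomon set
            continue
        flipped |= logits.argmax(dim=1) != base_pred
    return flipped.float().mean().item()


# Usage with synthetic data (a trained model would be used in practice):
X = torch.randn(256, 20)
y = torch.randint(0, 2, (256,))
model = DropoutMLP(20, 64, 2, p=0.1)
print(f"estimated ambiguity: {estimate_ambiguity(model, X, y):.3f}")
```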
Cite
Text
Hsu et al. "Dropout-Based Rashomon Set Exploration for Efficient Predictive Multiplicity Estimation." International Conference on Learning Representations, 2024.
Markdown
[Hsu et al. "Dropout-Based Rashomon Set Exploration for Efficient Predictive Multiplicity Estimation." International Conference on Learning Representations, 2024.](https://mlanthology.org/iclr/2024/hsu2024iclr-dropoutbased/)
BibTeX
@inproceedings{hsu2024iclr-dropoutbased,
title = {{Dropout-Based Rashomon Set Exploration for Efficient Predictive Multiplicity Estimation}},
author = {Hsu, Hsiang and Li, Guihong and Hu, Shaohan and Chen, Chun-Fu},
booktitle = {International Conference on Learning Representations},
year = {2024},
url = {https://mlanthology.org/iclr/2024/hsu2024iclr-dropoutbased/}
}