Gone with the Bits: Benchmarking Bias in Facial Phenotype Degradation Under Low-Rate Neural Compression
Abstract
In this study, we investigate how facial phenotypes are distorted under neural image compression and how this distortion varies across racial groups. Neural compression methods are gaining popularity due to their impressive rate-distortion performance and their ability to compress to extremely low bitrates, below 0.1 bits per pixel (bpp). Like other deep learning models, however, they are prone to acquiring bias during training, leading to unfair outcomes for individuals in different groups. By benchmarking five popular neural compression algorithms, we first demonstrate that compressing facial images to low-bitrate regimes degrades specific phenotypes (e.g., skin type). Next, we highlight the bias in this phenotype degradation across racial groups. We then show that leveraging a racially balanced dataset does not mitigate this bias. Finally, we examine the relationship between bias and the realism of reconstructed images at different bitrates.
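To make the measurement setup concrete, the minimal sketch below compresses a face image with a pretrained neural codec at its lowest-rate setting and estimates the resulting bitrate in bpp. The abstract does not name the five codecs, so this sketch assumes the CompressAI library and its bmshj2018_factorized model as a stand-in, and the input path face.png is hypothetical; the paper's actual pipeline may differ.

# Minimal sketch (not the paper's exact pipeline): compress a face image with a
# pretrained neural codec and estimate its rate in bits per pixel (bpp).
# Assumes CompressAI; the paper's five benchmarked codecs may differ.
import math

import torch
from PIL import Image
from torchvision import transforms
from compressai.zoo import bmshj2018_factorized

device = "cuda" if torch.cuda.is_available() else "cpu"

# quality=1 is the lowest-rate setting for this model family.
net = bmshj2018_factorized(quality=1, pretrained=True).eval().to(device)

img = Image.open("face.png").convert("RGB")  # hypothetical input path
x = transforms.ToTensor()(img).unsqueeze(0).to(device)

with torch.no_grad():
    out = net(x)  # returns {"x_hat": reconstruction, "likelihoods": {...}}

# Rate estimate: total negative log2-likelihood of the latents per pixel.
num_pixels = x.size(0) * x.size(2) * x.size(3)
bpp = sum(
    (torch.log(l).sum() / (-math.log(2) * num_pixels))
    for l in out["likelihoods"].values()
).item()

x_hat = out["x_hat"].clamp(0, 1)  # reconstruction for downstream phenotype checks
print(f"estimated rate: {bpp:.4f} bpp")

One could then run a phenotype predictor (e.g., for skin type) on both x and x_hat and compare the outputs per racial group to quantify the degradation disparity the abstract describes.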
Cite
Text
Qiu et al. "Gone with the Bits: Benchmarking Bias in Facial Phenotype Degradation Under Low-Rate Neural Compression." ICML 2024 Workshops: NextGenAISafety, 2024.

Markdown

[Qiu et al. "Gone with the Bits: Benchmarking Bias in Facial Phenotype Degradation Under Low-Rate Neural Compression." ICML 2024 Workshops: NextGenAISafety, 2024.](https://mlanthology.org/icmlw/2024/qiu2024icmlw-gone/)

BibTeX
@inproceedings{qiu2024icmlw-gone,
title = {{Gone with the Bits: Benchmarking Bias in Facial Phenotype Degradation Under Low-Rate Neural Compression}},
author = {Qiu, Tian and Nichani, Arjun and Tadayon, Rasta and Jeong, Haewon},
booktitle = {ICML 2024 Workshops: NextGenAISafety},
year = {2024},
url = {https://mlanthology.org/icmlw/2024/qiu2024icmlw-gone/}
}