Bias Unveiled: Investigating Social Bias in LLM-Generated Code
Abstract
Large language models (LLMs) have significantly advanced the field of automated code generation. However, a notable research gap exists in evaluating social biases that may be present in the code produced by LLMs. To address this gap, we propose a novel fairness framework, i.e., Solar, to assess and mitigate the social biases of LLM-generated code. Specifically, Solar can automatically generate test cases for quantitatively uncovering social biases in code auto-generated by LLMs. To quantify the severity of social biases in generated code, we develop a dataset that covers a diverse set of social problems. We applied Solar and the crafted dataset to four state-of-the-art LLMs for code generation. Our evaluation reveals severe bias in the code generated by all the subject LLMs. Furthermore, we explore several prompting strategies for mitigating bias, including Chain-of-Thought (CoT) prompting, combining positive role-playing with CoT prompting, and dialogue with Solar. Our experiments show that dialogue with Solar can effectively reduce social bias in LLM-generated code by up to 90%. Last, we make the code and data publicly available; our framework is highly extensible to evaluate new social problems.
Cite
Text
Ling et al. "Bias Unveiled: Investigating Social Bias in LLM-Generated Code." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I26.34961
Markdown
[Ling et al. "Bias Unveiled: Investigating Social Bias in LLM-Generated Code." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/ling2025aaai-bias/) doi:10.1609/AAAI.V39I26.34961
BibTeX
@inproceedings{ling2025aaai-bias,
title = {{Bias Unveiled: Investigating Social Bias in LLM-Generated Code}},
author = {Ling, Lin and Rabbi, Fazle and Wang, Song and Yang, Jinqiu},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2025},
pages = {27491--27499},
doi = {10.1609/AAAI.V39I26.34961},
url = {https://mlanthology.org/aaai/2025/ling2025aaai-bias/}
}