Justice or Prejudice? Quantifying Biases in LLM-as-a-Judge

Abstract

LLM-as-a-Judge has been widely adopted as an evaluation method in various benchmarks and used to provide supervised reward signals in model training. However, despite its strong performance in many domains, its potential issues remain under-explored, undermining its reliability and the scope of its utility. Therefore, we identify 12 key potential biases and propose CALM, a new automated bias quantification framework that systematically quantifies and analyzes each type of bias in LLM-as-a-Judge through automated, principle-guided modification. Our experiments cover multiple popular language models, and the results indicate that while advanced models achieve commendable overall performance, significant biases persist in certain specific tasks. Empirical results suggest that there remains room for improvement in the reliability of LLM-as-a-Judge. Moreover, we discuss the explicit and implicit influence of these biases and offer suggestions for the reliable application of LLM-as-a-Judge. Our work highlights the need for stakeholders to address these issues and reminds users to exercise caution in LLM-as-a-Judge applications.

Cite

Text

Ye et al. "Justice or Prejudice? Quantifying Biases in LLM-as-a-Judge." International Conference on Learning Representations, 2025.

Markdown

[Ye et al. "Justice or Prejudice? Quantifying Biases in LLM-as-a-Judge." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/ye2025iclr-justice/)

BibTeX

@inproceedings{ye2025iclr-justice,
  title     = {{Justice or Prejudice? Quantifying Biases in LLM-as-a-Judge}},
  author    = {Ye, Jiayi and Wang, Yanbo and Huang, Yue and Chen, Dongping and Zhang, Qihui and Moniz, Nuno and Gao, Tian and Geyer, Werner and Huang, Chao and Chen, Pin-Yu and Chawla, Nitesh V and Zhang, Xiangliang},
  booktitle = {International Conference on Learning Representations},
  year      = {2025},
  url       = {https://mlanthology.org/iclr/2025/ye2025iclr-justice/}
}