SPRI: Aligning Large Language Models with Context-Situated Principles

Abstract

Aligning Large Language Models to integrate and reflect human values, especially for tasks that demand intricate human oversight, is arduous, since relying on human expertise for context-specific guidance is resource-intensive and time-consuming. Prior work has utilized predefined sets of rules or principles to steer the behavior of models (Bai et al., 2022; Sun et al., 2023). However, these principles tend to be generic, making it challenging to adapt them to each individual input query or context. In this work, we present Situated-PRInciples (SPRI), a framework requiring minimal or no human effort that automatically generates guiding principles in real time for each input query and uses them to align each response. We evaluate SPRI on three tasks and show that 1) SPRI can derive principles in a complex domain-specific task, yielding performance on par with expert-crafted principles; 2) SPRI-generated principles lead to instance-specific rubrics that outperform prior LLM-as-a-judge frameworks; 3) using SPRI to generate synthetic SFT data leads to substantial improvements in truthfulness.
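The two-stage idea described in the abstract (derive query-specific principles, then condition the response on them) can be sketched as follows. This is a minimal illustration, not the paper's exact prompts or pipeline; `generate` is a stand-in for any LLM completion call and is stubbed here so the snippet runs without a model.

```python
# Illustrative SPRI-style pipeline: principles are generated per query,
# then the response is conditioned on those principles.
def generate(prompt: str) -> str:
    """Placeholder LLM call; swap in a real API (e.g., OpenAI, vLLM)."""
    if "guiding principles" in prompt.lower():
        return "1. Be factually accurate. 2. State uncertainty explicitly."
    return "Draft response that follows the principles above."

def spri_respond(query: str) -> dict:
    # Stage 1: derive context-situated principles for this specific query.
    principles = generate(
        "Write concise guiding principles for answering the query below.\n"
        f"Query: {query}\nGuiding principles:"
    )
    # Stage 2: produce a response aligned to those principles.
    response = generate(
        "Follow these principles when answering.\n"
        f"Principles: {principles}\nQuery: {query}\nAnswer:"
    )
    return {"principles": principles, "response": response}

result = spri_respond("What causes seasons on Earth?")
```

The key design point, per the abstract, is that principles are produced per instance rather than drawn from a fixed generic set.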

Cite

Text

Zhan et al. "SPRI: Aligning Large Language Models with Context-Situated Principles." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Zhan et al. "SPRI: Aligning Large Language Models with Context-Situated Principles." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/zhan2025icml-spri/)

BibTeX

@inproceedings{zhan2025icml-spri,
  title     = {{SPRI: Aligning Large Language Models with Context-Situated Principles}},
  author    = {Zhan, Hongli and Azmat, Muneeza and Horesh, Raya and Li, Junyi Jessy and Yurochkin, Mikhail},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {74370--74405},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/zhan2025icml-spri/}
}