DSPy: Compiling Declarative Language Model Calls into Self-Improving Pipelines
Abstract
The ML community is rapidly exploring techniques for prompting language models (LMs), but existing LM pipelines often rely on hard-coded “prompt templates” discovered via trial and error. We introduce DSPy, a programming model that abstracts LM pipelines as imperative computation graphs where LMs are invoked through declarative modules. DSPy modules are parameterized so they can learn to apply compositions of prompting, finetuning, augmentation, and reasoning techniques. We design a compiler that will optimize any DSPy pipeline to maximize a given metric. We conduct two case studies and show that a few lines of DSPy allow GPT-3.5 and llama2-13b-chat to self-bootstrap pipelines that outperform standard few-shot prompting and pipelines with expert-created demonstrations.
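The abstract's core idea (declarative modules whose demonstrations are learnable parameters, plus a compiler that bootstraps those demonstrations against a metric) can be illustrated with a toy sketch. This is not the real DSPy API; `toy_lm`, `ReverseModule`, and `compile_module` are all hypothetical stand-ins invented here to show the pattern under simplified assumptions.

```python
# Hypothetical sketch of the DSPy pattern, NOT the real DSPy API:
# a module with a fixed signature (question -> answer) whose few-shot
# demonstrations are parameters, and a tiny "compiler" that bootstraps
# demonstrations by keeping only traces that a metric accepts.

def toy_lm(prompt: str) -> str:
    """Stand-in for an LM on a reverse-and-uppercase task. With at least
    one in-context demonstration it 'generalizes' correctly; zero-shot
    it makes a systematic mistake (uppercases but forgets to reverse)."""
    *demos, last = prompt.split("Q: ")
    question = last.removesuffix("\nA:")
    return question[::-1].upper() if len(demos) > 1 else question.upper()

class ReverseModule:
    """Declarative module: fixed signature, learned demonstrations."""
    def __init__(self):
        self.demos = []  # parameters the compiler can set

    def __call__(self, question: str) -> str:
        prompt = "".join(f"Q: {q}\nA: {a}\n" for q, a in self.demos)
        return toy_lm(prompt + f"Q: {question}\nA:")

def compile_module(module, trainset, metric, max_demos=2):
    """Bootstrap demonstrations: run a teacher (here, the same module
    prompted with the labeled trainset) and keep only the input/output
    traces that pass the metric."""
    teacher = ReverseModule()
    teacher.demos = list(trainset)
    for question, gold in trainset:
        prediction = teacher(question)
        if metric(prediction, gold) and len(module.demos) < max_demos:
            module.demos.append((question, prediction))
    return module

# Zero-shot, the module exhibits the LM's failure mode; after "compiling"
# against an exact-match metric, the bootstrapped demos fix its behavior.
trainset = [("abc", "CBA"), ("dog", "GOD")]
compiled = compile_module(ReverseModule(), trainset, lambda p, g: p == g)
```

The real DSPy compiler works over multi-stage pipelines and richer strategies (chain of thought, finetuning, augmentation), but the shape is the same: modules expose parameters, and an optimizer sets them to maximize the metric.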
Cite
Text
Khattab et al. "DSPy: Compiling Declarative Language Model Calls into Self-Improving Pipelines." NeurIPS 2023 Workshops: R0-FoMo, 2023.
Markdown
[Khattab et al. "DSPy: Compiling Declarative Language Model Calls into Self-Improving Pipelines." NeurIPS 2023 Workshops: R0-FoMo, 2023.](https://mlanthology.org/neuripsw/2023/khattab2023neuripsw-dspy/)
BibTeX
@inproceedings{khattab2023neuripsw-dspy,
  title     = {{DSPy: Compiling Declarative Language Model Calls into Self-Improving Pipelines}},
  author    = {Khattab, Omar and Singhvi, Arnav and Maheshwari, Paridhi and Zhang, Zhiyuan and Santhanam, Keshav and A, Sri Vardhamanan and Haq, Saiful and Sharma, Ashutosh and Joshi, Thomas T. and Moazam, Hanna and Miller, Heather and Zaharia, Matei and Potts, Christopher},
  booktitle = {NeurIPS 2023 Workshops: R0-FoMo},
  year      = {2023},
  url       = {https://mlanthology.org/neuripsw/2023/khattab2023neuripsw-dspy/}
}