Compositional Exemplars for In-Context Learning
Abstract
Large pretrained language models (LMs) have shown impressive In-Context Learning (ICL) ability, where the model learns to do an unseen task simply by conditioning on a prompt consisting of input-output examples as demonstrations, without any parameter updates. The performance of ICL, however, is largely determined by the quality of the selected in-context examples, and previous selection methods are mostly based on simple heuristics, leading to sub-optimal performance. In this work, we systematically formulate in-context example selection as a subset selection problem and optimize it in an end-to-end fashion. We propose CEIL (Compositional Exemplars for In-context Learning), which is instantiated with Determinantal Point Processes (DPPs) to model the interaction between the given input and in-context examples, and optimized through carefully designed contrastive learning to obtain preferences from LMs. We validate CEIL on 12 classification and generation datasets from 7 distinct NLP tasks, including sentiment analysis, paraphrase detection, natural language inference, commonsense reasoning, open-domain question answering, code generation, and semantic parsing. Extensive experiments demonstrate the effectiveness, transferability, and compositionality of CEIL, shedding new light on in-context learning. Our code is released at https://github.com/HKUNLP/icl-ceil.
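To make the DPP-based subset view concrete, the Python sketch below shows the general recipe: build a kernel that combines each candidate exemplar's relevance to the test input with pairwise similarity among exemplars, then run greedy MAP inference to pick a subset that balances relevance and diversity. This is a minimal illustration under assumed choices, not the released CEIL implementation: the names `build_kernel` and `greedy_dpp_map`, the cosine-similarity kernel, and the exponential relevance weighting are all assumptions made for the sketch, whereas CEIL learns the kernel end-to-end with a contrastively trained retriever (see the linked repository).

import numpy as np

def build_kernel(exemplar_embs: np.ndarray, query_emb: np.ndarray) -> np.ndarray:
    """Quality-diversity DPP kernel (assumed form): L_ij = q_i * s_ij * q_j, where
    q_i is the relevance of exemplar i to the test input and s_ij is exemplar similarity."""
    E = exemplar_embs / np.linalg.norm(exemplar_embs, axis=1, keepdims=True)
    q = np.exp(E @ (query_emb / np.linalg.norm(query_emb)))  # positive relevance scores
    S = E @ E.T                                              # cosine similarity between exemplars
    return q[:, None] * S * q[None, :]

def greedy_dpp_map(kernel: np.ndarray, k: int) -> list[int]:
    """Greedy MAP inference: repeatedly add the exemplar that most increases
    the log-determinant of the selected submatrix (relevance plus diversity)."""
    selected: list[int] = []
    candidates = list(range(kernel.shape[0]))
    for _ in range(k):
        best_i, best_gain = candidates[0], -np.inf
        for i in candidates:
            idx = selected + [i]
            sign, logdet = np.linalg.slogdet(kernel[np.ix_(idx, idx)])
            gain = logdet if sign > 0 else -np.inf  # guard against non-PSD submatrices
            if gain > best_gain:
                best_i, best_gain = i, gain
        selected.append(best_i)
        candidates.remove(best_i)
    return selected

# Toy usage: pick 4 of 100 candidate exemplars for one test input.
rng = np.random.default_rng(0)
exemplar_embs = rng.normal(size=(100, 32))
query_emb = rng.normal(size=32)
print(greedy_dpp_map(build_kernel(exemplar_embs, query_emb), k=4))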
Cite
Text
Ye et al. "Compositional Exemplars for In-Context Learning." International Conference on Machine Learning, 2023.
Markdown
[Ye et al. "Compositional Exemplars for In-Context Learning." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/ye2023icml-compositional/)
BibTeX
@inproceedings{ye2023icml-compositional,
title = {{Compositional Exemplars for In-Context Learning}},
author = {Ye, Jiacheng and Wu, Zhiyong and Feng, Jiangtao and Yu, Tao and Kong, Lingpeng},
booktitle = {International Conference on Machine Learning},
year = {2023},
pages = {39818-39833},
volume = {202},
url = {https://mlanthology.org/icml/2023/ye2023icml-compositional/}
}