Technical Debt in In-Context Learning: Diminishing Efficiency in Long Context
Abstract
Transformers have demonstrated remarkable in-context learning (ICL) capabilities, adapting to new tasks by simply conditioning on demonstrations without parameter updates. Compelling empirical and theoretical evidence suggests that ICL, as a general-purpose learner, could outperform task-specific models. However, it remains unclear to what extent transformers learn optimally in context compared to principled learning algorithms. To investigate this, we employ a meta-ICL framework in which each prompt defines a distinct regression task whose target function is drawn from a hierarchical distribution, requiring inference over both the latent model class and the task-specific parameters. Within this setup, we benchmark the sample complexity of ICL against that of principled learning algorithms, including the Bayes optimal estimator, under varying performance requirements. Our findings reveal a striking dichotomy: while ICL initially matches the efficiency of the Bayes optimal estimator, its efficiency deteriorates significantly in long contexts. Through an information-theoretic analysis, we show that this diminishing efficiency is inherent to ICL. These results clarify the trade-offs of adopting ICL as a universal problem solver and motivate a new generation of on-the-fly adaptive methods that avoid the diminishing efficiency.
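The hierarchical task distribution described in the abstract can be made concrete with a small sketch. The snippet below is an illustrative assumption, not the authors' actual protocol: the two model classes, the Gaussian priors, and all function names and dimensions are hypothetical choices standing in for whatever hierarchy the paper uses. Each prompt first samples a latent model class, then task-specific parameters, and finally the in-context demonstrations that condition the learner.

```python
import numpy as np

def sample_task(rng, d=8):
    """Sample one ICL regression task from a hierarchical prior.

    Hypothetical two-level hierarchy: first draw a latent model class,
    then draw task-specific parameters for that class.
    """
    model_class = rng.choice(["linear", "quadratic"])  # latent model class
    w = rng.normal(size=d)                             # task-specific parameters
    if model_class == "linear":
        f = lambda x: x @ w
    else:
        f = lambda x: (x ** 2) @ w
    return model_class, f

def sample_prompt(rng, f, n_demos=32, d=8, noise=0.1):
    """Generate the (x, y) demonstrations that form one ICL prompt."""
    X = rng.normal(size=(n_demos, d))
    y = f(X) + noise * rng.normal(size=n_demos)
    return X, y

rng = np.random.default_rng(0)
model_class, f = sample_task(rng)
X, y = sample_prompt(rng, f)
# A transformer meta-trained across many such prompts must implicitly
# infer both the latent model class and its parameters from the
# demonstrations, which is what makes the Bayes optimal estimator a
# natural sample-complexity baseline.
```

Under a setup of this shape, the Bayes optimal baseline is the posterior predictive mean over both levels of the hierarchy, so any gap between it and ICL at a given context length can be read directly as an efficiency loss.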
Cite
Text
Joo and Klabjan. "Technical Debt in In-Context Learning: Diminishing Efficiency in Long Context." Advances in Neural Information Processing Systems, 2025.
Markdown
[Joo and Klabjan. "Technical Debt in In-Context Learning: Diminishing Efficiency in Long Context." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/joo2025neurips-technical/)
BibTeX
@inproceedings{joo2025neurips-technical,
title = {{Technical Debt in In-Context Learning: Diminishing Efficiency in Long Context}},
author = {Joo, Taejong and Klabjan, Diego},
booktitle = {Advances in Neural Information Processing Systems},
year = {2025},
url = {https://mlanthology.org/neurips/2025/joo2025neurips-technical/}
}