Iterative Vectors: In-Context Gradient Steering Without Backpropagation
Abstract
In-context learning has become a standard approach for adapting language models to new tasks. However, selecting and processing suitable demonstration examples can be challenging and time-consuming, especially when many demonstrations are available. We propose Iterative Vectors (IVs), a technique that explores activation space to enhance in-context performance by simulating gradient updates during inference. IVs extract and iteratively refine activation-based meta-gradients, then apply them at inference time without requiring backpropagation at any stage. We evaluate IVs on a range of tasks with four popular models and observe significant improvements. Our findings suggest that in-context activation steering is a promising direction, opening new avenues for future research.
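To make the idea of inference-time activation steering concrete, here is a minimal sketch. It does not reproduce the paper's IV extraction or iterative refinement procedure (neither is described in this abstract); it only shows the general mechanism of adding a precomputed vector to one transformer layer's hidden states during a forward pass, with no backpropagation. The model choice (GPT-2), the layer index, the steering strength, and the random vector standing in for an extracted IV are all assumptions made for illustration.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Sketch of activation steering at inference time (illustrative only).
model_name = "gpt2"  # any causal LM; chosen only for this sketch
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

layer_idx = 6   # which transformer block to steer (assumption)
alpha = 4.0     # steering strength (assumption)
# Placeholder for an extracted steering vector; the paper would derive
# this from demonstration activations rather than sample it randomly.
steering_vector = torch.randn(model.config.hidden_size)

def steer(module, inputs, output):
    # GPT-2 blocks return a tuple whose first element is the hidden states;
    # shift them by the steering vector and pass the rest through unchanged.
    hidden_states = output[0] + alpha * steering_vector
    return (hidden_states,) + output[1:]

handle = model.transformer.h[layer_idx].register_forward_hook(steer)

prompt = "Review: the movie was wonderful. Sentiment:"
ids = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():  # no gradients are computed at any stage
    out = model.generate(**ids, max_new_tokens=3)
print(tokenizer.decode(out[0]))

handle.remove()  # detach the hook to restore the unsteered model
```

The hook-based design keeps the model weights untouched: steering is applied and removed per forward pass, which is what makes this family of methods cheap compared with fine-tuning.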
Cite
Liu, Yiting and Deng, Zhi-Hong. "Iterative Vectors: In-Context Gradient Steering Without Backpropagation." Proceedings of the 42nd International Conference on Machine Learning, 2025. https://mlanthology.org/icml/2025/liu2025icml-iterative/

BibTeX:
@inproceedings{liu2025icml-iterative,
title = {{Iterative Vectors: In-Context Gradient Steering Without Backpropagation}},
author = {Liu, Yiting and Deng, Zhi-Hong},
booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
year = {2025},
  pages = {38290--38312},
volume = {267},
url = {https://mlanthology.org/icml/2025/liu2025icml-iterative/}
}