Efficient Cross-Episode Meta-RL
Abstract
We introduce Efficient Cross-Episodic Transformers (ECET), a new algorithm for online Meta-Reinforcement Learning that addresses the challenge of enabling reinforcement learning agents to perform effectively on previously unseen tasks. We demonstrate how past episodes serve as a rich source of in-context information, which our model distills and applies to new contexts. Our learned algorithm outperforms the previous state of the art, meta-trains more efficiently, and generalizes significantly better. Experimental results across simulated tasks from the MuJoCo, Meta-World, and ManiSkill benchmarks show significant gains in learning efficiency and adaptability. Our approach enhances the agent's ability to generalize from limited data and paves the way for more robust and versatile AI systems.
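To make the idea of cross-episodic in-context conditioning concrete, here is a minimal sketch of how past episodes can be flattened into a single token sequence that a transformer policy could attend over. This is an illustrative assumption, not the authors' implementation: the tokenization scheme, `build_cross_episode_context` helper, and `max_len` truncation are all hypothetical.

```python
import numpy as np

def build_cross_episode_context(episodes, max_len=64):
    """Concatenate (state, action, reward) transitions from past
    episodes into one token sequence, oldest first, so a transformer
    policy can condition on them in-context.

    Hypothetical scheme: each transition becomes one token vector
    [state..., action..., reward]; only the most recent `max_len`
    tokens are kept.
    """
    tokens = []
    for episode in episodes:  # episodes ordered oldest -> newest
        for state, action, reward in episode:
            tokens.append(np.concatenate([state, action, [reward]]))
    if not tokens:
        return np.zeros((0, 0))
    seq = np.stack(tokens)
    return seq[-max_len:]  # truncate to the newest max_len tokens

# Toy usage: 2 episodes of 3 transitions each,
# with 2-dim states and 1-dim actions.
rng = np.random.default_rng(0)
episodes = [
    [(rng.standard_normal(2), rng.standard_normal(1), 1.0)
     for _ in range(3)]
    for _ in range(2)
]
context = build_cross_episode_context(episodes, max_len=4)
print(context.shape)  # (4, 4): 4 newest tokens, dim = 2 + 1 + 1
```

In an actual meta-RL loop, such a context would be refreshed after every episode so the policy's next rollout is conditioned on all prior experience with the current task.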
Cite
Text

Shala et al. "Efficient Cross-Episode Meta-RL." International Conference on Learning Representations, 2025.

Markdown

[Shala et al. "Efficient Cross-Episode Meta-RL." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/shala2025iclr-efficient/)

BibTeX
@inproceedings{shala2025iclr-efficient,
title = {{Efficient Cross-Episode Meta-RL}},
author = {Shala, Gresa and Biedenkapp, André and Krack, Pierre and Walter, Florian and Grabocka, Josif},
booktitle = {International Conference on Learning Representations},
year = {2025},
url = {https://mlanthology.org/iclr/2025/shala2025iclr-efficient/}
}