Masked Generative Priors Improve World Models Sequence Modelling Capabilities
Abstract
Deep Reinforcement Learning (RL) has become the leading approach for creating artificial agents in complex environments. Model-based approaches, which are RL methods with world models that predict environment dynamics, are among the most promising directions for improving data efficiency, forming a critical step toward bridging the gap between research and real-world deployment. In particular, world models enhance sample efficiency by learning in imagination, which involves training a generative sequence model of the environment in a self-supervised manner. Recently, Masked Generative Modelling has emerged as a more efficient and effective inductive bias for modelling and generating token sequences. Building on the Efficient Stochastic Transformer-based World Models (STORM) architecture, we replace the traditional MLP prior with a Masked Generative Prior (i.e., a MaskGIT prior) and introduce GIT-STORM. We evaluate our model on two downstream tasks: reinforcement learning and video prediction. GIT-STORM demonstrates substantial performance gains in RL tasks on the Atari 100k benchmark. Moreover, we apply Categorical Transformer-based World Models to continuous action environments for the first time, addressing a significant gap in prior research. To achieve this, we employ a state mixer function that integrates latent state representations with actions, enabling our model to handle continuous control tasks. We validate this approach through qualitative and quantitative analyses on the DeepMind Control Suite, showcasing the effectiveness of Transformer-based World Models in this new domain. Our results highlight the versatility and efficacy of the MaskGIT dynamics prior, paving the way for more accurate world models and effective RL policies.
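The abstract names two architectural pieces: a MaskGIT-style masked prior over the categorical latent tokens, and a state mixer that fuses the latent state with a (possibly continuous) action before the sequence model. The sketch below illustrates one plausible reading of these components; it is not the authors' released code, and all module names, sizes, and the masking scheme are illustrative assumptions.

```python
# Minimal sketch (assumed, not the authors' implementation) of a MaskGIT-style
# prior over categorical latent tokens and a state mixer for continuous actions.
import torch
import torch.nn as nn


class MaskedGenerativePrior(nn.Module):
    """Predicts categorical latent tokens from a partially masked token grid."""

    def __init__(self, num_tokens=16, vocab_size=32, d_model=128, n_layers=2):
        super().__init__()
        self.mask_id = vocab_size                      # extra index reserved for [MASK]
        self.embed = nn.Embedding(vocab_size + 1, d_model)
        self.pos = nn.Parameter(torch.zeros(1, num_tokens, d_model))
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens, mask):
        # tokens: (B, num_tokens) int64; mask: (B, num_tokens) bool, True = masked
        tokens = tokens.masked_fill(mask, self.mask_id)
        h = self.encoder(self.embed(tokens) + self.pos)
        return self.head(h)                            # logits over the latent vocabulary


class StateMixer(nn.Module):
    """Fuses a flattened categorical latent state with a continuous action vector."""

    def __init__(self, state_dim=16 * 32, action_dim=6, d_model=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + action_dim, d_model), nn.SiLU(),
            nn.Linear(d_model, d_model),
        )

    def forward(self, state, action):
        return self.net(torch.cat([state, action], dim=-1))


# Shape check for a single batch (all sizes are placeholder assumptions).
B, num_tokens, vocab = 8, 16, 32
tokens = torch.randint(0, vocab, (B, num_tokens))
mask = torch.rand(B, num_tokens) < 0.5                 # random masking ratio
logits = MaskedGenerativePrior(num_tokens, vocab)(tokens, mask)
mixed = StateMixer()(nn.functional.one_hot(tokens, vocab).float().flatten(1),
                     torch.randn(B, 6))
print(logits.shape, mixed.shape)                       # (8, 16, 32), (8, 128)
```

In this reading, the masked prior is trained to reconstruct the masked latent tokens (cross-entropy on the masked positions), while the mixed state-action embedding conditions the transformer sequence model; the exact conditioning and masking schedule used by GIT-STORM are described in the paper, not here.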
Cite
Text
Meo et al. "Masked Generative Priors Improve World Models Sequence Modelling Capabilities." ICLR 2025 Workshops: World_Models, 2025.
Markdown
[Meo et al. "Masked Generative Priors Improve World Models Sequence Modelling Capabilities." ICLR 2025 Workshops: World_Models, 2025.](https://mlanthology.org/iclrw/2025/meo2025iclrw-masked/)
BibTeX
@inproceedings{meo2025iclrw-masked,
title = {{Masked Generative Priors Improve World Models Sequence Modelling Capabilities}},
author = {Meo, Cristian and Lică, Mircea Tudor and Ikram, Zarif and Nakano, Akihiro and Shah, Vedant and Didolkar, Aniket Rajiv and Liu, Dianbo and Goyal, Anirudh and Dauwels, Justin},
booktitle = {ICLR 2025 Workshops: World_Models},
year = {2025},
url = {https://mlanthology.org/iclrw/2025/meo2025iclrw-masked/}
}