Learning Latent Dynamic Robust Representations for World Models

Abstract

Visual Model-Based Reinforcement Learning (MBRL) promises to encapsulate an agent's knowledge about the underlying dynamics of the environment, enabling a learned world model to serve as a useful planner. However, top MBRL agents such as Dreamer often struggle with visual pixel-based inputs in the presence of exogenous or irrelevant noise in the observation space, due to a failure to capture task-specific features while filtering out irrelevant spatio-temporal details. To tackle this problem, we apply a spatio-temporal masking strategy together with a bisimulation principle and latent reconstruction to capture endogenous, task-specific aspects of the environment for world models, effectively eliminating non-essential information. Joint training of representations, dynamics, and policy often leads to instabilities. To further address this issue, we develop a Hybrid Recurrent State-Space Model (HRSSM) structure, enhancing state representation robustness for effective policy learning. Our empirical evaluation demonstrates significant performance improvements over existing methods on a range of visually complex control tasks, such as Maniskill with exogenous distractors from the Matterport environment. Our code is available at https://github.com/bit1029public/HRSSM.
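The spatio-temporal masking the abstract mentions can be illustrated with a minimal sketch: randomly zeroing out cubes of a video clip (contiguous spans of frames crossed with spatial patches) so the world model must infer the masked content from task-relevant structure. The function below is a hypothetical illustration, assuming a simple cube-masking scheme over a `(T, H, W, C)` clip; the paper's exact masking strategy may differ.

```python
import numpy as np

def spatio_temporal_mask(frames, patch=8, span=2, mask_ratio=0.5, seed=None):
    """Zero out random spatio-temporal cubes of a video clip.

    frames: array of shape (T, H, W, C). The clip is tiled into cubes of
    `span` consecutive frames by `patch` x `patch` pixels, and a
    `mask_ratio` fraction of cubes is set to zero. Hypothetical sketch of
    cube masking, not the paper's exact implementation.
    """
    rng = np.random.default_rng(seed)
    T, H, W, _ = frames.shape
    t_cells, h_cells, w_cells = T // span, H // patch, W // patch
    n_cells = t_cells * h_cells * w_cells
    n_masked = int(mask_ratio * n_cells)

    masked = frames.copy()
    for i in rng.choice(n_cells, size=n_masked, replace=False):
        t, rest = divmod(i, h_cells * w_cells)   # which frame-span
        h, w = divmod(rest, w_cells)             # which spatial patch
        masked[t * span:(t + 1) * span,
               h * patch:(h + 1) * patch,
               w * patch:(w + 1) * patch] = 0.0
    return masked
```

In practice, a representation learned to predict (or stay consistent across) the masked regions in latent space, rather than in pixels, is what lets the model discard exogenous visual noise.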

Cite

Text

Sun et al. "Learning Latent Dynamic Robust Representations for World Models." International Conference on Machine Learning, 2024.

Markdown

[Sun et al. "Learning Latent Dynamic Robust Representations for World Models." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/sun2024icml-learning-a/)

BibTeX

@inproceedings{sun2024icml-learning-a,
  title     = {{Learning Latent Dynamic Robust Representations for World Models}},
  author    = {Sun, Ruixiang and Zang, Hongyu and Li, Xin and Islam, Riashat},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {47234--47260},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/sun2024icml-learning-a/}
}