Minimax: Efficient Baselines for Autocurricula in JAX

Abstract

Unsupervised environment design (UED) is a form of automatic curriculum learning for training robust decision-making agents to zero-shot transfer into unseen environments. Such autocurricula have received much interest from the RL community. However, UED experiments, based on CPU rollouts and GPU model updates, have often required several weeks of training. This compute requirement is a major obstacle to rapid innovation for the field. This work introduces the minimax library for UED training on accelerated hardware. Using JAX to implement fully-tensorized environments and autocurriculum algorithms, minimax allows the entire training loop to be compiled for hardware acceleration. To provide a petri dish for rapid experimentation, minimax includes a tensorized grid-world based on MiniGrid, in addition to reusable abstractions for conducting autocurricula in procedurally-generated environments. With these components, minimax provides strong UED baselines, including new parallelized variants, which achieve over 120× speedups in wall time compared to previous implementations when training with equal batch sizes. The minimax library is available under the Apache 2.0 license at https://github.com/facebookresearch/minimax.
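The core idea in the abstract, writing the environment as a pure JAX function over arrays so the entire rollout loop can be compiled and run on the accelerator, can be illustrated with a minimal sketch. The toy line-world environment, random policy, and all names below are illustrative assumptions, not minimax's actual API:

```python
# Minimal sketch of a fully-tensorized, fully-compiled rollout in JAX.
# A hypothetical toy environment stands in for minimax's tensorized
# grid-world; jit + lax.scan fuse the whole rollout into one program,
# so a large batch of environments steps in parallel with no CPU loop.
import jax
import jax.numpy as jnp

NUM_ENVS = 1024   # batch of environments, stepped as one tensor op
NUM_STEPS = 32    # rollout length


def env_step(state, action):
    """Toy tensorized environment: agents move left/right on a line."""
    next_state = state + jnp.where(action == 1, 1.0, -1.0)
    reward = -jnp.abs(next_state)  # reward for staying near the origin
    return next_state, reward


@jax.jit
def rollout(rng, init_state):
    def step_fn(carry, _):
        rng, state = carry
        rng, key = jax.random.split(rng)
        # A random policy stands in for an agent network here.
        action = jax.random.bernoulli(key, shape=state.shape).astype(jnp.int32)
        next_state, reward = env_step(state, action)
        return (rng, next_state), reward

    (_, final_state), rewards = jax.lax.scan(
        step_fn, (rng, init_state), None, length=NUM_STEPS
    )
    return final_state, rewards  # rewards: [NUM_STEPS, NUM_ENVS]


final_state, rewards = rollout(jax.random.PRNGKey(0), jnp.zeros(NUM_ENVS))
```

Because both the environment step and the policy are pure array functions, the same pattern extends to autocurriculum training: teacher and student updates can sit inside the compiled program rather than crossing a CPU-GPU boundary each step.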

Cite

Text

Jiang et al. "Minimax: Efficient Baselines for Autocurricula in JAX." NeurIPS 2023 Workshops: ALOE, 2023.

Markdown

[Jiang et al. "Minimax: Efficient Baselines for Autocurricula in JAX." NeurIPS 2023 Workshops: ALOE, 2023.](https://mlanthology.org/neuripsw/2023/jiang2023neuripsw-minimax/)

BibTeX

@inproceedings{jiang2023neuripsw-minimax,
  title     = {{Minimax: Efficient Baselines for Autocurricula in JAX}},
  author    = {Jiang, Minqi and Dennis, Michael D and Grefenstette, Edward and Rocktäschel, Tim},
  booktitle = {NeurIPS 2023 Workshops: ALOE},
  year      = {2023},
  url       = {https://mlanthology.org/neuripsw/2023/jiang2023neuripsw-minimax/}
}