Acquiring Diverse Skills Using Curriculum Reinforcement Learning with Mixture of Experts

Abstract

Reinforcement learning (RL) is a powerful approach for acquiring a well-performing policy. However, learning diverse skills is challenging in RL due to the commonly used Gaussian policy parameterization. We propose Diverse Skill Learning (Di-SkilL), an RL method for learning diverse skills using a Mixture of Experts, where each expert formalizes a skill as a contextual motion primitive. Di-SkilL optimizes each expert and its associated context distribution under a maximum entropy objective that incentivizes learning diverse skills in similar contexts. The per-expert context distributions enable automatic curriculum learning, allowing each expert to focus on the sub-region of the context space where it performs best. To overcome hard discontinuities and multi-modalities without any prior knowledge of the environment's unknown context probability space, we leverage energy-based models to represent the per-expert context distributions and demonstrate how to train them efficiently using the standard policy gradient objective. We show on challenging robot simulation tasks that Di-SkilL can learn diverse and performant skills.
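
The sketch below illustrates the core idea for a single expert, not the paper's implementation: an energy-based model scores contexts, contexts are drawn by self-normalized resampling from a uniform candidate pool, and both the expert (a contextual Gaussian over motion-primitive parameters) and its context distribution are updated with a plain REINFORCE-style policy gradient plus entropy bonuses. All names (EnergyNet, GaussianExpert, toy_reward), the 1-D context on [-1, 1], the toy reward, and the coefficients are illustrative assumptions; the paper's actual update rules and hyperparameters differ.

import torch
import torch.nn as nn

class EnergyNet(nn.Module):
    """Energy f(c) for one expert; p(c) is proportional to exp(f(c))."""
    def __init__(self, ctx_dim=1, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(ctx_dim, hidden), nn.Tanh(), nn.Linear(hidden, 1))

    def forward(self, c):
        return self.net(c).squeeze(-1)

class GaussianExpert(nn.Module):
    """Contextual Gaussian over motion-primitive parameters theta."""
    def __init__(self, ctx_dim=1, theta_dim=2, hidden=64):
        super().__init__()
        self.mean = nn.Sequential(
            nn.Linear(ctx_dim, hidden), nn.Tanh(), nn.Linear(hidden, theta_dim))
        self.log_std = nn.Parameter(torch.zeros(theta_dim))

    def dist(self, c):
        return torch.distributions.Normal(self.mean(c), self.log_std.exp())

def toy_reward(theta, c):
    # Placeholder episodic return: highest where theta matches the context.
    return -((theta - c).pow(2)).sum(-1)

energy, expert = EnergyNet(), GaussianExpert()
opt = torch.optim.Adam([*energy.parameters(), *expert.parameters()], lr=3e-4)

for step in range(2000):
    # Self-normalized sampling: score a uniform candidate pool with the EBM,
    # then resample contexts with probability proportional to exp(f(c)).
    pool = torch.rand(512, 1) * 2 - 1
    logits = energy(pool)
    idx = torch.distributions.Categorical(logits=logits).sample((64,))
    c = pool[idx]

    d = expert.dist(c)
    theta = d.sample()
    R = toy_reward(theta, c)

    # REINFORCE update for expert and context distribution jointly, with
    # entropy bonuses on both (maximizing context entropy = penalizing log p(c)).
    log_p_theta = d.log_prob(theta).sum(-1)
    log_p_c = logits[idx] - torch.logsumexp(logits, 0)
    adv = (R - R.mean()).detach()
    loss = -(adv * (log_p_theta + log_p_c)).mean() \
           - 1e-2 * d.entropy().sum(-1).mean() + 1e-2 * log_p_c.mean()
    opt.zero_grad(); loss.backward(); opt.step()

Over training, the energy network concentrates context samples on the region where this expert earns high reward, which is the curriculum effect the abstract describes; with several experts and a gating distribution, the entropy terms push different experts toward different context sub-regions and distinct skills.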

Cite

Text

Celik et al. "Acquiring Diverse Skills Using Curriculum Reinforcement Learning with Mixture of Experts." International Conference on Machine Learning, 2024.

Markdown

[Celik et al. "Acquiring Diverse Skills Using Curriculum Reinforcement Learning with Mixture of Experts." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/celik2024icml-acquiring/)

BibTeX

@inproceedings{celik2024icml-acquiring,
  title     = {{Acquiring Diverse Skills Using Curriculum Reinforcement Learning with Mixture of Experts}},
  author    = {Celik, Onur and Taranovic, Aleksandar and Neumann, Gerhard},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {5907--5933},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/celik2024icml-acquiring/}
}