RetroMoE: A Mixture-of-Experts Latent Translation Framework for Single-Step Retrosynthesis
Abstract
Single-step retrosynthesis is a crucial task in organic synthesis, where the objective is to identify the reactants needed to produce a given product. In recent years, a variety of machine learning methods have been developed to tackle retrosynthesis prediction. In this study, we introduce RetroMoE, a novel generative model designed for the single-step retrosynthesis task. We start with a non-symmetric variational autoencoder (VAE) that incorporates a graph encoder to map molecular graphs into a latent space, followed by a transformer decoder for precise prediction of molecular SMILES strings. Additionally, we implement a simple yet effective mixture-of-experts (MoE) network to translate the product latent embedding into the reactant latent embedding. To our knowledge, this is the first approach that frames single-step retrosynthesis as a latent translation problem. Extensive experiments on the USPTO-50K and USPTO-MIT datasets demonstrate the superiority of our method, which not only surpasses most semi-template-based and template-free methods but also delivers competitive results against template-based methods. Notably, under the class-known setting on USPTO-50K, our method achieves top-1 exact match accuracy comparable to that of the state-of-the-art template-based method, RetroKNN.
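The abstract frames single-step retrosynthesis as translation between latent embeddings. As a rough illustration of that idea only, the sketch below shows a generic mixture-of-experts translator in PyTorch that maps a product latent vector to a predicted reactant latent vector via a soft gate over small expert MLPs. All names, dimensions, the expert count, and the gating design are illustrative assumptions and are not taken from the paper; RetroMoE's actual architecture and training details are described in the full text.

# Hypothetical sketch of the latent-translation idea from the abstract:
# an MoE network maps a product latent embedding to a reactant latent
# embedding. Dimensions, expert count, and gating are assumptions.
import torch
import torch.nn as nn


class LatentMoETranslator(nn.Module):
    def __init__(self, latent_dim: int = 256, num_experts: int = 4, hidden_dim: int = 512):
        super().__init__()
        # Each expert is a small MLP acting on the latent space.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(latent_dim, hidden_dim),
                nn.ReLU(),
                nn.Linear(hidden_dim, latent_dim),
            )
            for _ in range(num_experts)
        )
        # Gating network assigns a soft weight to each expert per input.
        self.gate = nn.Linear(latent_dim, num_experts)

    def forward(self, z_product: torch.Tensor) -> torch.Tensor:
        # z_product: (batch, latent_dim) product embedding from a graph encoder.
        weights = torch.softmax(self.gate(z_product), dim=-1)                    # (batch, num_experts)
        expert_outs = torch.stack([e(z_product) for e in self.experts], dim=1)   # (batch, num_experts, latent_dim)
        # Weighted combination gives the predicted reactant embedding, which a
        # transformer decoder would then turn into reactant SMILES strings.
        return (weights.unsqueeze(-1) * expert_outs).sum(dim=1)


if __name__ == "__main__":
    moe = LatentMoETranslator()
    z_prod = torch.randn(8, 256)   # batch of product latent embeddings
    z_react = moe(z_prod)          # predicted reactant latent embeddings
    print(z_react.shape)           # torch.Size([8, 256])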
Cite
Text
Li and Verma. "RetroMoE: A Mixture-of-Experts Latent Translation Framework for Single-Step Retrosynthesis." International Joint Conference on Artificial Intelligence, 2025. doi:10.24963/IJCAI.2025/835
Markdown
[Li and Verma. "RetroMoE: A Mixture-of-Experts Latent Translation Framework for Single-Step Retrosynthesis." International Joint Conference on Artificial Intelligence, 2025.](https://mlanthology.org/ijcai/2025/li2025ijcai-retromoe/) doi:10.24963/IJCAI.2025/835
BibTeX
@inproceedings{li2025ijcai-retromoe,
title = {{RetroMoE: A Mixture-of-Experts Latent Translation Framework for Single-Step Retrosynthesis}},
author = {Li, Xinjie and Verma, Abhinav},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2025},
pages = {7509--7517},
doi = {10.24963/IJCAI.2025/835},
url = {https://mlanthology.org/ijcai/2025/li2025ijcai-retromoe/}
}