PathFlow: A Normalizing Flow Generator That Finds Transition Paths

Abstract

Sampling from a Boltzmann distribution to compute important macroscopic statistics is one of the central tasks in the study of large atomic and molecular systems. Recently, a one-shot configuration sampler, the Boltzmann generator [Noé et al., 2019], was introduced. Although a Boltzmann generator can directly generate independent metastable states, it cannot find transition pathways or describe the whole transition process. In this paper, we propose PathFlow, which functions both as a one-shot generator and as a transition pathfinder. More specifically, a normalizing flow model is constructed to simultaneously map the base distribution and a linearly interpolated path in the latent space to the Boltzmann distribution and a minimum (free) energy path in the configuration space. PathFlow can be trained with standard gradient-based optimizers using the proposed gradient estimator, which comes with a theoretical guarantee. PathFlow, validated on extensively studied examples including the synthetic Müller potential and alanine dipeptide, shows remarkable performance.
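
To make the core idea concrete, below is a minimal illustrative sketch, not the authors' implementation. It assumes a trained invertible flow, which is stood in for here by a hand-picked affine map (the names flow_forward and flow_inverse are hypothetical placeholders), and uses the standard Müller–Brown potential with approximate minima as the two metastable endpoints. Linearly interpolating between the endpoints' latent images and mapping the interpolation back through the flow gives a candidate transition path; PathFlow trains the flow so that this path approaches a minimum (free) energy path.

# Illustrative sketch only: a PathFlow-style path evaluation with a placeholder flow.
import numpy as np

def muller_potential(x, y):
    # Mueller-Brown potential, a standard 2-D test energy surface.
    A  = (-200.0, -100.0, -170.0, 15.0)
    a  = (-1.0, -1.0, -6.5, 0.7)
    b  = (0.0, 0.0, 11.0, 0.6)
    c  = (-10.0, -10.0, -6.5, 0.7)
    x0 = (1.0, 0.0, -0.5, -1.0)
    y0 = (0.0, 0.5, 1.5, 1.0)
    v = 0.0
    for k in range(4):
        dx, dy = x - x0[k], y - y0[k]
        v += A[k] * np.exp(a[k] * dx**2 + b[k] * dx * dy + c[k] * dy**2)
    return v

# Placeholder for a trained normalizing flow: in PathFlow this map is learned,
# an affine transform is used here purely so the sketch runs end to end.
W = np.array([[1.3, 0.2], [-0.1, 0.9]])
bvec = np.array([0.1, -0.2])

def flow_forward(z):   # latent z -> configuration x
    return z @ W.T + bvec

def flow_inverse(x):   # configuration x -> latent z
    return (x - bvec) @ np.linalg.inv(W).T

# Two metastable states (approximate minima of the Mueller-Brown potential).
x_A = np.array([-0.558, 1.442])
x_B = np.array([0.623, 0.028])

# Linear interpolation between the endpoints' images in latent space ...
z_A, z_B = flow_inverse(x_A), flow_inverse(x_B)
ts = np.linspace(0.0, 1.0, 50)[:, None]
z_path = (1.0 - ts) * z_A + ts * z_B

# ... mapped back through the flow gives a candidate transition path whose
# energy profile can be inspected; a learned flow would bend this path toward
# a minimum (free) energy path.
x_path = flow_forward(z_path)
energies = np.array([muller_potential(px, py) for px, py in x_path])
print("barrier along candidate path:", energies.max() - energies[0])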

Cite

Text

Liu et al. "PathFlow: A Normalizing Flow Generator That Finds Transition Paths." Uncertainty in Artificial Intelligence, 2022.

Markdown

[Liu et al. "PathFlow: A Normalizing Flow Generator That Finds Transition Paths." Uncertainty in Artificial Intelligence, 2022.](https://mlanthology.org/uai/2022/liu2022uai-pathflow/)

BibTeX

@inproceedings{liu2022uai-pathflow,
  title     = {{PathFlow: A Normalizing Flow Generator That Finds Transition Paths}},
  author    = {Liu, Tianyi and Gao, Weihao and Wang, Zhirui and Wang, Chong},
  booktitle = {Uncertainty in Artificial Intelligence},
  year      = {2022},
  pages     = {1232--1242},
  volume    = {180},
  url       = {https://mlanthology.org/uai/2022/liu2022uai-pathflow/}
}