MGF: Mixed Gaussian Flow for Diverse Trajectory Prediction

Abstract

For future trajectory prediction, normalizing flows with a standard Gaussian prior suffer from weak diversity. The ineffectiveness stems from a mismatch: the distribution of likely outcomes is asymmetric and multi-modal, whereas the standard prior and the supervision losses are symmetric and single-modal. Instead, we propose constructing a mixed Gaussian prior for a normalizing flow model for trajectory prediction. The prior is constructed by analyzing trajectory patterns in the training samples without requiring extra annotations, and it is multi-modal, asymmetric, and more expressive. Besides diversity, it also provides better controllability for probabilistic trajectory generation. We name our method Mixed Gaussian Flow (MGF). It achieves state-of-the-art performance on both trajectory alignment and diversity metrics on the popular UCY/ETH and SDD datasets. Code is available at https://github.com/mulplue/MGF.
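As an illustrative sketch only (not the authors' implementation; see the linked repository for the actual MGF code), the snippet below shows what sampling latent codes from a mixed Gaussian prior can look like in PyTorch. The number of components `K`, the placeholder means, and the scales are assumptions for illustration; per the abstract, the real component parameters would be derived from trajectory patterns in the training data.

```python
import torch
from torch.distributions import Categorical, Independent, MixtureSameFamily, Normal

# Hypothetical 2-D mixture-of-Gaussians prior with K components.
# In MGF, the components would reflect trajectory patterns mined from
# training samples; here they are random placeholders.
K, D = 6, 2
weights = torch.full((K,), 1.0 / K)   # uniform mixing weights (assumption)
means = torch.randn(K, D)             # placeholder pattern centers
scales = torch.full((K, D), 0.5)      # per-component standard deviations

prior = MixtureSameFamily(
    Categorical(probs=weights),
    Independent(Normal(means, scales), 1),
)

# Draw latent codes z from the multi-modal, asymmetric prior; a normalizing
# flow would then map each z to a future trajectory conditioned on history.
z = prior.sample((20,))               # 20 diverse latent samples
print(z.shape)                        # torch.Size([20, 2])
```

Sampling from a mixture rather than a single Gaussian is what gives the generated latents multiple modes; selecting or reweighting components also hints at the controllability mentioned in the abstract.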

Cite

Text

Chen et al. "MGF: Mixed Gaussian Flow for Diverse Trajectory Prediction." Neural Information Processing Systems, 2024. doi:10.52202/079017-1834

Markdown

[Chen et al. "MGF: Mixed Gaussian Flow for Diverse Trajectory Prediction." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/chen2024neurips-mgf/) doi:10.52202/079017-1834

BibTeX

@inproceedings{chen2024neurips-mgf,
  title     = {{MGF: Mixed Gaussian Flow for Diverse Trajectory Prediction}},
  author    = {Chen, Jiahe and Cao, Jinkun and Lin, Dahua and Kitani, Kris and Pang, Jiangmiao},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-1834},
  url       = {https://mlanthology.org/neurips/2024/chen2024neurips-mgf/}
}