Aligning Human Motion Generation with Human Perceptions

Abstract

Human motion generation is a critical task with a wide spectrum of applications. Achieving high realism in generated motions requires naturalness, smoothness, and plausibility. However, current evaluation metrics often rely on simple heuristics or distribution distances and do not align well with human perceptions. In this work, we propose a data-driven approach to bridge this gap by introducing a large-scale human perceptual evaluation dataset, MotionPercept, and a human motion critic model, MotionCritic, which together capture human perceptual preferences. Our critic model offers a more accurate metric for assessing motion quality and can be readily integrated into the motion generation pipeline to enhance generation quality. Extensive experiments demonstrate the effectiveness of our approach in both evaluating and improving the quality of generated human motions by aligning with human perceptions. Code and data are publicly available at https://motioncritic.github.io/.

Cite

Text

Wang et al. "Aligning Human Motion Generation with Human Perceptions." International Conference on Learning Representations, 2025.

Markdown

[Wang et al. "Aligning Human Motion Generation with Human Perceptions." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/wang2025iclr-aligning/)

BibTeX

@inproceedings{wang2025iclr-aligning,
  title     = {{Aligning Human Motion Generation with Human Perceptions}},
  author    = {Wang, Haoru and Zhu, Wentao and Miao, Luyi and Xu, Yishu and Gao, Feng and Tian, Qi and Wang, Yizhou},
  booktitle = {International Conference on Learning Representations},
  year      = {2025},
  url       = {https://mlanthology.org/iclr/2025/wang2025iclr-aligning/}
}