H-MoRe: Learning Human-Centric Motion Representation for Action Analysis

Abstract

In this paper, we propose H-MoRe, a novel pipeline for learning precise human-centric motion representation. Our approach dynamically preserves relevant human motion while filtering out background movement. Notably, unlike previous methods relying on fully supervised learning from synthetic data, H-MoRe learns directly from real-world scenarios in a self-supervised manner, incorporating both human pose and body shape information. Inspired by kinematics, H-MoRe represents absolute and relative movements of each body point in a matrix format that captures nuanced motion details, termed world-local flows. H-MoRe offers refined insights into human motion, which can be integrated seamlessly into various action-related applications. Experimental results demonstrate that H-MoRe brings substantial improvements across downstream tasks, including gait recognition (CL@R1: +16.01%), action recognition (Acc@1: +8.92%), and video generation (FVD: -67.07%). Additionally, H-MoRe exhibits high inference efficiency (34 fps), making it suitable for most real-time scenarios. Models and code will be released upon publication.
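The abstract describes world-local flows only at a high level: a per-body-point pairing of absolute motion with motion relative to the body itself. The sketch below is a minimal, illustrative interpretation of that idea, not the paper's actual formulation; the function name, the use of the body centroid as the local reference, and the (N, 4) layout are all assumptions made for demonstration.

```python
import numpy as np

def world_local_flows(points_t0, points_t1):
    """Toy illustration of paired world/local flows for tracked body points.

    points_t0, points_t1: (N, 2) arrays holding the same N body points at
    consecutive frames. The "world" flow is each point's absolute
    displacement; the "local" flow here is measured relative to the motion
    of the body centroid (a stand-in for the body-relative reference,
    whose exact definition in H-MoRe may differ).
    """
    world = points_t1 - points_t0                        # absolute per-point motion
    centroid_motion = world.mean(axis=0, keepdims=True)  # overall body translation
    local = world - centroid_motion                      # motion relative to the body
    # Stack into an (N, 4) matrix: one row per body point,
    # columns = [world_dx, world_dy, local_dx, local_dy].
    return np.concatenate([world, local], axis=1)

# Example: the whole body shifts right while the second point also swings.
p0 = np.array([[0.0, 0.0], [1.0, 0.0]])
p1 = np.array([[0.5, 0.0], [2.0, 0.5]])
print(world_local_flows(p0, p1))
```

In this toy version, the local component isolates limb-level movement (e.g., the swing of an arm) from whole-body translation, which is the intuition the abstract attributes to the world-local representation.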

Cite

Text

Huang et al. "H-MoRe: Learning Human-Centric Motion Representation for Action Analysis." Conference on Computer Vision and Pattern Recognition, 2025. doi:10.1109/CVPR52734.2025.02114

Markdown

[Huang et al. "H-MoRe: Learning Human-Centric Motion Representation for Action Analysis." Conference on Computer Vision and Pattern Recognition, 2025.](https://mlanthology.org/cvpr/2025/huang2025cvpr-hmore/) doi:10.1109/CVPR52734.2025.02114

BibTeX

@inproceedings{huang2025cvpr-hmore,
  title     = {{H-MoRe: Learning Human-Centric Motion Representation for Action Analysis}},
  author    = {Huang, Zhanbo and Liu, Xiaoming and Kong, Yu},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2025},
  pages     = {22702--22713},
  doi       = {10.1109/CVPR52734.2025.02114},
  url       = {https://mlanthology.org/cvpr/2025/huang2025cvpr-hmore/}
}