Loose Inertial Poser: Motion Capture with IMU-Attached Loose-Wear Jacket
Abstract
Existing wearable motion capture methods typically demand tight on-body fixation (often using straps) for reliable sensing, limiting their application in everyday life. In this paper, we introduce Loose Inertial Poser, a novel motion capture solution that offers high wearing comfort by integrating four Inertial Measurement Units (IMUs) into a loose-wear jacket. Specifically, we address the challenge of scarce loose-wear IMU training data by proposing a Secondary Motion AutoEncoder (SeMo-AE) that learns to model and synthesize the effects of secondary motion between the skin and loose clothing on IMU data. SeMo-AE is leveraged to generate a diverse synthetic dataset of loose-wear IMU data, which augments training of the pose estimation network and significantly improves its accuracy. For validation, we collected a dataset with various subjects and two wearing styles (zipped and unzipped). Experimental results demonstrate that our approach maintains high-quality real-time pose estimation even in loose-wear scenarios.
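The augmentation idea in the abstract — learn the skin-to-cloth secondary-motion effect and resample it to synthesize loose-wear IMU readings from tight-wear ones — can be illustrated with a toy sketch. All names, dimensions, and the linear encoder/decoder stand-ins below are assumptions for illustration, not the paper's actual SeMo-AE architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layout: 4 IMUs x (3-axis accel + 3-axis gyro) per frame.
FRAME_DIM = 24
LATENT_DIM = 8

# Placeholder linear "autoencoder" weights (a trained SeMo-AE would
# learn these from paired tight-/loose-wear recordings).
W_enc = rng.normal(0.0, 0.1, (FRAME_DIM, LATENT_DIM))
W_dec = rng.normal(0.0, 0.1, (LATENT_DIM, FRAME_DIM))

def synthesize_loose_wear(tight_imu, noise_scale=0.5, n_variants=4):
    """Encode a tight-wear IMU frame, jitter the latent code to mimic
    cloth secondary motion, and decode additive offsets to produce
    diverse synthetic loose-wear frames."""
    z = tight_imu @ W_enc
    variants = []
    for _ in range(n_variants):
        z_noisy = z + rng.normal(0.0, noise_scale, z.shape)
        offset = z_noisy @ W_dec  # synthesized secondary-motion offset
        variants.append(tight_imu + offset)
    return np.stack(variants)

tight_frame = rng.normal(0.0, 1.0, FRAME_DIM)
augmented = synthesize_loose_wear(tight_frame)
print(augmented.shape)  # (4, 24)
```

Each synthetic variant could then be added to the pose-estimation network's training set, which is the role the abstract assigns to SeMo-AE's generated data.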
Cite
Text
Zuo et al. "Loose Inertial Poser: Motion Capture with IMU-Attached Loose-Wear Jacket." Conference on Computer Vision and Pattern Recognition, 2024. doi:10.1109/CVPR52733.2024.00215
Markdown
[Zuo et al. "Loose Inertial Poser: Motion Capture with IMU-Attached Loose-Wear Jacket." Conference on Computer Vision and Pattern Recognition, 2024.](https://mlanthology.org/cvpr/2024/zuo2024cvpr-loose/) doi:10.1109/CVPR52733.2024.00215
BibTeX
@inproceedings{zuo2024cvpr-loose,
title = {{Loose Inertial Poser: Motion Capture with IMU-Attached Loose-Wear Jacket}},
author = {Zuo, Chengxu and Wang, Yiming and Zhan, Lishuang and Guo, Shihui and Yi, Xinyu and Xu, Feng and Qin, Yipeng},
booktitle = {Conference on Computer Vision and Pattern Recognition},
year = {2024},
  pages = {2209--2219},
doi = {10.1109/CVPR52733.2024.00215},
url = {https://mlanthology.org/cvpr/2024/zuo2024cvpr-loose/}
}