Style-ERD: Responsive and Coherent Online Motion Style Transfer

Abstract

Motion style transfer is a common method for enriching character animation. Existing style transfer algorithms are often designed for offline settings where motions are processed in segments. However, for online animation applications, such as real-time avatar animation from motion capture, motions must be processed as a stream with minimal latency. In this work, we realize a flexible, high-quality motion style transfer method for this setting. We propose a novel style transfer model, Style-ERD, to stylize motions in an online manner with an Encoder-Recurrent-Decoder structure, along with a novel discriminator that combines feature attention and temporal attention. Our method stylizes motions into multiple target styles with a unified model. Although our method targets online settings, it outperforms previous offline methods in motion realism and style expressiveness and provides significant gains in runtime efficiency.
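The abstract only names the high-level components. As a concrete illustration of the Encoder-Recurrent-Decoder idea applied to streaming motion, below is a minimal PyTorch sketch. The pose dimension, style embedding, GRU recurrent core, and all module names are assumptions made for illustration; this is not the paper's actual architecture, which also pairs the generator with the attention-based discriminator described above.

# Minimal sketch of an Encoder-Recurrent-Decoder stylizer for streaming motion.
# Assumptions (not from the paper): pose_dim-sized per-frame pose vectors,
# a learned style embedding, and a GRU as the recurrent core.
import torch
import torch.nn as nn


class StyleERDSketch(nn.Module):
    def __init__(self, pose_dim=69, num_styles=8, style_dim=16, hidden_dim=256):
        super().__init__()
        self.style_emb = nn.Embedding(num_styles, style_dim)
        # Encoder: per-frame pose + target-style embedding -> latent feature
        self.encoder = nn.Sequential(
            nn.Linear(pose_dim + style_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        # Recurrent core carries temporal state across the incoming stream
        self.recurrent = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        # Decoder: latent feature -> stylized pose for the current frame
        self.decoder = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, pose_dim),
        )

    def forward(self, pose, style_id, hidden=None):
        """Stylize one incoming frame; `hidden` carries state between calls."""
        style = self.style_emb(style_id)               # (batch, style_dim)
        x = torch.cat([pose, style], dim=-1)           # (batch, pose_dim + style_dim)
        z = self.encoder(x).unsqueeze(1)               # (batch, 1, hidden_dim)
        z, hidden = self.recurrent(z, hidden)
        out = self.decoder(z.squeeze(1))               # (batch, pose_dim)
        return out, hidden


# Streaming usage: feed frames one at a time, reusing the recurrent state,
# so each frame is stylized with minimal latency.
model = StyleERDSketch()
hidden = None
for _ in range(4):                                     # stand-in for a motion stream
    frame = torch.randn(1, 69)
    style = torch.tensor([2])
    stylized, hidden = model(frame, style, hidden)

Because the recurrent state is the only information carried between frames, the model can stylize an unbounded stream without buffering future frames, which is what distinguishes this online setting from segment-based offline processing.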

Cite

Text

Tao et al. "Style-ERD: Responsive and Coherent Online Motion Style Transfer." Conference on Computer Vision and Pattern Recognition, 2022. doi:10.1109/CVPR52688.2022.00648

Markdown

[Tao et al. "Style-ERD: Responsive and Coherent Online Motion Style Transfer." Conference on Computer Vision and Pattern Recognition, 2022.](https://mlanthology.org/cvpr/2022/tao2022cvpr-styleerd/) doi:10.1109/CVPR52688.2022.00648

BibTeX

@inproceedings{tao2022cvpr-styleerd,
  title     = {{Style-ERD: Responsive and Coherent Online Motion Style Transfer}},
  author    = {Tao, Tianxin and Zhan, Xiaohang and Chen, Zhongquan and van de Panne, Michiel},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2022},
  pages     = {6593--6603},
  doi       = {10.1109/CVPR52688.2022.00648},
  url       = {https://mlanthology.org/cvpr/2022/tao2022cvpr-styleerd/}
}