Minimum Latency Deep Online Video Stabilization

Abstract

We present a novel camera path optimization framework for the task of online video stabilization. Typically, a stabilization pipeline consists of three steps: motion estimation, path smoothing, and novel view rendering. Most previous methods concentrate on motion estimation, proposing various global or local motion models. In contrast, path optimization receives relatively little attention, especially in the important online setting, where no future frames are available. In this work, we adopt recent off-the-shelf high-quality deep motion models for motion estimation to recover the camera trajectory and focus on the latter two steps. Our network takes a short 2D camera path in a sliding window as input and outputs the stabilizing warp field of the last frame in the window, which warps the incoming frame to its stabilized position. We design a hybrid loss to enforce spatial and temporal consistency. In addition, we build a motion dataset that contains stable and unstable motion pairs for training. Extensive experiments demonstrate that our approach significantly outperforms state-of-the-art online methods both qualitatively and quantitatively and achieves comparable performance to offline methods.
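To illustrate the sliding-window scheme the abstract describes, here is a minimal, hypothetical sketch: a buffer of recent 2D camera positions is maintained, and only the latest frame is stabilized (hence minimum latency). The paper's method predicts a learned per-frame warp field; in this sketch a simple moving average stands in for the network, producing a single translational offset per frame. The window size and all function names are illustrative assumptions, not the authors' implementation.

```python
from collections import deque

WINDOW = 5  # assumed sliding-window length; the paper's actual value may differ


def stabilize_stream(path):
    """Yield (smoothed_position, warp_offset) for each incoming 2D position.

    `path` is an iterable of (x, y) camera positions from motion estimation.
    The warp offset moves the incoming frame to its stabilized position.
    A moving average stands in for the learned warp-prediction network.
    """
    window = deque(maxlen=WINDOW)
    for x, y in path:
        window.append((x, y))
        # Stand-in for the network: average the positions in the window.
        sx = sum(p[0] for p in window) / len(window)
        sy = sum(p[1] for p in window) / len(window)
        # Offset that maps the raw frame to the smoothed camera position.
        yield (sx, sy), (sx - x, sy - y)


# Example: a jittery horizontal pan, stabilized frame by frame as it arrives.
shaky = [(0, 0), (1, 0.5), (2, -0.4), (3, 0.6), (4, -0.5), (5, 0.4)]
results = list(stabilize_stream(shaky))
```

Because each frame is warped as soon as it arrives, no future frames are needed, which is the defining constraint of the online setting.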

Cite

Text

Zhang et al. "Minimum Latency Deep Online Video Stabilization." International Conference on Computer Vision, 2023. doi:10.1109/ICCV51070.2023.02105

Markdown

[Zhang et al. "Minimum Latency Deep Online Video Stabilization." International Conference on Computer Vision, 2023.](https://mlanthology.org/iccv/2023/zhang2023iccv-minimum/) doi:10.1109/ICCV51070.2023.02105

BibTeX

@inproceedings{zhang2023iccv-minimum,
  title     = {{Minimum Latency Deep Online Video Stabilization}},
  author    = {Zhang, Zhuofan and Liu, Zhen and Tan, Ping and Zeng, Bing and Liu, Shuaicheng},
  booktitle = {International Conference on Computer Vision},
  year      = {2023},
  pages     = {23030--23039},
  doi       = {10.1109/ICCV51070.2023.02105},
  url       = {https://mlanthology.org/iccv/2023/zhang2023iccv-minimum/}
}