Video Interpolation by Event-Driven Anisotropic Adjustment of Optical Flow

Abstract

Video frame interpolation is a challenging task due to the ever-changing nature of real-world scenes. Previous methods often calculate bi-directional optical flows and then predict the intermediate optical flows under a linear-motion assumption, leading to isotropic intermediate flow generation. Follow-up research achieved anisotropic adjustment by estimating higher-order motion information from extra frames, but because such methods still rely on motion assumptions, they struggle to model the complicated motion of real scenes. In this paper, we propose A^2OF, an end-to-end training method for video frame interpolation with event-driven Anisotropic Adjustment of Optical Flows. Specifically, we use events to generate optical flow distribution masks for the intermediate optical flow, which can model the complicated motion between two frames. Our proposed method outperforms previous methods in video frame interpolation, advancing supervised event-based video interpolation.
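
To make the contrast in the abstract concrete, here is a minimal NumPy sketch (not the authors' implementation) comparing the isotropic linear-motion baseline with a hypothetical event-driven per-pixel mask. The mask construction from event counts (`events_before_t / events_total`) is an illustrative assumption, not the paper's exact formulation of optical flow distribution masks.

```python
import numpy as np

def linear_intermediate_flow(flow_0to1, t):
    """Isotropic baseline: scale the bi-directional flow by the scalar t,
    i.e. the linear-motion assumption used by earlier interpolation methods."""
    flow_t_to_0 = -t * flow_0to1
    flow_t_to_1 = (1.0 - t) * flow_0to1
    return flow_t_to_0, flow_t_to_1

def event_mask_intermediate_flow(flow_0to1, events_before_t, events_total, eps=1e-6):
    """Anisotropic variant: replace the scalar t with a per-pixel mask derived
    from event counts, so pixels where motion happens early receive a larger
    share of the flow. `events_before_t` and `events_total` are hypothetical
    (H, W) per-pixel event-count maps; this is a sketch, not the paper's method."""
    mask = events_before_t / (events_total + eps)  # (H, W), values in [0, 1]
    mask = mask[..., None]                         # broadcast over the 2 flow channels
    flow_t_to_0 = -mask * flow_0to1
    flow_t_to_1 = (1.0 - mask) * flow_0to1
    return flow_t_to_0, flow_t_to_1

# Hypothetical usage with random data: flow has shape (H, W, 2).
H, W = 4, 4
flow = np.random.randn(H, W, 2).astype(np.float32)
ev_before = np.random.randint(0, 5, (H, W)).astype(np.float32)
ev_total = ev_before + np.random.randint(0, 5, (H, W)).astype(np.float32)
f_t0, f_t1 = event_mask_intermediate_flow(flow, ev_before, ev_total)
```

Under this reading, the isotropic baseline applies the same interpolation coefficient everywhere, while the event-driven mask lets each pixel take its own coefficient, which is what allows non-linear, spatially varying motion between the two frames to be modeled.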

Cite

Text

Wu et al. "Video Interpolation by Event-Driven Anisotropic Adjustment of Optical Flow." Proceedings of the European Conference on Computer Vision (ECCV), 2022. doi:10.1007/978-3-031-20071-7_16

Markdown

[Wu et al. "Video Interpolation by Event-Driven Anisotropic Adjustment of Optical Flow." Proceedings of the European Conference on Computer Vision (ECCV), 2022.](https://mlanthology.org/eccv/2022/wu2022eccv-video/) doi:10.1007/978-3-031-20071-7_16

BibTeX

@inproceedings{wu2022eccv-video,
  title     = {{Video Interpolation by Event-Driven Anisotropic Adjustment of Optical Flow}},
  author    = {Wu, Song and You, Kaichao and He, Weihua and Yang, Chen and Tian, Yang and Wang, Yaoyuan and Zhang, Ziyang and Liao, Jianxing},
  booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
  year      = {2022},
  doi       = {10.1007/978-3-031-20071-7_16},
  url       = {https://mlanthology.org/eccv/2022/wu2022eccv-video/}
}