Parametric Object Motion from Blur
Abstract
Motion blur can adversely affect a number of vision tasks, hence it is generally considered a nuisance. We instead treat motion blur as a useful signal that allows us to compute the motion of objects from a single image. Drawing on the success of joint segmentation and parametric motion models in the context of optical flow estimation, we propose a parametric object motion model combined with a segmentation mask to exploit localized, non-uniform motion blur. Our parametric image formation model is differentiable w.r.t. the motion parameters, which enables us to generalize marginal-likelihood techniques from uniform blind deblurring to localized, non-uniform blur. A two-stage pipeline, first in derivative space and then in image space, allows us to estimate both parametric object motion and a motion segmentation from a single image alone. Our experiments demonstrate its ability to cope with very challenging cases of object motion blur.
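The localized, non-uniform image formation sketched in the abstract can be illustrated with a toy model: the blurred image is the temporal average of the object region displaced along a parametric motion path, composited over the static background via the segmentation mask. The sketch below is a deliberately simplified approximation (purely translational motion, nearest-neighbor shifts, hypothetical function and parameter names), not the paper's differentiable affine formulation.

```python
import numpy as np

def parametric_blur(img, mask, theta, steps=16):
    """Toy localized motion-blur synthesis (illustrative only).

    img   : 2-D grayscale image (static background with the sharp object).
    mask  : 2-D object segmentation mask in [0, 1].
    theta : (dx, dy) translational object displacement over the exposure.
            (The paper uses a richer parametric motion model.)
    """
    acc = np.zeros_like(img, dtype=float)   # accumulated shifted object
    macc = np.zeros_like(mask, dtype=float) # accumulated shifted mask
    # Average the object over the exposure interval t in [-1/2, 1/2].
    for t in np.linspace(-0.5, 0.5, steps):
        sx, sy = int(round(theta[0] * t)), int(round(theta[1] * t))
        # np.roll wraps at the borders; acceptable for this toy example
        # as long as the object stays away from the image boundary.
        acc += np.roll(np.roll(img, sy, axis=0), sx, axis=1)
        macc += np.roll(np.roll(mask, sy, axis=0), sx, axis=1)
    acc /= steps
    macc /= steps
    # Composite: motion-blurred object over the static background.
    return macc * acc + (1.0 - macc) * img
```

Running this on a bright square with a horizontal displacement smears the square along the motion direction while leaving the unmasked background untouched, which is the "localized, non-uniform blur" the abstract refers to.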
Cite
Text
Gast et al. "Parametric Object Motion from Blur." Conference on Computer Vision and Pattern Recognition, 2016. doi:10.1109/CVPR.2016.204

Markdown

[Gast et al. "Parametric Object Motion from Blur." Conference on Computer Vision and Pattern Recognition, 2016.](https://mlanthology.org/cvpr/2016/gast2016cvpr-parametric/) doi:10.1109/CVPR.2016.204

BibTeX
@inproceedings{gast2016cvpr-parametric,
title = {{Parametric Object Motion from Blur}},
author = {Gast, Jochen and Sellent, Anita and Roth, Stefan},
booktitle = {Conference on Computer Vision and Pattern Recognition},
year = {2016},
doi = {10.1109/CVPR.2016.204},
url = {https://mlanthology.org/cvpr/2016/gast2016cvpr-parametric/}
}