Robust Egomotion Estimation from Affine Motion Parallax
Abstract
A method is described for determining the motion of a camera from its image velocities that is insensitive to noise and to the camera's intrinsic parameters. The algorithm is based on a novel extension of motion parallax which does not require the instantaneous alignment of features, but uses sparse visual motion estimates to extract the direction of translation of the camera directly, after which the camera rotation and the depths of the image features follow easily. A method for calculating the expected uncertainty in the estimates is also described, which allows optimal estimation and can also detect and reject independent motion and false correspondences. Experiments using small-perturbation analysis compare favourably with existing methods, specifically the Fundamental Matrix method.
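The core geometric idea the abstract relies on can be illustrated with a minimal sketch (not the paper's algorithm, and all names here are hypothetical): each motion-parallax vector at an image point is collinear with the line from that point to the epipole (the projection of the translation direction), so each feature yields one linear constraint on the epipole, which can then be recovered by least squares.

```python
import numpy as np

def estimate_epipole(points, parallax):
    """Estimate the epipole e from parallax vectors (illustrative sketch only).

    Each parallax vector dv at image point p should be collinear with (p - e).
    The 2D cross product gives one linear constraint per feature:
        dvx*(ey - py) - dvy*(ex - px) = 0
    which rearranges to the linear system  [-dvy, dvx] @ e = dvx*py - dvy*px.
    """
    A = np.column_stack([-parallax[:, 1], parallax[:, 0]])
    b = parallax[:, 0] * points[:, 1] - parallax[:, 1] * points[:, 0]
    e, *_ = np.linalg.lstsq(A, b, rcond=None)
    return e
```

With the epipole (translation direction) fixed, the remaining rotation and feature depths reduce to a simpler estimation problem, which is the structure the paper exploits.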
Cite
Text
Lawn and Cipolla. "Robust Egomotion Estimation from Affine Motion Parallax." European Conference on Computer Vision, 1994. doi:10.1007/3-540-57956-7_24

Markdown
[Lawn and Cipolla. "Robust Egomotion Estimation from Affine Motion Parallax." European Conference on Computer Vision, 1994.](https://mlanthology.org/eccv/1994/lawn1994eccv-robust/) doi:10.1007/3-540-57956-7_24

BibTeX
@inproceedings{lawn1994eccv-robust,
title = {{Robust Egomotion Estimation from Affine Motion Parallax}},
author = {Lawn, Jonathan M. and Cipolla, Roberto},
booktitle = {European Conference on Computer Vision},
year = {1994},
  pages = {205--210},
doi = {10.1007/3-540-57956-7_24},
url = {https://mlanthology.org/eccv/1994/lawn1994eccv-robust/}
}