Self-Calibration from Multiple Views with a Rotating Camera
Abstract
A new practical method is given for the self-calibration of a camera. At least three images are taken from the same point in space with different orientations of the camera, and the calibration is computed from an analysis of point matches between the images. The method requires no knowledge of the camera's orientations; calibration is based on the image correspondences alone. This method differs fundamentally from previous results by Maybank and Faugeras on self-calibration using the epipolar structure of image pairs: in the method of this paper there is no epipolar structure, since all images are taken from the same point in space. For the same reason, determining point matches is considerably easier than for images taken with a moving camera, because problems of occlusion and changes of aspect or illumination do not occur. The calibration method is evaluated on several sets of synthetic and real image data.
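The relation underlying this kind of pure-rotation self-calibration (standard in the multiple-view geometry literature, not spelled out in the abstract) can be sketched as follows: with calibration matrix $K$ and an inter-image rotation $R$, matched points satisfy a homography $H = K R K^{-1}$, which yields linear constraints on $KK^\top$:

```latex
% Two views from the same centre, related by rotation R:
%   x' \sim K R K^{-1} x  =>  H = K R K^{-1}  (normalised so det H = 1).
% Since R R^T = I, the symmetric matrix \omega = K K^T is invariant:
%   H \omega H^T = K R K^{-1} (K K^T) K^{-T} R^T K^T = K K^T = \omega.
% Each homography H_{1j} (j = 2, ..., m) gives linear equations in the
% entries of \omega; with at least three images (two independent
% rotations) \omega is determined, and K follows by Cholesky
% factorisation of \omega = K K^T.
\begin{align}
  H_{1j} &= K R_{1j} K^{-1}, \qquad \det H_{1j} = 1, \\
  H_{1j}\,\omega\,H_{1j}^{\top} &= \omega, \qquad \omega = K K^{\top}.
\end{align}
```

This is only a sketch of the algebraic core; the paper itself addresses the practical estimation of the homographies from point matches and the numerical solution.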
Cite
Text
Hartley. "Self-Calibration from Multiple Views with a Rotating Camera." European Conference on Computer Vision, 1994. doi:10.1007/3-540-57956-7_52
Markdown
[Hartley. "Self-Calibration from Multiple Views with a Rotating Camera." European Conference on Computer Vision, 1994.](https://mlanthology.org/eccv/1994/hartley1994eccv-self/) doi:10.1007/3-540-57956-7_52
BibTeX
@inproceedings{hartley1994eccv-self,
title = {{Self-Calibration from Multiple Views with a Rotating Camera}},
author = {Hartley, Richard I.},
booktitle = {European Conference on Computer Vision},
year = {1994},
pages = {471-478},
doi = {10.1007/3-540-57956-7_52},
url = {https://mlanthology.org/eccv/1994/hartley1994eccv-self/}
}