Tracking from Multiple View Points: Self-Calibration of Space and Time

Abstract

This paper tackles the problem of self-calibrating multiple cameras that are very far apart. Given a set of feature correspondences, one can determine the camera geometry. The key problem we address is finding such correspondences. Since the camera geometry (location and orientation) and photometric characteristics vary considerably between images, one cannot rely on brightness and/or proximity constraints. Instead we propose a three-step approach: first we use moving objects in the scene to determine a rough planar alignment; next we use static features to improve the alignment; finally we compute the epipolar geometry from the homography matrix of the planar alignment. We do not assume synchronized cameras, and we show that enforcing geometric constraints enables us to align the tracking data in time. We present results on challenging outdoor scenes using real-time tracking data.
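The planar alignment at the heart of this pipeline is a 3×3 homography fitted to point correspondences (here, positions of tracked objects on the ground plane). As a minimal sketch of that step, the standard Direct Linear Transform (DLT) below estimates such a homography with NumPy; it illustrates the general technique, not the paper's own implementation, and the function name is ours.

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate a 3x3 planar homography H such that dst ~ H @ src
    (in homogeneous coordinates, up to scale).

    src, dst: (N, 2) arrays of corresponding points, N >= 4.
    Standard DLT: each correspondence contributes two linear
    constraints on the 9 entries of H; the least-squares solution
    is the right singular vector with the smallest singular value.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # u = (h11*x + h12*y + h13) / (h31*x + h32*y + h33), likewise v
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(A))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the arbitrary scale
```

With the plane-induced homography in hand, the remaining degrees of freedom of the epipolar geometry reduce to locating the epipole, which is why aligning on a dominant plane first makes the wide-baseline problem tractable.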

Cite

Text

Stein. "Tracking from Multiple View Points: Self-Calibration of Space and Time." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 1999. doi:10.1109/CVPR.1999.786987

Markdown

[Stein. "Tracking from Multiple View Points: Self-Calibration of Space and Time." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 1999.](https://mlanthology.org/cvpr/1999/stein1999cvpr-tracking/) doi:10.1109/CVPR.1999.786987

BibTeX

@inproceedings{stein1999cvpr-tracking,
  title     = {{Tracking from Multiple View Points: Self-Calibration of Space and Time}},
  author    = {Stein, Gideon P.},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year      = {1999},
  pages     = {1521--1527},
  doi       = {10.1109/CVPR.1999.786987},
  url       = {https://mlanthology.org/cvpr/1999/stein1999cvpr-tracking/}
}