Synchronization and Calibration of a Camera Network for 3D Event Reconstruction from Live Video

Abstract

We present an approach for automatic reconstruction of a dynamic event using multiple video cameras recording from different viewpoints. Our approach recovers all the necessary information by analyzing the motion of the silhouettes in the multiple video streams. The first step consists of computing the calibration and synchronization for pairs of cameras. We compute the temporal offset and epipolar geometry using an efficient RANSAC-based algorithm that searches for the epipoles while providing robustness to outliers. In the next stage the calibration and synchronization for the complete camera network are recovered and then refined through maximum likelihood estimation. Finally, a visual hull algorithm is used to recover the dynamic shape of the observed object.
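The temporal-offset step can be illustrated with a minimal sketch. The paper samples offset hypotheses with RANSAC and scores them against silhouette-derived epipolar constraints; the toy version below instead enumerates integer offsets and scores each by its inlier count over a hypothetical per-frame scalar feature track, which keeps the same robust-voting idea while staying deterministic. The feature tracks, function name, and thresholds are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def estimate_offset(feat_a, feat_b, max_offset=20, thresh=1e-3):
    """Robust integer frame-offset search between two per-frame feature
    tracks (a hypothetical stand-in for the silhouette measurements used
    in the paper). Each candidate offset is scored by its inlier count,
    so a fraction of corrupted frames does not bias the estimate; the
    paper samples hypotheses with RANSAC rather than enumerating them."""
    best_offset, best_inliers = 0, -1
    n = min(len(feat_a), len(feat_b))
    ks = np.arange(n)
    for offset in range(-max_offset, max_offset + 1):
        # Keep only frame indices that map inside both tracks.
        valid = (ks + offset >= 0) & (ks + offset < n)
        residual = np.abs(feat_a[ks[valid]] - feat_b[ks[valid] + offset])
        inliers = int(np.sum(residual < thresh))
        if inliers > best_inliers:
            best_inliers, best_offset = inliers, offset
    return best_offset

# Synthetic check: camera B's track is camera A's, shifted by 7 frames,
# with 10% of B's frames corrupted to simulate silhouette outliers.
rng = np.random.default_rng(0)
track = np.cumsum(rng.standard_normal(300))   # smooth synthetic feature track
feat_a = track[:250]
feat_b = track[7:257].copy()
feat_b[::10] += 5.0                           # inject outlier frames
print(estimate_offset(feat_a, feat_b))        # -> -7 (A leads B by 7 frames)
```

Because scoring is by inlier count rather than a summed residual, the 10% corrupted frames simply drop out of the vote instead of dragging the estimate off the true offset, which mirrors why a RANSAC-style search suits silhouette data with frequent segmentation failures.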

Cite

Text

Sinha and Pollefeys. "Synchronization and Calibration of a Camera Network for 3D Event Reconstruction from Live Video." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2005. doi:10.1109/CVPR.2005.338

Markdown

[Sinha and Pollefeys. "Synchronization and Calibration of a Camera Network for 3D Event Reconstruction from Live Video." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2005.](https://mlanthology.org/cvpr/2005/sinha2005cvpr-synchronization/) doi:10.1109/CVPR.2005.338

BibTeX

@inproceedings{sinha2005cvpr-synchronization,
  title     = {{Synchronization and Calibration of a Camera Network for 3D Event Reconstruction from Live Video}},
  author    = {Sinha, Sudipta N. and Pollefeys, Marc},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year      = {2005},
  pages     = {1196},
  doi       = {10.1109/CVPR.2005.338},
  url       = {https://mlanthology.org/cvpr/2005/sinha2005cvpr-synchronization/}
}