Upright and Stabilized Omnidirectional Depth Estimation for Wide-Baseline Multi-Camera Inertial Systems

Abstract

This paper presents an upright and stabilized omnidirectional depth estimation method for an arbitrarily rotated wide-baseline multi-camera inertial system. By aligning the reference rig coordinate system with the gravity direction acquired from an inertial measurement unit, we sample depth hypotheses for omnidirectional stereo matching by sweeping global spheres whose equators are parallel to the ground plane. Then, unary features extracted from each input image by 2D convolutional neural networks (CNNs) are warped onto the swept spheres, and the final omnidirectional depth map is produced through cost computation by a 3D CNN-based hourglass module and a softargmax operation. This eliminates the wavy or unrecognizable visual artifacts in equirectangular depth maps that can otherwise cause failures in scene understanding. We show the capability of our upright and stabilized omnidirectional depth estimation through experiments on real data.
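Two steps of the pipeline above lend themselves to a compact illustration: (1) computing a rotation that aligns the rig frame with the IMU-measured gravity direction, so that the swept spheres' equators stay parallel to the ground plane, and (2) regressing a depth value from per-hypothesis matching costs with a softargmax. The sketch below is a minimal NumPy approximation under assumed conventions (world "down" axis, cost sign) and is not the authors' implementation; the helper names are hypothetical.

```python
import numpy as np

def gravity_align_rotation(g_rig):
    """Rotation mapping the IMU-measured gravity direction (rig frame)
    onto an assumed world 'down' axis, via the Rodrigues formula.
    Hypothetical helper illustrating the gravity-alignment step."""
    g = np.asarray(g_rig, dtype=float)
    g = g / np.linalg.norm(g)
    down = np.array([0.0, 0.0, -1.0])   # assumed world down axis
    v = np.cross(g, down)               # rotation axis (unnormalized)
    c = np.dot(g, down)                 # cosine of rotation angle
    if np.isclose(c, -1.0):             # gravity exactly opposite: 180-deg turn
        return np.diag([1.0, -1.0, -1.0])
    vx = np.array([[0.0, -v[2], v[1]],
                   [v[2], 0.0, -v[0]],
                   [-v[1], v[0], 0.0]])  # skew-symmetric cross-product matrix
    return np.eye(3) + vx + vx @ vx / (1.0 + c)

def softargmax_depth(cost, depths):
    """Softargmax over per-hypothesis matching costs: a softmax of the
    negated costs weights the swept-sphere depth hypotheses, yielding a
    sub-hypothesis depth estimate."""
    logits = -np.asarray(cost, dtype=float)   # lower cost -> higher weight
    p = np.exp(logits - logits.max())         # numerically stable softmax
    p /= p.sum()
    return float(np.dot(p, depths))
```

For example, applying `gravity_align_rotation` to the measured gravity vector itself yields the down axis, and a cost curve with a clear minimum at one hypothesis regresses a depth near that hypothesis.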

Cite

Text

Won et al. "Upright and Stabilized Omnidirectional Depth Estimation for Wide-Baseline Multi-Camera Inertial Systems." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2020. doi:10.1109/CVPRW50498.2020.00324

Markdown

[Won et al. "Upright and Stabilized Omnidirectional Depth Estimation for Wide-Baseline Multi-Camera Inertial Systems." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2020.](https://mlanthology.org/cvprw/2020/won2020cvprw-upright/) doi:10.1109/CVPRW50498.2020.00324

BibTeX

@inproceedings{won2020cvprw-upright,
  title     = {{Upright and Stabilized Omnidirectional Depth Estimation for Wide-Baseline Multi-Camera Inertial Systems}},
  author    = {Won, Changhee and Seok, Hochang and Lim, Jongwoo},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
  year      = {2020},
  pages     = {2689--2692},
  doi       = {10.1109/CVPRW50498.2020.00324},
  url       = {https://mlanthology.org/cvprw/2020/won2020cvprw-upright/}
}