Efficient Joint Stereo Estimation and Land Usage Classification for Multiview Satellite Data

Abstract

We propose an efficient algorithm to jointly estimate geometry and semantics for a geographical region observed by multiple satellite images. Our joint estimation leverages an efficient PatchMatch inference framework defined over a lattice discretization of the environment. Our cost function relies on a local planarity assumption to model scene geometry and on neural network classification to determine semantic (e.g., land use) labels for geometric structures. By utilizing the commonly available direct (i.e., space-to-image) rational polynomial coefficient (RPC) satellite camera models, our approach effectively circumvents the need to estimate or refine inverse RPC models. Experiments illustrate both the computational efficiency and the high-quality scene geometry estimates attained by our approach on satellite imagery. To further illustrate the generality of our representation and inference framework, we also include experiments on standard benchmarks for ground-level imagery.
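The direct RPC camera model mentioned above maps ground coordinates to image coordinates as ratios of cubic polynomials in normalized longitude, latitude, and height. As a rough illustration of this mapping (not the paper's implementation), here is a minimal sketch; the dictionary field names are hypothetical, and the 20-term ordering assumes the common RPC00B convention:

```python
def rpc_polynomial(L, P, H, c):
    """Evaluate a 20-term cubic RPC polynomial.

    L, P, H are normalized longitude, latitude, height; `c` is a list of
    20 coefficients, assumed here to follow the RPC00B term ordering.
    """
    terms = [
        1.0, L, P, H,
        L * P, L * H, P * H,
        L * L, P * P, H * H,
        P * L * H,
        L ** 3, L * P * P, L * H * H,
        L * L * P, P ** 3, P * H * H,
        L * L * H, P * P * H, H ** 3,
    ]
    return sum(ci * ti for ci, ti in zip(c, terms))


def rpc_project(lon, lat, h, rpc):
    """Direct (space-to-image) RPC projection.

    `rpc` is a dict holding normalization offsets/scales and the four
    20-coefficient polynomials (field names are illustrative only).
    Returns (sample, line) pixel coordinates.
    """
    # Normalize ground coordinates to roughly [-1, 1].
    L = (lon - rpc["lon_off"]) / rpc["lon_scale"]
    P = (lat - rpc["lat_off"]) / rpc["lat_scale"]
    H = (h - rpc["h_off"]) / rpc["h_scale"]
    # Image coordinates are ratios of cubic polynomials.
    samp_n = rpc_polynomial(L, P, H, rpc["samp_num"]) / rpc_polynomial(L, P, H, rpc["samp_den"])
    line_n = rpc_polynomial(L, P, H, rpc["line_num"]) / rpc_polynomial(L, P, H, rpc["line_den"])
    # Denormalize to pixel coordinates.
    return (samp_n * rpc["samp_scale"] + rpc["samp_off"],
            line_n * rpc["line_scale"] + rpc["line_off"])
```

Because this forward model is evaluated directly, a hypothesis for a 3D point (e.g., a PatchMatch plane sample) can be scored against each image without ever inverting the RPC, which is the practical advantage the abstract refers to.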

Cite

Text

Wang et al. "Efficient Joint Stereo Estimation and Land Usage Classification for Multiview Satellite Data." IEEE/CVF Winter Conference on Applications of Computer Vision, 2016. doi:10.1109/WACV.2016.7477657

Markdown

[Wang et al. "Efficient Joint Stereo Estimation and Land Usage Classification for Multiview Satellite Data." IEEE/CVF Winter Conference on Applications of Computer Vision, 2016.](https://mlanthology.org/wacv/2016/wang2016wacv-efficient/) doi:10.1109/WACV.2016.7477657

BibTeX

@inproceedings{wang2016wacv-efficient,
  title     = {{Efficient Joint Stereo Estimation and Land Usage Classification for Multiview Satellite Data}},
  author    = {Wang, Ke and Stutts, Craig and Dunn, Enrique and Frahm, Jan-Michael},
  booktitle = {IEEE/CVF Winter Conference on Applications of Computer Vision},
  year      = {2016},
  pages     = {1--9},
  doi       = {10.1109/WACV.2016.7477657},
  url       = {https://mlanthology.org/wacv/2016/wang2016wacv-efficient/}
}