Unsupervised Metric Relocalization Using Transform Consistency Loss

Abstract

Training networks to perform metric relocalization traditionally requires accurate image correspondences. In practice, these are obtained by restricting domain coverage, employing additional sensors, or capturing large multi-view datasets. We instead propose a self-supervised solution, which exploits a key insight: localizing a query image within a map should yield the same absolute pose, regardless of the reference image used for registration. Guided by this intuition, we derive a novel transform consistency loss. Using this loss function, we train a deep neural network to infer dense feature and saliency maps to perform robust metric relocalization in dynamic environments. We evaluate our framework on synthetic and real-world data, showing that our approach outperforms supervised methods when only a limited amount of ground-truth information is available.
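The consistency idea in the abstract can be made concrete: if a query image is registered against two different reference images whose absolute poses are known, the two implied absolute query poses should coincide, and any disagreement between them is a training signal. The sketch below is a minimal numerical illustration of that idea using 4x4 homogeneous transforms; the function names and the specific translation-plus-rotation error metric are our own assumptions for illustration, not the paper's actual loss, which is defined over dense feature and saliency maps.

```python
import numpy as np

def pose_from_reference(T_world_ref, T_ref_query):
    """Absolute query pose implied by registering against one reference.

    T_world_ref: known absolute pose of the reference image (4x4).
    T_ref_query: estimated relative pose of the query w.r.t. that reference (4x4).
    """
    return T_world_ref @ T_ref_query

def transform_consistency_loss(T_world_ref_a, T_a_query, T_world_ref_b, T_b_query):
    """Penalize disagreement between the absolute poses implied by two references.

    This is an illustrative scalar metric (translation norm plus rotation angle
    of the relative error transform); it is zero iff both references imply the
    same absolute query pose.
    """
    T_a = pose_from_reference(T_world_ref_a, T_a_query)
    T_b = pose_from_reference(T_world_ref_b, T_b_query)
    # Relative error transform; identity when the two poses agree.
    E = np.linalg.inv(T_a) @ T_b
    t_err = np.linalg.norm(E[:3, 3])
    # Rotation angle recovered from the trace of the rotation block.
    cos_theta = np.clip((np.trace(E[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    r_err = np.arccos(cos_theta)
    return t_err + r_err
```

Note that the loss requires no ground-truth pose for the query itself, only self-consistency across reference frames, which is what makes the training signal self-supervised.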

Cite

Text

Kasper et al. "Unsupervised Metric Relocalization Using Transform Consistency Loss." Conference on Robot Learning, 2020.

Markdown

[Kasper et al. "Unsupervised Metric Relocalization Using Transform Consistency Loss." Conference on Robot Learning, 2020.](https://mlanthology.org/corl/2020/kasper2020corl-unsupervised/)

BibTeX

@inproceedings{kasper2020corl-unsupervised,
  title     = {{Unsupervised Metric Relocalization Using Transform Consistency Loss}},
  author    = {Kasper, Mike and Nobre, Fernando and Heckman, Christoffer and Keivan, Nima},
  booktitle = {Conference on Robot Learning},
  year      = {2020},
  pages     = {1736--1745},
  volume    = {155},
  url       = {https://mlanthology.org/corl/2020/kasper2020corl-unsupervised/}
}