Guide Local Feature Matching by Overlap Estimation
Abstract
Local image feature matching under large appearance, viewpoint, and distance changes is challenging yet important. Conventional methods detect and match tentative local features across entire images, relying on heuristic consistency checks to guarantee reliable matches. In this paper, we introduce a novel Overlap Estimation method conditioned on image pairs with TRansformer, named OETR, to constrain local feature matching to the commonly visible region. OETR performs overlap estimation in a two-step process: feature correlation followed by overlap regression. As a preprocessing module, OETR can be plugged into any existing local feature detection and matching pipeline to mitigate potential viewpoint or scale variance. Extensive experiments show that OETR can substantially boost state-of-the-art local feature matching performance, especially for image pairs with small shared regions. The code will be publicly available at https://github.com/AbyssGaze/OETR.
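The abstract describes OETR as a drop-in preprocessing step: estimate one box per image bounding the co-visible region, then run any detector/matcher inside those boxes and map the matches back. Below is a minimal Python sketch of that wiring; estimate_overlap, crop, and the matcher callable are hypothetical placeholders for illustration, not the actual OETR API.

import numpy as np

def estimate_overlap(img_a, img_b):
    """Placeholder for the learned overlap estimator (hypothetical API).
    A real model would correlate deep features of the image pair and regress
    one axis-aligned box per image around their shared region; here we simply
    return the full image extents so the sketch runs end to end."""
    h_a, w_a = img_a.shape[:2]
    h_b, w_b = img_b.shape[:2]
    return (0, 0, w_a, h_a), (0, 0, w_b, h_b)

def crop(img, box):
    x0, y0, x1, y1 = box
    return img[y0:y1, x0:x1]

def match_with_overlap_guidance(img_a, img_b, matcher):
    # 1. Constrain matching to the estimated co-visible region.
    box_a, box_b = estimate_overlap(img_a, img_b)
    crop_a, crop_b = crop(img_a, box_a), crop(img_b, box_b)
    # 2. Run any off-the-shelf detector/matcher on the crops; `matcher`
    #    is assumed to return two (N, 2) arrays of matched (x, y) keypoints.
    kpts_a, kpts_b = matcher(crop_a, crop_b)
    # 3. Shift keypoints back into original image coordinates.
    return kpts_a + np.array(box_a[:2]), kpts_b + np.array(box_b[:2])

Cropping to the shared region roughly equalizes scale between the two views and discards distracting context, which is why this kind of preprocessing helps most on pairs with small overlap.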
Cite
Text
Chen et al. "Guide Local Feature Matching by Overlap Estimation." AAAI Conference on Artificial Intelligence, 2022. doi:10.1609/AAAI.V36I1.19913
Markdown
[Chen et al. "Guide Local Feature Matching by Overlap Estimation." AAAI Conference on Artificial Intelligence, 2022.](https://mlanthology.org/aaai/2022/chen2022aaai-guide/) doi:10.1609/AAAI.V36I1.19913
BibTeX
@inproceedings{chen2022aaai-guide,
title = {{Guide Local Feature Matching by Overlap Estimation}},
author = {Chen, Ying and Huang, Dihe and Xu, Shang and Liu, Jianlin and Liu, Yong},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2022},
pages = {365--373},
doi = {10.1609/AAAI.V36I1.19913},
url = {https://mlanthology.org/aaai/2022/chen2022aaai-guide/}
}