ParaFormer: Parallel Attention Transformer for Efficient Feature Matching

Abstract

Heavy computation is a bottleneck that prevents deep-learning-based feature matching algorithms from being deployed in many real-time applications. However, existing lightweight networks optimized for Euclidean data cannot address classical feature matching tasks, since sparse keypoint-based descriptors must be matched. This paper tackles this problem and proposes two concepts: 1) a novel parallel attention model named ParaFormer and 2) a graph-based U-Net architecture with attentional pooling. First, ParaFormer fuses features and keypoint positions through the concept of amplitude and phase, and integrates self- and cross-attention in a parallel manner, achieving a win-win in terms of accuracy and efficiency. Second, with the U-Net architecture and the proposed attentional pooling, the ParaFormer-U variant significantly reduces computational complexity and minimizes the performance loss caused by downsampling. Extensive experiments on various applications, including homography estimation, pose estimation, and image matching, demonstrate that ParaFormer achieves state-of-the-art performance while maintaining high efficiency. The efficient ParaFormer-U variant achieves comparable performance with less than 50% of the FLOPs of existing attention-based models.
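The parallel self-/cross-attention idea from the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's actual formulation: the additive fusion, descriptor dimensions, and function names are assumptions made for clarity. Self-attention attends within one image's descriptors, cross-attention attends from one image's descriptors to the other's, and the two branches run on the same input in parallel rather than sequentially.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # Standard scaled dot-product attention.
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1) @ v

def parallel_attention(desc_a, desc_b):
    # Both branches read the same input descriptors of image A:
    # self-attention within A, cross-attention from A to B.
    self_out = attention(desc_a, desc_a, desc_a)
    cross_out = attention(desc_a, desc_b, desc_b)
    # Simple additive fusion of the two branches (an assumption here;
    # the paper fuses them in its own learned manner).
    return self_out + cross_out

rng = np.random.default_rng(0)
a = rng.standard_normal((5, 32))  # 5 keypoint descriptors from image A
b = rng.standard_normal((7, 32))  # 7 keypoint descriptors from image B
out = parallel_attention(a, b)    # updated descriptors for image A
```

Because the two branches share the same input, they can be computed concurrently, which is the source of the efficiency gain over sequential self-then-cross attention stacks.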

Cite

Text

Lu et al. "ParaFormer: Parallel Attention Transformer for Efficient Feature Matching." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/AAAI.V37I2.25275

Markdown

[Lu et al. "ParaFormer: Parallel Attention Transformer for Efficient Feature Matching." AAAI Conference on Artificial Intelligence, 2023.](https://mlanthology.org/aaai/2023/lu2023aaai-paraformer/) doi:10.1609/AAAI.V37I2.25275

BibTeX

@inproceedings{lu2023aaai-paraformer,
  title     = {{ParaFormer: Parallel Attention Transformer for Efficient Feature Matching}},
  author    = {Lu, Xiaoyong and Yan, Yaping and Kang, Bin and Du, Songlin},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2023},
  pages     = {1853--1860},
  doi       = {10.1609/AAAI.V37I2.25275},
  url       = {https://mlanthology.org/aaai/2023/lu2023aaai-paraformer/}
}