Activating Self-Attention for Multi-Scene Absolute Pose Regression
Abstract
Multi-scene absolute pose regression addresses the demand for fast and memory-efficient camera pose estimation across various real-world environments. Recently, transformer-based models have been devised to regress the camera pose directly across multiple scenes. Despite their potential, the transformer encoders are underutilized due to collapsed self-attention maps, which have low representation capacity. This work highlights the problem and investigates it from a new perspective: distortion of the query-key embedding space. Based on statistical analysis, we reveal that queries and keys are mapped into completely different spaces, while only a few keys are blended into the query region. This leads to the collapse of the self-attention map, as all queries are considered similar to those few keys. We therefore propose simple but effective solutions to activate self-attention. Concretely, we present an auxiliary loss that aligns queries and keys, preventing distortion of the query-key space and encouraging the model to find global relations via self-attention. In addition, fixed sinusoidal positional encoding is adopted instead of an undertrained learnable one to inject appropriate positional cues into the inputs of self-attention. As a result, our approach effectively resolves the aforementioned problem and outperforms existing methods in both outdoor and indoor scenes.
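The abstract names two remedies: an auxiliary loss aligning queries and keys, and fixed sinusoidal positional encoding in place of a learnable one. A minimal sketch of both is given below; note that the abstract does not specify the loss formulation, so the alignment term here (matching the first two moments of the query and key embeddings) is a hypothetical illustration, and the names `sinusoidal_encoding` and `alignment_loss` are chosen for this sketch, not taken from the paper.

```python
import numpy as np

def sinusoidal_encoding(seq_len, d_model):
    """Fixed sinusoidal positional encoding (as in the original
    Transformer), used instead of an undertrained learnable embedding."""
    pos = np.arange(seq_len)[:, None]            # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]         # (1, d_model/2)
    angles = pos / np.power(10000.0, 2 * i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                 # even dims: sine
    pe[:, 1::2] = np.cos(angles)                 # odd dims: cosine
    return pe

def alignment_loss(q, k):
    """Hypothetical auxiliary loss pulling query and key embeddings
    toward a shared region by matching their per-dimension mean and
    variance, so no small subset of keys dominates the attention map."""
    mean_gap = np.sum((q.mean(axis=0) - k.mean(axis=0)) ** 2)
    var_gap = np.sum((q.var(axis=0) - k.var(axis=0)) ** 2)
    return mean_gap + var_gap
```

The intuition, per the abstract: once queries and keys occupy the same embedding region, softmax attention can distribute over many keys rather than collapsing onto the few keys that happen to lie near the queries.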
Cite
Text
Lee et al. "Activating Self-Attention for Multi-Scene Absolute Pose Regression." Neural Information Processing Systems, 2024. doi:10.52202/079017-1216

Markdown

[Lee et al. "Activating Self-Attention for Multi-Scene Absolute Pose Regression." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/lee2024neurips-activating/) doi:10.52202/079017-1216

BibTeX
@inproceedings{lee2024neurips-activating,
title = {{Activating Self-Attention for Multi-Scene Absolute Pose Regression}},
author = {Lee, Miso and Kim, Jihwan and Heo, Jae-Pil},
booktitle = {Neural Information Processing Systems},
year = {2024},
doi = {10.52202/079017-1216},
url = {https://mlanthology.org/neurips/2024/lee2024neurips-activating/}
}