Transformer Tracking
Abstract
Correlation plays a critical role in the tracking field, especially in recent popular Siamese-based trackers. The correlation operation is a simple fusion method that considers the similarity between the template and the search region. However, correlation itself is a local linear matching process, which tends to lose semantic information and fall into local optima; this may be the bottleneck in designing high-accuracy tracking algorithms. Is there any better feature fusion method than correlation? To address this issue, inspired by the Transformer, this work presents a novel attention-based feature fusion network that effectively combines the template and search-region features using attention alone. Specifically, the proposed method includes an ego-context augment module based on self-attention and a cross-feature augment module based on cross-attention. Finally, we present a Transformer tracking method (named TransT) based on a Siamese-like feature extraction backbone, the designed attention-based fusion mechanism, and a classification and regression head. Experiments show that TransT achieves very promising results on six challenging datasets, especially on the large-scale LaSOT, TrackingNet, and GOT-10k benchmarks. Our tracker runs at approximately 50 fps on a GPU. Code and models are available at https://github.com/chenxin-dlut/TransT.
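To make the fusion idea in the abstract concrete, below is a minimal, illustrative PyTorch sketch (not the authors' released implementation) of an ego-context augment (ECA) step based on self-attention followed by a cross-feature augment (CFA) step based on cross-attention over template and search-region feature tokens. The class name AttentionFusion, the feature dimension, the number of heads, and the token shapes are assumptions chosen for illustration.

# Minimal sketch of attention-based template/search fusion (illustrative only).
import torch
import torch.nn as nn

class AttentionFusion(nn.Module):
    def __init__(self, dim=256, heads=8):
        super().__init__()
        # Self-attention used for the ego-context augment (ECA) step.
        self.eca = nn.MultiheadAttention(dim, heads)
        # Cross-attention used for the cross-feature augment (CFA) step.
        self.cfa = nn.MultiheadAttention(dim, heads)

    def forward(self, template, search):
        # template: (Lt, B, dim) tokens; search: (Ls, B, dim) tokens.
        # ECA: each feature set attends to itself (residual connection).
        template = template + self.eca(template, template, template)[0]
        search = search + self.eca(search, search, search)[0]
        # CFA: search-region tokens query the template tokens.
        fused = search + self.cfa(search, template, template)[0]
        return fused  # fused search-region features for the prediction heads

# Usage example with dummy token sequences.
fusion = AttentionFusion(dim=256, heads=8)
z = torch.randn(64, 2, 256)    # template tokens
x = torch.randn(256, 2, 256)   # search-region tokens
print(fusion(z, x).shape)      # torch.Size([256, 2, 256])

The fused search-region features would then feed the classification and regression head; the actual TransT fusion stacks several such attention layers and includes normalization and feed-forward sub-layers omitted here.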
Cite
Text
Chen et al. "Transformer Tracking." Conference on Computer Vision and Pattern Recognition, 2021. doi:10.1109/CVPR46437.2021.00803
Markdown
[Chen et al. "Transformer Tracking." Conference on Computer Vision and Pattern Recognition, 2021.](https://mlanthology.org/cvpr/2021/chen2021cvpr-transformer/) doi:10.1109/CVPR46437.2021.00803
BibTeX
@inproceedings{chen2021cvpr-transformer,
  title     = {{Transformer Tracking}},
  author    = {Chen, Xin and Yan, Bin and Zhu, Jiawen and Wang, Dong and Yang, Xiaoyun and Lu, Huchuan},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2021},
  pages     = {8126--8135},
  doi       = {10.1109/CVPR46437.2021.00803},
  url       = {https://mlanthology.org/cvpr/2021/chen2021cvpr-transformer/}
}