Deep Adaptive Fusion Network for High Performance RGBT Tracking

Abstract

Because RGB and thermal data are complementary, RGBT tracking has received increasing attention in recent years: it can mitigate the degradation of tracking performance in dark environments and adverse weather. Effectively fusing information from the RGB and thermal modalities is the key to exploiting their complementarity for robust RGBT tracking. In this paper, we propose a high-performance RGBT tracking framework based on a novel deep adaptive fusion network, named DAFNet. DAFNet consists of a recursive fusion chain that adaptively integrates features from all layers in an end-to-end manner. Because its operations are simple yet effective, our tracker runs at near-real-time speed. Compared with state-of-the-art trackers on two public datasets, DAFNet achieves outstanding performance and sets a new state of the art in RGBT tracking.
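The abstract describes a recursive fusion chain that adaptively combines RGB and thermal features across layers. The paper's exact architecture is not detailed here, so the following is only a minimal numpy sketch of the general idea: each layer's two modality features are mixed with softmax-normalized adaptive weights, and the chain recursively folds the previous fused output into the current layer. All function names and the `logits` parameterization are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a small weight vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def adaptive_fuse(feat_a, feat_b, logits):
    # Mix two feature maps with adaptive (softmax-normalized) weights.
    # `logits` would be learned in the real network; here it is an input.
    w = softmax(logits)
    return w[0] * feat_a + w[1] * feat_b

def recursive_fusion_chain(rgb_feats, t_feats, layer_logits, chain_logits):
    # Fuse the two modalities at the first layer, then recursively
    # merge each deeper layer's fused features into the running result.
    fused = adaptive_fuse(rgb_feats[0], t_feats[0], layer_logits[0])
    for i in range(1, len(rgb_feats)):
        layer_fused = adaptive_fuse(rgb_feats[i], t_feats[i], layer_logits[i])
        fused = adaptive_fuse(fused, layer_fused, chain_logits[i - 1])
    return fused
```

In this sketch, equal logits reduce the fusion to a plain average; a learned network would instead predict per-layer weights that favor the more reliable modality (e.g., thermal at night, RGB in daylight).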

Cite

Text

Gao et al. "Deep Adaptive Fusion Network for High Performance RGBT Tracking." IEEE/CVF International Conference on Computer Vision Workshops, 2019. doi:10.1109/ICCVW.2019.00017

Markdown

[Gao et al. "Deep Adaptive Fusion Network for High Performance RGBT Tracking." IEEE/CVF International Conference on Computer Vision Workshops, 2019.](https://mlanthology.org/iccvw/2019/gao2019iccvw-deep/) doi:10.1109/ICCVW.2019.00017

BibTeX

@inproceedings{gao2019iccvw-deep,
  title     = {{Deep Adaptive Fusion Network for High Performance RGBT Tracking}},
  author    = {Gao, Yuan and Li, Chenglong and Zhu, Yabin and Tang, Jin and He, Tao and Wang, Futian},
  booktitle = {IEEE/CVF International Conference on Computer Vision Workshops},
  year      = {2019},
  pages     = {91--99},
  doi       = {10.1109/ICCVW.2019.00017},
  url       = {https://mlanthology.org/iccvw/2019/gao2019iccvw-deep/}
}