Recurrently Target-Attending Tracking

Abstract

Robust visual tracking is a challenging task in computer vision. Due to the accumulation and propagation of estimation errors, model drift often occurs and degrades tracking performance. To mitigate this problem, in this paper we propose a novel tracking method called Recurrently Target-Attending Tracking (RTT). RTT attempts to identify and exploit reliable parts that are beneficial for the overall tracking process. To bypass occlusion and discover reliable components, multi-directional Recurrent Neural Networks (RNNs) are employed in RTT to capture long-range contextual cues by traversing a candidate spatial region from multiple directions. The confidence maps produced by the RNNs are used to adaptively regularize the learning of discriminative correlation filters, suppressing cluttered background noise while making full use of the information from reliable parts. To solve the weighted correlation filters, we derive an efficient closed-form solution with a sharp reduction in computational complexity. Extensive experiments demonstrate that the proposed RTT compares favorably against correlation filter based methods.
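
To give a concrete picture of the confidence-weighted filtering idea described above, the sketch below is a minimal, hypothetical illustration rather than the paper's actual formulation: it trains a standard single-channel discriminative correlation filter per part via the usual Fourier-domain ridge-regression closed form, then fuses the part responses with reliability weights standing in for the RNN-derived confidence maps. All function names, parameters, and toy data here are assumptions made for illustration.

import numpy as np

def train_dcf(patch, target, lam=1e-2):
    # Standard DCF closed form in the Fourier domain:
    # H = (conj(X) * Y) / (conj(X) * X + lam), solved per frequency bin.
    X = np.fft.fft2(patch)
    Y = np.fft.fft2(target)
    return (np.conj(X) * Y) / (np.conj(X) * X + lam)

def detect(H, patch):
    # Correlate the learned filter with a new patch; the response peak
    # indicates the estimated target location.
    X = np.fft.fft2(patch)
    return np.real(np.fft.ifft2(H * X))

def fuse_part_responses(parts, filters, confidences):
    # Weight each part's response by its reliability score so that
    # occluded or unreliable parts contribute less to the final map
    # (a stand-in for the RNN confidence maps; not the exact RTT scheme).
    responses = [c * detect(H, p) for p, H, c in zip(parts, filters, confidences)]
    return np.sum(responses, axis=0) / (np.sum(confidences) + 1e-8)

# Toy usage: four parts of a 32x32 candidate region with made-up confidences.
rng = np.random.default_rng(0)
parts = [rng.standard_normal((32, 32)) for _ in range(4)]
target = np.zeros((32, 32)); target[16, 16] = 1.0   # desired response peak
filters = [train_dcf(p, target) for p in parts]
confidences = np.array([0.9, 0.8, 0.2, 0.7])        # low weight = likely occluded
response = fuse_part_responses(parts, filters, confidences)
peak = np.unravel_index(np.argmax(response), response.shape)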

Cite

Text

Cui et al. "Recurrently Target-Attending Tracking." Conference on Computer Vision and Pattern Recognition, 2016. doi:10.1109/CVPR.2016.161

Markdown

[Cui et al. "Recurrently Target-Attending Tracking." Conference on Computer Vision and Pattern Recognition, 2016.](https://mlanthology.org/cvpr/2016/cui2016cvpr-recurrently/) doi:10.1109/CVPR.2016.161

BibTeX

@inproceedings{cui2016cvpr-recurrently,
  title     = {{Recurrently Target-Attending Tracking}},
  author    = {Cui, Zhen and Xiao, Shengtao and Feng, Jiashi and Yan, Shuicheng},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2016},
  doi       = {10.1109/CVPR.2016.161},
  url       = {https://mlanthology.org/cvpr/2016/cui2016cvpr-recurrently/}
}