Real-Time Tracking with Stabilized Frame

Abstract

Deep learning methods have dramatically increased tracking accuracy, benefiting from powerful feature extractors. Among these methods, Siamese-based trackers perform well. However, when the camera shakes, objects are easily lost because camera judder is not taken into account and the position of each pixel changes drastically between frames. In particular, tracking performance degrades dramatically when the target is small and moving fast, as in UAV tracking. In this paper, the S-Siam framework is proposed to address this problem and improve real-time tracking performance. By stabilizing each frame through an estimate of where the object is going to move, the camera view is adjusted adaptively to keep the object in its original position. Experimental results on the VOT2018 dataset show that the proposed method obtains an EAO score of 0.449 and achieves a 10% robustness improvement over three existing trackers, i.e., SiamFC, SiamMask and SiamRPN++, which demonstrates the effectiveness of the proposed algorithm.
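
The sketch below is not the authors' S-Siam implementation; it only illustrates, under assumed details, the general idea of compensating global camera motion before handing a frame to a Siamese tracker, so the target stays near its previous position. The use of phase correlation for shift estimation and the function name `stabilize_frame` are assumptions made for this example.

```python
# Hedged sketch: frame stabilization before Siamese tracking.
# Assumes a purely translational camera shake model; the real method may
# use a different motion estimate and adjustment strategy.
import cv2
import numpy as np

def stabilize_frame(prev_gray, curr_frame):
    """Warp curr_frame to compensate the global shift relative to prev_gray.

    Returns the stabilized frame, the grayscale of the current frame
    (to be reused as prev_gray for the next call), and the estimated shift.
    """
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    # Estimate the dominant translational shift of the current frame
    # relative to the previous one via phase correlation.
    (dx, dy), _response = cv2.phaseCorrelate(prev_gray, curr_gray)
    # Shift the current frame back by the estimated camera motion so the
    # target remains near its position in the previous frame.
    M = np.float32([[1, 0, -dx], [0, 1, -dy]])
    h, w = curr_frame.shape[:2]
    stabilized = cv2.warpAffine(curr_frame, M, (w, h))
    return stabilized, curr_gray, (dx, dy)

# Usage sketch: stabilize each incoming frame, run a Siamese tracker
# (e.g., SiamRPN++) on the stabilized frame, then add (dx, dy) back to the
# predicted box to recover coordinates in the original, unstabilized frame.
```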

Cite

Text

Wang et al. "Real-Time Tracking with Stabilized Frame." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2020. doi:10.1109/CVPRW50498.2020.00522

Markdown

[Wang et al. "Real-Time Tracking with Stabilized Frame." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2020.](https://mlanthology.org/cvprw/2020/wang2020cvprw-realtime/) doi:10.1109/CVPRW50498.2020.00522

BibTeX

@inproceedings{wang2020cvprw-realtime,
  title     = {{Real-Time Tracking with Stabilized Frame}},
  author    = {Wang, Zixuan and Zhao, Zhicheng and Su, Fei},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
  year      = {2020},
  pages     = {4431-4438},
  doi       = {10.1109/CVPRW50498.2020.00522},
  url       = {https://mlanthology.org/cvprw/2020/wang2020cvprw-realtime/}
}