Staple: Complementary Learners for Real-Time Tracking

Abstract

Correlation Filter-based trackers have recently achieved excellent performance, showing great robustness to challenging situations exhibiting motion blur and illumination changes. However, since the model that they learn depends strongly on the spatial layout of the tracked object, they are notoriously sensitive to deformation. Models based on colour statistics have complementary traits: they cope well with variation in shape, but suffer when illumination is not consistent throughout a sequence. Moreover, colour distributions alone can be insufficiently discriminative. In this paper, we show that a simple tracker combining complementary cues in a ridge regression framework can operate faster than 80 FPS and outperform not only all entries in the popular VOT14 competition, but also recent and far more sophisticated trackers according to multiple benchmarks.
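The fusion of complementary cues described above can be loosely illustrated as a linear combination of two dense score maps, one from a correlation filter and one from per-pixel colour statistics. This is a minimal sketch, not the paper's implementation; the function name `fuse_responses` and the merge weight `alpha` are illustrative assumptions.

```python
import numpy as np

def fuse_responses(cf_response, colour_response, alpha=0.3):
    """Merge two same-shape score maps (higher = more target-like)
    by a fixed linear combination; alpha weights the colour cue.
    Illustrative only -- the actual merge scheme is defined in the paper."""
    assert cf_response.shape == colour_response.shape
    return (1 - alpha) * cf_response + alpha * colour_response

# Toy example: two 5x5 score maps whose peaks disagree by one pixel.
cf = np.zeros((5, 5)); cf[2, 2] = 1.0       # correlation-filter peak
col = np.zeros((5, 5)); col[2, 3] = 1.0     # colour-model peak
merged = fuse_responses(cf, col, alpha=0.3)
peak = np.unravel_index(np.argmax(merged), merged.shape)  # peak -> (2, 2)
```

With `alpha=0.3` the template cue dominates, so the fused peak follows the correlation filter; a larger `alpha` would shift influence toward the colour model when the object deforms.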

Cite

Text

Bertinetto et al. "Staple: Complementary Learners for Real-Time Tracking." Conference on Computer Vision and Pattern Recognition, 2016. doi:10.1109/CVPR.2016.156

Markdown

[Bertinetto et al. "Staple: Complementary Learners for Real-Time Tracking." Conference on Computer Vision and Pattern Recognition, 2016.](https://mlanthology.org/cvpr/2016/bertinetto2016cvpr-staple/) doi:10.1109/CVPR.2016.156

BibTeX

@inproceedings{bertinetto2016cvpr-staple,
  title     = {{Staple: Complementary Learners for Real-Time Tracking}},
  author    = {Bertinetto, Luca and Valmadre, Jack and Golodetz, Stuart and Miksik, Ondrej and Torr, Philip H. S.},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2016},
  doi       = {10.1109/CVPR.2016.156},
  url       = {https://mlanthology.org/cvpr/2016/bertinetto2016cvpr-staple/}
}