Improving Multi-Frame Data Association with Sparse Representations for Robust Near-Online Multi-Object Tracking

Abstract

Multiple Object Tracking remains a difficult problem due to appearance variations, target occlusions, and detection failures. Using sophisticated appearance models or performing data association over multiple frames are two common approaches that lead to performance gains. Inspired by the success of sparse representations in Single Object Tracking, we propose to formulate the multi-frame data association step as an energy minimization problem, designing an energy that efficiently exploits sparse representations of all detections. Furthermore, we propose to use a structured sparsity-inducing norm to compute representations better suited to the tracking context. We perform extensive experiments to demonstrate the effectiveness of the proposed formulation, and evaluate our approach on two authoritative public benchmarks in order to compare it with several state-of-the-art methods.
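To give a concrete sense of the sparse representations the abstract refers to, the sketch below computes an ℓ1-sparse code of a detection's feature vector over a dictionary whose columns are candidate target templates, using ISTA (iterative soft-thresholding). This is only a minimal illustration of plain ℓ1 sparse coding; the paper's actual energy and its structured sparsity-inducing norm are not reproduced here, and all names and parameters are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Element-wise soft-thresholding: the proximal operator of the l1 norm.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_code(y, D, lam=0.1, n_iter=200):
    # Solve min_x 0.5 * ||y - D x||^2 + lam * ||x||_1 with ISTA.
    # y: detection feature vector; D: dictionary whose columns are
    # candidate target templates (illustrative setup, not the paper's energy).
    L = np.linalg.norm(D, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ x - y)         # gradient of the quadratic term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Toy usage: a detection matching template 2 should load mostly on atom 2.
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 8))
D /= np.linalg.norm(D, axis=0)           # unit-norm dictionary atoms
y = D[:, 2].copy()                       # detection equal to template 2
x = sparse_code(y, D, lam=0.05)
```

In a tracking context, a large coefficient on one target's templates (and near-zero coefficients elsewhere) is the kind of evidence a data-association energy can exploit.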

Cite

Text

Fagot-Bouquet et al. "Improving Multi-Frame Data Association with Sparse Representations for Robust Near-Online Multi-Object Tracking." European Conference on Computer Vision, 2016. doi:10.1007/978-3-319-46484-8_47

Markdown

[Fagot-Bouquet et al. "Improving Multi-Frame Data Association with Sparse Representations for Robust Near-Online Multi-Object Tracking." European Conference on Computer Vision, 2016.](https://mlanthology.org/eccv/2016/fagotbouquet2016eccv-improving/) doi:10.1007/978-3-319-46484-8_47

BibTeX

@inproceedings{fagotbouquet2016eccv-improving,
  title     = {{Improving Multi-Frame Data Association with Sparse Representations for Robust Near-Online Multi-Object Tracking}},
  author    = {Fagot-Bouquet, Loïc and Audigier, Romaric and Dhome, Yoann and Lerasle, Frédéric},
  booktitle = {European Conference on Computer Vision},
  year      = {2016},
  pages     = {774--790},
  doi       = {10.1007/978-3-319-46484-8_47},
  url       = {https://mlanthology.org/eccv/2016/fagotbouquet2016eccv-improving/}
}