Integrating Appearance and Spatial-Temporal Information for Multi-Camera People Tracking
Abstract
Multi-Camera People Tracking (MCPT) is a crucial task in intelligent surveillance systems. However, it presents significant challenges due to issues such as heavy occlusion and variations in appearance that arise from multiple camera perspectives and congested scenarios. In this paper, we propose an effective system that integrates both appearance and spatial-temporal information to address these problems, consisting of three specially designed modules: (1) A Multi-Object Tracking (MOT) method that minimizes ID-switch errors and generates accurate trajectory appearance features for MCPT. (2) A robust intra-camera association method that leverages both appearance and spatial-temporal information. (3) An effective post-processing module comprising multi-step processing. Our proposed system is evaluated on the test set of Track1 for the 2023 AI CITY CHALLENGE, and the experimental results demonstrate its effectiveness, achieving an IDF1 score of 93.31% and ranking 3rd on the leaderboard.
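The paper does not publish its implementation here, but the core idea of module (2), combining appearance similarity with a spatial-temporal constraint when associating trajectories, can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the track representation (an appearance embedding plus a time interval), the greedy matching strategy, and the `app_weight` and `gate` parameters are hypothetical, not the authors' actual design.

```python
import math

def cosine_distance(a, b):
    """Appearance distance between two embedding vectors (1 - cosine similarity)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (na * nb)

def associate(tracks_a, tracks_b, app_weight=0.7, gate=0.5):
    """Greedy cross-track association sketch (illustrative, not the paper's method).

    Each track is a dict with an appearance embedding ('feat') and a time
    interval ('start', 'end').  Pairs whose intervals do not overlap are
    gated out (spatial-temporal constraint); the rest are ranked by a
    weighted sum of appearance distance and temporal misalignment.
    """
    pairs = []
    for i, ta in enumerate(tracks_a):
        for j, tb in enumerate(tracks_b):
            overlap = min(ta['end'], tb['end']) - max(ta['start'], tb['start'])
            if overlap <= 0:
                continue  # temporal gate: the two tracks never co-occur
            app = cosine_distance(ta['feat'], tb['feat'])
            span = max(ta['end'], tb['end']) - min(ta['start'], tb['start'])
            temporal = 1.0 - overlap / span  # 0 when intervals align exactly
            cost = app_weight * app + (1.0 - app_weight) * temporal
            if cost < gate:
                pairs.append((cost, i, j))
    pairs.sort()  # match lowest-cost pairs first
    used_a, used_b, matches = set(), set(), []
    for cost, i, j in pairs:
        if i not in used_a and j not in used_b:
            used_a.add(i)
            used_b.add(j)
            matches.append((i, j))
    return matches
```

In practice, systems of this kind typically replace the greedy loop with an optimal assignment solver (e.g. the Hungarian algorithm) and use learned re-identification features as the embeddings; this sketch only shows how the two information sources can be fused into a single association cost.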
Cite
Text
Yang et al. "Integrating Appearance and Spatial-Temporal Information for Multi-Camera People Tracking." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2023. doi:10.1109/CVPRW59228.2023.00554
Markdown
[Yang et al. "Integrating Appearance and Spatial-Temporal Information for Multi-Camera People Tracking." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2023.](https://mlanthology.org/cvprw/2023/yang2023cvprw-integrating/) doi:10.1109/CVPRW59228.2023.00554
BibTeX
@inproceedings{yang2023cvprw-integrating,
title = {{Integrating Appearance and Spatial-Temporal Information for Multi-Camera People Tracking}},
author = {Yang, Wenjie and Xie, Zhenyu and Wang, Yaoming and Zhang, Yang and Ma, Xiao and Hao, Bing},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
year = {2023},
pages = {5260--5269},
doi = {10.1109/CVPRW59228.2023.00554},
url = {https://mlanthology.org/cvprw/2023/yang2023cvprw-integrating/}
}