EgoTracker: Pedestrian Tracking with Re-Identification in Egocentric Videos
Abstract
We propose and analyze a novel framework for tracking a pedestrian in egocentric videos, a capability needed for analyzing social gatherings recorded with a wearable camera. Constant motion of both the camera and the pedestrian makes this a challenging problem: the wearer's natural head movement causes frequent changes in the field of view, so the target is often lost and reappears in a later frame. By exploiting optical flow information specific to egocentric videos, and by modifying the learning process and sampling region of trackers that learn an SVM online, we show that re-identification is possible. The specific trackers chosen are STRUCK and MEEM.
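The abstract's core mechanism is a tracker that learns an SVM online while restricting where it samples candidates around the target. As a loose illustration only, here is a minimal sketch assuming a plain linear SVM updated by SGD on the hinge loss and a hypothetical `sample_region` helper; the actual trackers (STRUCK and MEEM) use structured-output and multi-expert SVM formulations, which this does not reproduce.

```python
import random

def hinge_sgd_update(w, b, x, y, lr=0.1, lam=0.01):
    # One SGD step on the L2-regularized hinge loss of a linear SVM --
    # the kind of per-frame online update an SVM-based tracker performs.
    margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
    w = [wi * (1.0 - lr * lam) for wi in w]  # regularization shrinkage
    if margin < 1.0:  # sample violates the margin: move the hyperplane
        w = [wi + lr * y * xi for wi, xi in zip(w, x)]
        b += lr * y
    return w, b

def sample_region(center, radius, n, rng):
    # Hypothetical helper: draw candidate positions uniformly from a box
    # around the last known target location -- the "sampling region"
    # the tracker searches (not the paper's actual sampling scheme).
    cx, cy = center
    return [(cx + rng.uniform(-radius, radius),
             cy + rng.uniform(-radius, radius)) for _ in range(n)]

# Toy demo: learn to separate target-like (+1) from background (-1) features.
rng = random.Random(0)
w, b = [0.0, 0.0], 0.0
data = [((1.0, 1.2), 1), ((0.8, 1.0), 1),
        ((-1.0, -0.9), -1), ((-1.2, -1.1), -1)]
for _ in range(50):  # online passes over the streaming samples
    for x, y in data:
        w, b = hinge_sgd_update(w, b, x, y)

def score(x):
    # Classifier response; positive means "looks like the target".
    return sum(wi * xi for wi, xi in zip(w, x)) + b

candidates = sample_region((10.0, 20.0), radius=5.0, n=8, rng=rng)
```

Restricting candidates to a region around the last known position is what the abstract's "sampling region" modification operates on; re-identification after target loss would require widening or relocating that region, which this sketch leaves out.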
Cite
Text
Nigam and Rameshan. "EgoTracker: Pedestrian Tracking with Re-Identification in Egocentric Videos." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2017. doi:10.1109/CVPRW.2017.134
Markdown
[Nigam and Rameshan. "EgoTracker: Pedestrian Tracking with Re-Identification in Egocentric Videos." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2017.](https://mlanthology.org/cvprw/2017/nigam2017cvprw-egotracker/) doi:10.1109/CVPRW.2017.134
BibTeX
@inproceedings{nigam2017cvprw-egotracker,
  title     = {{EgoTracker: Pedestrian Tracking with Re-Identification in Egocentric Videos}},
  author    = {Nigam, Jyoti and Rameshan, Renu M.},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
  year      = {2017},
  pages     = {980--987},
  doi       = {10.1109/CVPRW.2017.134},
  url       = {https://mlanthology.org/cvprw/2017/nigam2017cvprw-egotracker/}
}