Automatic Play Segmentation of Hockey Videos
Abstract
Most team sports such as hockey involve periods of active play interleaved with breaks in play. When watching a game remotely, many fans would prefer an abbreviated game showing only periods of active play. Here we address the problem of identifying these periods in order to produce a time-compressed viewing experience. Our approach is based on a hidden Markov model of play state driven by deep visual and optional auditory cues. We find that our deep visual cues generalize well across different cameras and that auditory cues can improve performance but only if unsupervised methods are used to adapt emission distributions to domain shift across games. Our system achieves temporal compression rates of 20-50% at a recall of 96%.
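The abstract describes decoding a play/break state sequence with a hidden Markov model driven by per-frame cues. As a rough illustration of that idea (not the authors' implementation), the sketch below runs standard Viterbi decoding over a two-state HMM given per-frame play probabilities; the function name, the assumption that a frame-level classifier supplies those probabilities, and the self-transition probability `p_stay` are all hypothetical choices, not values from the paper.

```python
import numpy as np

def viterbi_play_segmentation(frame_scores, p_stay=0.999):
    """Decode a play/break sequence from per-frame play probabilities.

    frame_scores: shape (T,) array of assumed per-frame probabilities of
                  active play (e.g. from a frame-level visual classifier).
    p_stay:       probability of staying in the same state between frames
                  (a hypothetical value; the paper's transition model may differ).
    Returns a boolean array marking frames decoded as active play.
    """
    T = len(frame_scores)
    eps = 1e-9
    # Emission log-likelihoods for states 0 = break, 1 = play.
    log_emit = np.stack([np.log(1.0 - frame_scores + eps),
                         np.log(frame_scores + eps)], axis=1)        # (T, 2)
    log_trans = np.log(np.array([[p_stay, 1.0 - p_stay],
                                 [1.0 - p_stay, p_stay]]))           # (2, 2)

    # Viterbi recursion with uniform initial state probabilities.
    delta = np.zeros((T, 2))
    back = np.zeros((T, 2), dtype=int)
    delta[0] = np.log(0.5) + log_emit[0]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_trans   # [prev, next]
        back[t] = np.argmax(scores, axis=0)
        delta[t] = scores[back[t], [0, 1]] + log_emit[t]

    # Backtrace the most likely state sequence.
    states = np.zeros(T, dtype=int)
    states[-1] = np.argmax(delta[-1])
    for t in range(T - 2, -1, -1):
        states[t] = back[t + 1, states[t + 1]]
    return states.astype(bool)
```

A high `p_stay` discourages rapid flickering between play and break, so short dips in the frame scores do not fragment a period of active play; in practice such transition probabilities would be tuned against the desired recall/compression trade-off.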
Cite
Text
Pidaparthy et al. "Automatic Play Segmentation of Hockey Videos." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2021. doi:10.1109/CVPRW53098.2021.00516
Markdown
[Pidaparthy et al. "Automatic Play Segmentation of Hockey Videos." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2021.](https://mlanthology.org/cvprw/2021/pidaparthy2021cvprw-automatic/) doi:10.1109/CVPRW53098.2021.00516
BibTeX
@inproceedings{pidaparthy2021cvprw-automatic,
title = {{Automatic Play Segmentation of Hockey Videos}},
author = {Pidaparthy, Hemanth and Dowling, Michael H. and Elder, James H.},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
year = {2021},
  pages = {4585--4593},
doi = {10.1109/CVPRW53098.2021.00516},
url = {https://mlanthology.org/cvprw/2021/pidaparthy2021cvprw-automatic/}
}