Coupling Eye-Motion and Ego-Motion Features for First-Person Activity Recognition
Abstract
We focus on the use of first-person eye movement and ego-motion as a means of understanding and recognizing indoor activities from an "inside-out" camera system. We show that when eye movement captured by an inside-looking camera is used in tandem with ego-motion features extracted from an outside-looking camera, the classification accuracy of first-person actions can be improved. We also present a dataset of over two hours of realistic indoor desktop actions, including both eye-tracking information and high-quality outside-camera video. Experiments show that our joint feature is effective and robust across multiple users.
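The joint feature described in the abstract — eye-motion and ego-motion descriptors combined before classification — can be illustrated with a minimal sketch. The synthetic data, descriptor dimensions, and nearest-centroid classifier below are illustrative assumptions, not the authors' actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic per-clip descriptors for two activity classes (illustrative only):
# eye-motion features (e.g., saccade statistics) and ego-motion features
# (e.g., global optical-flow histograms).
n_per_class, d_eye, d_ego = 20, 8, 16
eye = np.vstack([rng.normal(0.0, 1.0, (n_per_class, d_eye)),
                 rng.normal(1.5, 1.0, (n_per_class, d_eye))])
ego = np.vstack([rng.normal(0.0, 1.0, (n_per_class, d_ego)),
                 rng.normal(1.5, 1.0, (n_per_class, d_ego))])
labels = np.array([0] * n_per_class + [1] * n_per_class)

# Joint feature: simple concatenation of the two modalities per clip.
joint = np.hstack([eye, ego])  # shape (40, 24)

# Nearest-centroid classification on the joint feature (a stand-in
# classifier; the paper's own classifier is not reproduced here).
centroids = np.stack([joint[labels == c].mean(axis=0) for c in (0, 1)])
dists = ((joint[:, None, :] - centroids[None]) ** 2).sum(axis=-1)
pred = np.argmin(dists, axis=1)
accuracy = (pred == labels).mean()
```

The point of the sketch is only that the two modalities are complementary: a classifier over the concatenated vector can use whichever cue is discriminative for a given activity.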
Cite
Text
Ogaki et al. "Coupling Eye-Motion and Ego-Motion Features for First-Person Activity Recognition." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2012. doi:10.1109/CVPRW.2012.6239188

Markdown
[Ogaki et al. "Coupling Eye-Motion and Ego-Motion Features for First-Person Activity Recognition." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2012.](https://mlanthology.org/cvprw/2012/ogaki2012cvprw-coupling/) doi:10.1109/CVPRW.2012.6239188

BibTeX
@inproceedings{ogaki2012cvprw-coupling,
title = {{Coupling Eye-Motion and Ego-Motion Features for First-Person Activity Recognition}},
author = {Ogaki, Keisuke and Kitani, Kris Makoto and Sugano, Yusuke and Sato, Yoichi},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
year = {2012},
pages = {1--7},
doi = {10.1109/CVPRW.2012.6239188},
url = {https://mlanthology.org/cvprw/2012/ogaki2012cvprw-coupling/}
}