Remote and Head-Motion-Free Gaze Tracking for Real Environments with Automated Head-Eye Model Calibrations
Abstract
We propose a gaze estimation method that substantially relaxes the practical constraints imposed by most conventional methods. Gaze estimation research has a long history, and many systems, including commercial ones, have been proposed. However, the application domain of gaze estimation is still limited (e.g., measurement devices for HCI studies, input devices for VDT work) due to the limitations of such systems. First, users must be close to the system (or must wear it), since most systems employ IR illumination and/or stereo cameras. Second, users are required to perform manual calibration to obtain geometrically meaningful data. These limitations prevent applications that capture and utilize useful human gaze information in daily situations. In our method, inspired by the bundle adjustment framework, the parameters of a 3D head-eye model are robustly estimated by minimizing pixel-wise re-projection errors between single-camera input images and eye-model projections over multiple frames with their estimated head poses. Since this process runs automatically, users do not need to be aware of it. Using the estimated parameters, 3D head poses and gaze directions for newly observed images can be determined directly by the same error-minimization procedure. This mechanism enables robust gaze estimation from low-resolution single-camera images without user-aware preparation tasks (i.e., calibration). Experimental results show the proposed method achieves 6° accuracy with QVGA (320 × 240) images. The proposed algorithm is independent of observation distance; we confirmed that our system works with long-distance observations (10 meters).
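To make the calibration idea concrete, here is a minimal sketch of estimating a head-eye model parameter by re-projection-error minimization over multiple frames with known head poses. The specific model (a single eyeball-center offset `t_true` in head coordinates, an orthographic camera, and simulated poses) is our simplifying assumption for illustration, not the authors' actual formulation, which uses a pinhole projection of a fuller 3D head-eye model.

```python
import numpy as np

# Hypothetical simplification (not the paper's model): recover the eyeball-
# center offset t in head coordinates from several frames whose head poses
# (R_i, p_i) are known, by minimizing 2D re-projection error.

def rot_y(a):
    """Rotation about the y axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

# Ground-truth head-eye offset we hope to recover (head frame, millimeters).
t_true = np.array([30.0, -40.0, 70.0])

# Simulated frames: head rotation R_i and head translation p_i per frame.
poses = [(rot_y(a), np.array([5.0 * i, 0.0, 0.0]))
         for i, a in enumerate(np.linspace(-0.4, 0.4, 8))]

# Orthographic projection (keeps x, y); "observed" 2D eye positions per frame.
P = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
obs = [P @ (R @ t_true + p) for R, p in poses]

# Under orthographic projection the re-projection error is linear in t, so
# stacking P @ R_i as rows and moving P @ p_i to the right-hand side turns
# the minimization into an ordinary least-squares problem.
A = np.vstack([P @ R for R, _ in poses])
b = np.concatenate([u - P @ p for u, (_, p) in zip(obs, poses)])
t_est, *_ = np.linalg.lstsq(A, b, rcond=None)

print(np.allclose(t_est, t_true, atol=1e-6))  # → True
```

In the paper's actual setting the projection is nonlinear, so the same residual would be minimized iteratively (as in bundle adjustment) rather than solved in closed form; the varying head poses play the same role here as the multiple camera views in bundle adjustment, making the 3D offset observable from 2D measurements alone.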
Cite
Text
Yamazoe et al. "Remote and Head-Motion-Free Gaze Tracking for Real Environments with Automated Head-Eye Model Calibrations." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2008. doi:10.1109/CVPRW.2008.4563184
Markdown
[Yamazoe et al. "Remote and Head-Motion-Free Gaze Tracking for Real Environments with Automated Head-Eye Model Calibrations." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2008.](https://mlanthology.org/cvprw/2008/yamazoe2008cvprw-remote/) doi:10.1109/CVPRW.2008.4563184
BibTeX
@inproceedings{yamazoe2008cvprw-remote,
title = {{Remote and Head-Motion-Free Gaze Tracking for Real Environments with Automated Head-Eye Model Calibrations}},
author = {Yamazoe, Hirotake and Utsumi, Akira and Yonezawa, Tomoko and Abe, Shinji},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
year = {2008},
pages = {1-6},
doi = {10.1109/CVPRW.2008.4563184},
url = {https://mlanthology.org/cvprw/2008/yamazoe2008cvprw-remote/}
}