A Statistical Approach to Continuous Self-Calibrating Eye Gaze Tracking for Head-Mounted Virtual Reality Systems

Abstract

We present a novel, automatic eye gaze tracking scheme inspired by the smooth pursuit eye motion that occurs while playing mobile games or watching virtual reality content. Our algorithm continuously calibrates an eye tracking system for a head-mounted display. This eliminates the need for an explicit calibration step and automatically compensates for small movements of the headset with respect to the head. The algorithm finds correspondences between corneal motion and screen-space motion, and uses these to generate Gaussian Process Regression models. A combination of those models provides a continuous mapping from corneal position to screen-space position. Accuracy is nearly as good as that achieved with an explicit calibration step.
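The sketch below illustrates the core idea described in the abstract, not the authors' implementation: fit Gaussian Process Regression models that map 2D corneal (pupil) positions to 2D screen-space gaze positions from correspondences gathered during smooth pursuit. The data, kernel choice, and use of scikit-learn are illustrative assumptions.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Hypothetical correspondences: corneal positions (normalized camera coordinates)
# paired with the screen positions of a smoothly moving target the user followed.
corneal = rng.uniform(-1.0, 1.0, size=(200, 2))
true_map = np.array([[600.0, 40.0], [-30.0, 400.0]])      # unknown toy mapping
screen = corneal @ true_map.T + np.array([960.0, 540.0])  # offset to screen center
screen += rng.normal(scale=2.0, size=screen.shape)        # measurement noise

# One GP per screen coordinate; the RBF + white-noise kernel is an assumption,
# not something specified in the abstract.
kernel = 1.0 * RBF(length_scale=0.5) + WhiteKernel(noise_level=1.0)
gp_x = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(corneal, screen[:, 0])
gp_y = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(corneal, screen[:, 1])

# Continuous mapping: predict screen-space gaze for a new corneal position.
query = np.array([[0.1, -0.3]])
print(gp_x.predict(query), gp_y.predict(query))

In the paper's setting such models would be refit or combined continuously as new pursuit correspondences arrive, which is what removes the explicit calibration step.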

Cite

Text

Tripathi and Guenter. "A Statistical Approach to Continuous Self-Calibrating Eye Gaze Tracking for Head-Mounted Virtual Reality Systems." IEEE/CVF Winter Conference on Applications of Computer Vision, 2017. doi:10.1109/WACV.2017.101

Markdown

[Tripathi and Guenter. "A Statistical Approach to Continuous Self-Calibrating Eye Gaze Tracking for Head-Mounted Virtual Reality Systems." IEEE/CVF Winter Conference on Applications of Computer Vision, 2017.](https://mlanthology.org/wacv/2017/tripathi2017wacv-statistical/) doi:10.1109/WACV.2017.101

BibTeX

@inproceedings{tripathi2017wacv-statistical,
  title     = {{A Statistical Approach to Continuous Self-Calibrating Eye Gaze Tracking for Head-Mounted Virtual Reality Systems}},
  author    = {Tripathi, Subarna and Guenter, Brian},
  booktitle = {IEEE/CVF Winter Conference on Applications of Computer Vision},
  year      = {2017},
  pages     = {862-870},
  doi       = {10.1109/WACV.2017.101},
  url       = {https://mlanthology.org/wacv/2017/tripathi2017wacv-statistical/}
}