WebGazer: Scalable Webcam Eye Tracking Using User Interactions

Abstract

We introduce WebGazer, an online eye tracker that uses common webcams already present in laptops and mobile devices to infer the eye-gaze locations of web visitors on a page in real time. The eye tracking model self-calibrates by watching web visitors interact with the web page and trains a mapping between features of the eye and positions on the screen. This approach aims to provide a natural experience to everyday users that is not restricted to laboratories and highly controlled user studies. WebGazer has two key components: a pupil detector that can be combined with any eye detection library, and a gaze estimator using regression analysis informed by user interactions. We perform a large remote online study and a small in-person study to evaluate WebGazer. The findings show that WebGazer can learn from user interactions and that its accuracy is sufficient for approximating the user's gaze. As part of this paper, we release the first eye tracking library that can be easily integrated in any website for real-time gaze interactions, usability studies, or web research.

Cite

Text

Papoutsaki et al. "WebGazer: Scalable Webcam Eye Tracking Using User Interactions." International Joint Conference on Artificial Intelligence, 2016.

Markdown

[Papoutsaki et al. "WebGazer: Scalable Webcam Eye Tracking Using User Interactions." International Joint Conference on Artificial Intelligence, 2016.](https://mlanthology.org/ijcai/2016/papoutsaki2016ijcai-webgazer/)

BibTeX

@inproceedings{papoutsaki2016ijcai-webgazer,
  title     = {{WebGazer: Scalable Webcam Eye Tracking Using User Interactions}},
  author    = {Papoutsaki, Alexandra and Sangkloy, Patsorn and Laskey, James and Daskalova, Nediyana and Huang, Jeff and Hays, James},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2016},
  pages     = {3839--3845},
  url       = {https://mlanthology.org/ijcai/2016/papoutsaki2016ijcai-webgazer/}
}