Calibration-Free Gaze Sensing Using Saliency Maps

Abstract

We propose a calibration-free gaze sensing method based on visual saliency maps. Our goal is to construct a gaze estimator using only eye images captured from a person watching a video clip. The key idea is to treat the saliency maps of the video frames as probability distributions over gaze points. To identify gaze points from saliency maps efficiently, we aggregate the saliency maps based on the similarity of eye appearances. We then establish a mapping from eye images to gaze points by Gaussian process regression. Experimental results show that the proposed method works well across different people and video clips and achieves an accuracy of 6 degrees, which is sufficient for estimating a person's attention on a monitor.
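The pipeline described above can be sketched in a few lines: cluster eye-appearance features, average the saliency maps within each cluster, take each averaged map's peak as a pseudo gaze label, and fit a Gaussian process regressor from eye features to gaze points. This is a minimal illustrative sketch, not the authors' implementation: the clustering method (k-means with farthest-point initialization), the RBF kernel hyperparameters, and all function names are assumptions chosen for brevity.

```python
import numpy as np

def aggregate_saliency(eye_feats, saliency_maps, n_clusters=4, iters=10):
    """Cluster eye-appearance features and average the saliency maps within
    each cluster; the averaged map's peak serves as a pseudo gaze label.
    (Simplified stand-in for the paper's appearance-based aggregation.)"""
    # Farthest-point initialization keeps the initial centers spread out.
    centers = eye_feats[[0]].astype(float)
    for _ in range(n_clusters - 1):
        d = np.linalg.norm(eye_feats[:, None] - centers[None], axis=2).min(axis=1)
        centers = np.vstack([centers, eye_feats[d.argmax()]])
    for _ in range(iters):  # plain k-means refinement
        labels = np.linalg.norm(eye_feats[:, None] - centers[None], axis=2).argmin(axis=1)
        for k in range(n_clusters):
            if (labels == k).any():
                centers[k] = eye_feats[labels == k].mean(axis=0)
    gaze = np.zeros((len(eye_feats), 2))
    for k in range(n_clusters):
        members = labels == k
        if not members.any():
            continue
        avg = saliency_maps[members].mean(axis=0)   # aggregated saliency map
        y, x = np.unravel_index(avg.argmax(), avg.shape)
        gaze[members] = (x, y)                      # peak as pseudo gaze label
    return gaze

def gp_regress(X_train, y_train, X_test, length=1.0, noise=1e-2):
    """Zero-mean Gaussian process regression with an RBF kernel."""
    def rbf(a, b):
        d2 = ((a[:, None] - b[None]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * length ** 2))
    K = rbf(X_train, X_train) + noise * np.eye(len(X_train))
    return rbf(X_test, X_train) @ np.linalg.solve(K, y_train)
```

On synthetic data where eye features correlate with gaze position and each saliency map peaks near the true gaze point, the regressor recovers gaze to within a couple of pixels; the paper's setting replaces these toy features with real eye images and real saliency maps.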

Cite

Text

Sugano et al. "Calibration-Free Gaze Sensing Using Saliency Maps." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2010. doi:10.1109/CVPR.2010.5539984

Markdown

[Sugano et al. "Calibration-Free Gaze Sensing Using Saliency Maps." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2010.](https://mlanthology.org/cvpr/2010/sugano2010cvpr-calibration/) doi:10.1109/CVPR.2010.5539984

BibTeX

@inproceedings{sugano2010cvpr-calibration,
  title     = {{Calibration-Free Gaze Sensing Using Saliency Maps}},
  author    = {Sugano, Yusuke and Matsushita, Yasuyuki and Sato, Yoichi},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year      = {2010},
  pages     = {2667--2674},
  doi       = {10.1109/CVPR.2010.5539984},
  url       = {https://mlanthology.org/cvpr/2010/sugano2010cvpr-calibration/}
}