Robustifying Eye Interaction

Abstract

This paper presents a gaze typing system based on consumer hardware. Eye tracking with consumer hardware is subject to several unknown factors. We propose methods based on robust statistical principles to accommodate uncertainties in the image data as well as in the gaze estimates, improving accuracy. We have succeeded in tracking the gaze of people with a standard consumer camera, obtaining on-screen accuracies of about 160 pixels. Proper design of the typing interface, however, reduces the need for high accuracy. We have observed typing speeds in the range of 3-5 words per minute for untrained subjects using large on-screen buttons and a new noise-tolerant dwell-time principle.
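The paper does not spell out its dwell-time mechanism here, but one common way to make dwell selection noise tolerant is a leaky accumulator: gaze samples on a button charge a counter, while brief departures (e.g. tracker jitter) only partially discharge it rather than resetting the dwell. The sketch below is an illustrative assumption, not the authors' implementation; the function name, parameters, and decay rule are hypothetical.

```python
def dwell_select(samples, target, dwell_frames=30, decay=2):
    """Leaky dwell-time accumulator.

    Each gaze sample landing on `target` adds 1 to the charge; each
    sample elsewhere subtracts `decay` (floored at 0), so short bursts
    of tracking noise delay the dwell instead of resetting it.
    Returns the index of the sample at which the target fires,
    or None if the dwell threshold is never reached.
    """
    charge = 0
    for i, sample in enumerate(samples):
        if sample == target:
            charge += 1
        else:
            charge = max(0, charge - decay)
        if charge >= dwell_frames:
            return i
    return None


# Example: a single noisy frame (gaze briefly off the button) only
# sets the dwell back by `decay`, rather than starting over.
gaze = ["A", "A", "A", "B", "A", "A", "A", "A"]
print(dwell_select(gaze, "A", dwell_frames=5, decay=2))
```

With a hard-reset rule, the stray `"B"` frame would force the dwell to restart from zero; the leaky variant trades a small delay for robustness to single-frame estimation errors, which matters at the ~160-pixel accuracies reported above.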

Cite

Text

Hansen and Hansen. "Robustifying Eye Interaction." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2006. doi:10.1109/CVPRW.2006.181

Markdown

[Hansen and Hansen. "Robustifying Eye Interaction." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2006.](https://mlanthology.org/cvprw/2006/hansen2006cvprw-robustifying/) doi:10.1109/CVPRW.2006.181

BibTeX

@inproceedings{hansen2006cvprw-robustifying,
  title     = {{Robustifying Eye Interaction}},
  author    = {Hansen, Dan Witzner and Hansen, John Paulin},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
  year      = {2006},
  pages     = {152},
  doi       = {10.1109/CVPRW.2006.181},
  url       = {https://mlanthology.org/cvprw/2006/hansen2006cvprw-robustifying/}
}