Gaze-Based Interaction Adaptation for People with Involuntary Head Movements (Student Abstract)

Abstract

Gaze estimation is an important research area in computer vision and machine learning. Eye-tracking and gaze-based interaction have made assistive technology (AT) more accessible to people with physical limitations. However, a non-negligible proportion of existing AT users, including those with dyskinetic cerebral palsy (CP) or severe intellectual disabilities (ID), have difficulty using eye trackers because of their involuntary body movements. In this paper, we propose an adaptation method based on head movement prediction and fixation smoothing that stabilizes our target users' gaze points on the screen and improves their user experience (UX) in gaze-based interaction. Our empirical experiments show that our method significantly shortens users' selection time and increases their selection accuracy.
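The abstract does not specify the fixation-smoothing algorithm used. As a rough illustration of the general idea only (not the authors' method), a simple exponential moving average over raw on-screen gaze points — with a hypothetical `alpha` parameter trading stability against lag — might look like:

```python
from typing import List, Tuple

def smooth_gaze(points: List[Tuple[float, float]],
                alpha: float = 0.3) -> List[Tuple[float, float]]:
    """Exponentially smooth a sequence of (x, y) gaze points.

    Lower alpha = stronger smoothing (steadier fixations, more lag);
    higher alpha = output follows the raw gaze more closely.
    """
    if not points:
        return []
    smoothed = [points[0]]  # seed with the first raw sample
    for x, y in points[1:]:
        px, py = smoothed[-1]
        # Blend the new sample with the previous smoothed point.
        smoothed.append((alpha * x + (1 - alpha) * px,
                         alpha * y + (1 - alpha) * py))
    return smoothed
```

For example, a jittery horizontal jump from (0, 0) to (10, 0) with `alpha=0.5` is damped to (5.0, 0.0) on the second sample, which is the kind of stabilization that helps users with involuntary head movements hold a selection target.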

Cite

Text

Tong and Chan. "Gaze-Based Interaction Adaptation for People with Involuntary Head Movements (Student Abstract)." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/aaai.v38i21.30519

Markdown

[Tong and Chan. "Gaze-Based Interaction Adaptation for People with Involuntary Head Movements (Student Abstract)." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/tong2024aaai-gaze/) doi:10.1609/aaai.v38i21.30519

BibTeX

@inproceedings{tong2024aaai-gaze,
  title     = {{Gaze-Based Interaction Adaptation for People with Involuntary Head Movements (Student Abstract)}},
  author    = {Tong, Cindy and Chan, Rosanna Yuen-Yan},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2024},
  pages     = {23669--23670},
  doi       = {10.1609/aaai.v38i21.30519},
  url       = {https://mlanthology.org/aaai/2024/tong2024aaai-gaze/}
}