Using Eye-Tracking Data for High-Level User Modeling in Adaptive Interfaces

Abstract

In recent years, there has been substantial research on exploring how AI can contribute to Human-Computer Interaction by enabling an interface to understand a user's needs and act accordingly. Understanding user needs is especially challenging when it involves assessing the user's high-level mental states not easily reflected by interface actions. In this paper, we present our results on using eye-tracking data to model such mental states during interaction with adaptive educational software. We then discuss the implications of our research for Intelligent User Interfaces.

Cite

Text

Conati et al. "Using Eye-Tracking Data for High-Level User Modeling in Adaptive Interfaces." AAAI Conference on Artificial Intelligence, 2007.

Markdown

[Conati et al. "Using Eye-Tracking Data for High-Level User Modeling in Adaptive Interfaces." AAAI Conference on Artificial Intelligence, 2007.](https://mlanthology.org/aaai/2007/conati2007aaai-using/)

BibTeX

@inproceedings{conati2007aaai-using,
  title     = {{Using Eye-Tracking Data for High-Level User Modeling in Adaptive Interfaces}},
  author    = {Conati, Cristina and Merten, Christina and Amershi, Saleema and Muldner, Kasia},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2007},
  pages     = {1614--1617},
  url       = {https://mlanthology.org/aaai/2007/conati2007aaai-using/}
}