EDDIE: An Embodied AI System for Research and Intervention for Individuals with ASD

Abstract

We report on the ongoing development of EDDIE (Emotion Demonstration, Decoding, Interpretation, and Encoding), an interactive embodied AI to be deployed as an intervention system for children diagnosed with High-Functioning Autism Spectrum Disorders (HFASD). EDDIE presents the subject with interactive requests to decode facial expressions presented through an avatar, to encode requested expressions, or to do both in a single session. Facial tracking software interprets the subject’s response and allows for immediate feedback. The system fills a need in research and intervention for children with HFASD by providing an engaging platform that presents exemplar expressions consistent with mechanical systems of facial action measurement and that is integrated with an automatic system for interpreting the subject’s expressions and providing feedback. Both live interaction with EDDIE and video recordings of human-EDDIE interaction will be demonstrated.
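
As a rough illustration of the decode/encode interaction loop described in the abstract, the sketch below models one decode trial (subject names the avatar's expression) and one encode trial (subject produces a requested expression, which tracking software classifies). This is not the authors' implementation: the avatar display, the tracker, and all function names here are hypothetical stubs invented for illustration.

    """Hypothetical sketch of an EDDIE-style decode/encode session.

    NOT the authors' system: show_avatar_expression and
    classify_subject_expression are stand-in stubs, and the label set
    is assumed for illustration only.
    """

    import random

    EXPRESSIONS = ["happy", "sad", "angry", "surprised"]  # assumed label set


    def show_avatar_expression(expression: str) -> None:
        # Stub: a real system would animate the avatar's face here.
        print(f"[avatar] displaying a {expression} expression")


    def classify_subject_expression() -> str:
        # Stub: a real system would run facial-tracking software on
        # camera input; here we simply simulate a classified response.
        return random.choice(EXPRESSIONS)


    def decode_trial(target: str) -> bool:
        """Ask the subject to name the expression shown by the avatar."""
        show_avatar_expression(target)
        answer = input(f"What expression is the avatar showing? {EXPRESSIONS}: ").strip().lower()
        correct = answer == target
        print("Correct!" if correct else f"Not quite; that was a {target} expression.")
        return correct


    def encode_trial(target: str) -> bool:
        """Ask the subject to produce the requested expression; give feedback."""
        print(f"[prompt] Please make a {target} face.")
        observed = classify_subject_expression()
        correct = observed == target
        print("Great job!" if correct else f"The tracker read that as {observed}; let's try {target} again.")
        return correct


    if __name__ == "__main__":
        decode_trial(random.choice(EXPRESSIONS))
        encode_trial(random.choice(EXPRESSIONS))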

Cite

Text

Selkowitz et al. "EDDIE: An Embodied AI System for Research and Intervention for Individuals with ASD." AAAI Conference on Artificial Intelligence, 2016. doi:10.1609/AAAI.V30I1.9845

Markdown

[Selkowitz et al. "EDDIE: An Embodied AI System for Research and Intervention for Individuals with ASD." AAAI Conference on Artificial Intelligence, 2016.](https://mlanthology.org/aaai/2016/selkowitz2016aaai-eddie/) doi:10.1609/AAAI.V30I1.9845

BibTeX

@inproceedings{selkowitz2016aaai-eddie,
  title     = {{EDDIE: An Embodied AI System for Research and Intervention for Individuals with ASD}},
  author    = {Selkowitz, Robert and Rodgers, Jonathan and Moskal, Przemyslaw J. and Mrowczynski, Jon and Colson, Christine},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2016},
  pages     = {4385--4386},
  doi       = {10.1609/AAAI.V30I1.9845},
  url       = {https://mlanthology.org/aaai/2016/selkowitz2016aaai-eddie/}
}