ISCA Archive Interspeech 2025

Accessible Real-time Eye-gaze Tracking for Neurocognitive Health Assessment: A Multimodal Web-based Approach

Daniel Tisdale, Jackson Liscombe, David Paulter, Michael Neumann, Vikram Ramanarayanan

We introduce a novel integration of real-time predictive eye-gaze tracking models into a scalable, cloud-based multimodal dialogue system tailored for remote health assessments. In a crowdsourced pilot study, a virtual human guide interacted with 10 participants with Mild Cognitive Impairment (MCI) and 29 healthy controls, engaging each participant in an approximately 10-minute interview comprising 11 interactive eye-gaze tasks, ranging from simple free gaze exploration to more specialized directed-gaze tasks. We found that metrics of eye-gaze dynamics and reaction times extracted from these tasks, combined with an AdaBoost classifier, effectively differentiate MCI participants from healthy adults, with an average accuracy of 0.94. Furthermore, over 80% of participants, across demographics and internet connections, reported high engagement and a positive user experience with the platform, demonstrating its robustness and scalability for real-world use.
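The classification setup described above can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the feature names (saccade latency, fixation dispersion) and the synthetic data are placeholder assumptions, and only the cohort sizes (10 MCI, 29 controls) and the use of an AdaBoost classifier come from the abstract.

```python
# Hedged sketch of MCI-vs-control classification with AdaBoost.
# Features and data are synthetic placeholders; only the cohort sizes
# (10 MCI, 29 healthy controls) and the classifier choice follow the abstract.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_mci, n_hc = 10, 29  # cohort sizes reported in the abstract

# Hypothetical per-participant features: mean saccade latency (ms) and a
# fixation-dispersion measure, drawn from two overlapping distributions.
X_mci = rng.normal(loc=[320.0, 2.1], scale=[40.0, 0.4], size=(n_mci, 2))
X_hc = rng.normal(loc=[250.0, 1.5], scale=[40.0, 0.4], size=(n_hc, 2))
X = np.vstack([X_mci, X_hc])
y = np.array([1] * n_mci + [0] * n_hc)  # 1 = MCI, 0 = healthy control

clf = AdaBoostClassifier(n_estimators=50, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"mean CV accuracy: {scores.mean():.2f}")
```

With such a small cohort, cross-validation (as shown) or leave-one-out evaluation is the natural way to report an average accuracy like the 0.94 figure, though the abstract does not specify the evaluation protocol.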