Machine learning for distinguishing Saudi children with and without autism via eye-tracking data

Hana Alarifi

Hesham Aldhalaan

Nouchine Hadjikhani

Jakob Åsberg Johnels

Jhan Alarifi

Guido Ascenso

Reem Alabdulaziz

Background

Despite the global prevalence of Autism Spectrum Disorder (ASD), there is a knowledge gap pertaining to autism in Arab nations. Recognizing the need for validated biomarkers for ASD, our study leverages eye-tracking technology to understand gaze patterns associated with ASD, focusing on joint attention (JA) and atypical gaze patterns during face perception. While previous studies have typically evaluated a single eye-tracking metric, our research combines multiple metrics to capture the multidimensional nature of autism, focusing on dwell times on the eyes, on the left side of the face, and during joint attention.

Methods

We recorded data from 104 participants (41 neurotypical, mean age: 8.21 ± 4.12 years; 63 with ASD, mean age: 8 ± 3.89 years). The data collection consisted of a series of visual stimuli showing cartoon faces of humans and animals, presented to the participants in a controlled environment. During each stimulus, the eye movements of the participants were recorded and analyzed, and metrics such as time to first fixation and dwell time were extracted. We then used these data to train a number of machine learning classification algorithms to determine whether these biomarkers can be used to diagnose ASD.
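To make the two extracted metrics concrete, the sketch below computes time to first fixation and dwell time for a rectangular area of interest (AOI) from raw gaze samples. The sample format, AOI representation, and sampling interval are illustrative assumptions, not the authors' actual pipeline:

```python
# Illustrative sketch (not the authors' code): extracting time to first
# fixation (TTFF) and dwell time for one area of interest (AOI).

def extract_metrics(samples, aoi, sample_interval_ms=4):
    """samples: list of (timestamp_ms, x, y) gaze points.
    aoi: rectangle (x0, y0, x1, y1) in screen coordinates.
    Returns (ttff_ms, dwell_ms); ttff_ms is None if the gaze
    never enters the AOI."""
    x0, y0, x1, y1 = aoi
    ttff = None
    dwell = 0
    for t, x, y in samples:
        if x0 <= x <= x1 and y0 <= y <= y1:
            if ttff is None:
                ttff = t  # first gaze sample inside the AOI
            dwell += sample_interval_ms  # each in-AOI sample adds its duration
    return ttff, dwell

# Example: gaze enters the AOI (50,50)-(100,100) at t = 4 ms and stays
# for two samples.
samples = [(0, 10, 10), (4, 55, 55), (8, 60, 60)]
print(extract_metrics(samples, (50, 50, 100, 100)))  # → (4, 8)
```

Per-stimulus values like these, aggregated across the eye, left-face, and joint-attention AOIs, would then form the feature vector for each participant.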

Results

We found no significant difference in eye-dwell time between the autistic and control groups on human or animal eyes. However, autistic individuals focused less on the left side of both human and animal faces, indicating a reduced left visual field (LVF) bias. They also showed slower response times and shorter dwell times on congruent objects during joint attention (JA) tasks, indicating diminished reflexive joint attention. No significant difference was found in time spent on incongruent objects during JA tasks. These results suggest potential eye-tracking biomarkers for autism. The best-performing algorithm was the random forest, which achieved accuracy = 0.76 ± 0.08, precision = 0.78 ± 0.13, recall = 0.84 ± 0.07, and F1 = 0.80 ± 0.09.
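The reported scores can be read as standard binary-classification metrics. A minimal sketch of their definitions, assuming ASD is coded as the positive class (this coding is an assumption, not stated in the source):

```python
# Illustrative definitions of the reported metrics for a binary
# ASD-vs-neurotypical classifier (1 = ASD assumed positive class).

def binary_metrics(y_true, y_pred):
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return accuracy, precision, recall, f1

# Toy example: 5 participants, one false positive and one false negative.
acc, prec, rec, f1 = binary_metrics([1, 1, 1, 0, 0], [1, 1, 0, 1, 0])
print(round(acc, 3), round(prec, 3), round(rec, 3), round(f1, 3))
```

The ± values in the abstract would then be the spread of these per-run scores across repeated evaluations.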

Conclusions

Although the autism group displayed notable differences in reflexive joint attention and left visual field bias, dwell time on the eyes was not significantly different between groups. Nevertheless, the machine learning model trained on these data proved effective at diagnosing ASD, showing the potential of these biomarkers. Our study shows promising results and opens avenues for further exploration in this under-researched geographical context.
