In children with autism spectrum disorders (ASDs), assessing attention is crucial to understanding their behavioral and cognitive functioning. Attention difficulties are a common challenge for children with autism, significantly impacting their learning and social interactions. Traditional assessment methods often require skilled professionals to provide personalized interventions, which can be time-consuming. In addition, existing approaches based on video and eye-tracking data have limitations in providing accurate educational interventions. This article proposes a noninvasive and objective method to assess and quantify attention levels in children with autism by utilizing head poses and gaze parameters. The proposed approach combines a deep learning model for extracting head pose parameters, algorithms to extract gaze parameters, machine learning models for the attention assessment task, and an ensemble of Bayesian neural networks for attention quantification. We conducted experiments involving 39 children (19 with ASD and 20 neurotypical children) by assigning various attention tasks and capturing their video and eye patterns using a webcam and an eye tracker. Results were analyzed for participant and task differences, demonstrating that the proposed approach successfully measures a child's attention control and inattention. Ultimately, the developed attention assessment method using head poses and gaze parameters opens the door to developing real-time attention recognition systems that can enhance learning and provide targeted interventions.
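The article itself does not publish code, but the quantification step it describes can be illustrated with a minimal sketch. The snippet below assumes a hypothetical feature vector of head-pose angles (yaw, pitch, roll) and gaze parameters, and approximates the ensemble of Bayesian neural networks with Monte Carlo dropout in PyTorch; the class and function names (`AttentionNet`, `predict_attention`) and the feature dimensions are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the authors' implementation): quantifying attention
# from assumed head-pose and gaze features with a small ensemble of
# Bayesian-style neural networks, approximated here via Monte Carlo dropout.

import torch
import torch.nn as nn


class AttentionNet(nn.Module):
    """Small regressor mapping head-pose/gaze features to an attention score in [0, 1]."""

    def __init__(self, n_features: int = 6, hidden: int = 32, dropout: float = 0.2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden),
            nn.ReLU(),
            nn.Dropout(dropout),   # kept stochastic at inference for MC sampling
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Dropout(dropout),
            nn.Linear(hidden, 1),
            nn.Sigmoid(),          # bound the attention score to [0, 1]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


def predict_attention(models, features: torch.Tensor, mc_samples: int = 20):
    """Average predictions over the ensemble and over MC-dropout samples.

    Returns (mean_score, std), where std serves as a predictive-uncertainty estimate.
    """
    preds = []
    for model in models:
        model.train()              # keep dropout active so each pass is a sample
        with torch.no_grad():
            for _ in range(mc_samples):
                preds.append(model(features))
    preds = torch.stack(preds)     # shape: (n_models * mc_samples, batch, 1)
    return preds.mean(dim=0), preds.std(dim=0)


if __name__ == "__main__":
    # Hypothetical feature vector: [yaw, pitch, roll, gaze_x, gaze_y, fixation_duration]
    ensemble = [AttentionNet() for _ in range(5)]
    features = torch.tensor([[5.0, -2.0, 1.0, 0.4, 0.6, 0.25]])
    mean_score, uncertainty = predict_attention(ensemble, features)
    print(f"attention score: {mean_score.item():.2f} +/- {uncertainty.item():.2f}")
```

The ensemble-plus-dropout combination is one common way to obtain both a point estimate of attention and an uncertainty measure; the paper's actual Bayesian formulation may differ.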