Using Electroencephalography to Understand Learning Engagement with User-Centered Human-Computer Interaction in a Multimodal Online Learning Environment

Michael Poplin

Jiahui Ma

Tadeo Aviles Zuniga

Nadezhda (Nadya) Modyanova

Elizabeth A. Johnson

Bernadette McCrory

Multimodal learning environments (MMLEs) use visual, auditory, and physical interactions to improve engagement in learning tasks. A recent study by Ma et al. [16] demonstrated how biosensors such as galvanic skin response (GSR), eye tracking, and facial expression analysis can track emotional and cognitive engagement. Building on that work, we incorporate electroencephalography (EEG) data, focusing on frontal alpha asymmetry (FAA), which measures differences in activity between the left and right prefrontal cortices. FAA is linked to engagement, with greater left-frontal activity associated with goal-directed behavior and positive emotional states. A recurring challenge in FAA research is inconsistent results caused by weak stimuli or overly smoothed EEG data that obscure transient patterns. To address this, we use wavelet transforms, which preserve both temporal and frequency detail. This lets us detect subtle changes in alpha power, which is inversely related to cortical activity and often reflects changes in engagement, and link these shifts to trends in time-synchronized biosensor data. We explore FAA analyses in three participants during a learning task in Quality Trainer (Minitab) and find that the stimuli reliably trigger FAA shifts that align with GSR and facial expression responses. By integrating wavelet-enhanced EEG analysis with time-synchronized biosensor data, this approach offers a better understanding of engagement and cross-modal patterns in real time. These findings have significant implications for developing adaptive learning systems and human-computer interaction models. Furthermore, this work highlights the importance of rigorous, detail-preserving analysis to improve the accuracy and reliability of engagement metrics in multimodal environments.
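To make the wavelet-based FAA idea concrete, the sketch below shows one way such an analysis could look in Python with NumPy and PyWavelets. It is an illustration rather than the authors' pipeline: the F3/F4 channel names, the 256 Hz sampling rate, the complex Morlet wavelet, and the log-ratio form of FAA (ln of right-frontal alpha power minus ln of left-frontal alpha power, a common convention in the literature) are all assumptions made for the example.

```python
"""Illustrative sketch, not the study's actual pipeline: time-resolved
frontal alpha asymmetry (FAA) from wavelet-derived alpha power.
Channel names (F3/F4), sampling rate, and wavelet choice are assumptions."""
import numpy as np
import pywt

FS = 256.0               # assumed EEG sampling rate (Hz)
ALPHA_BAND = (8.0, 13.0)  # alpha frequency band (Hz)
WAVELET = "cmor1.5-1.0"   # complex Morlet: keeps both time and frequency detail


def alpha_power_timecourse(signal, fs=FS, band=ALPHA_BAND, n_freqs=10):
    """Time-resolved alpha-band power via the continuous wavelet transform."""
    freqs = np.linspace(band[0], band[1], n_freqs)
    # Convert target frequencies (Hz) to wavelet scales.
    scales = pywt.central_frequency(WAVELET) * fs / freqs
    coeffs, _ = pywt.cwt(signal, scales, WAVELET, sampling_period=1.0 / fs)
    # Power = squared magnitude of the coefficients, averaged across the band.
    return np.mean(np.abs(coeffs) ** 2, axis=0)


def frontal_alpha_asymmetry(left_ch, right_ch, fs=FS):
    """FAA = ln(right alpha power) - ln(left alpha power), per sample.

    Because alpha power is inversely related to cortical activity, greater
    left-frontal engagement lowers left alpha power and pushes FAA positive.
    """
    p_left = alpha_power_timecourse(left_ch, fs)
    p_right = alpha_power_timecourse(right_ch, fs)
    return np.log(p_right + 1e-12) - np.log(p_left + 1e-12)


if __name__ == "__main__":
    # Synthetic 10 s signals stand in for real F3/F4 recordings.
    t = np.arange(0, 10, 1.0 / FS)
    f3 = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
    f4 = 0.6 * np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
    faa = frontal_alpha_asymmetry(f3, f4)
    print("Mean FAA over window:", float(np.mean(faa)))
```

Because the continuous wavelet transform yields a power estimate at every sample, an FAA time course computed this way can be aligned sample-by-sample with time-synchronized GSR and facial expression streams rather than being averaged over long windows, which is the detail-preserving property the abstract emphasizes.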
