Modeling Students’ Emotions in Computing Education: A Context-Specific Multi-Modal Approach

FNU Rakhi

Emotions are context-specific and unfold dynamically during tasks such as programming. However, most emotion recognition systems are built using generic data and often overlook contextual nuances. This dissertation bridges affective computing and computing education by developing a context-specific, multi-modal dataset that integrates physiological, behavioral, and self-report data from programming students. Building on this dataset, I explore hybrid and transformer-based fusion techniques for emotion recognition and emotion-cause analysis. The goal is to inform the design of affect-aware learning systems that adapt to students’ learning needs to improve engagement and persistence in computing education.
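The fusion techniques mentioned above can be illustrated with a minimal, hypothetical sketch: each modality (physiological, behavioral, self-report) is projected into a shared embedding space, and an attention-style weighting fuses them into a single representation for a downstream emotion classifier. The feature dimensions, projections, and query vector here are illustrative assumptions, not the dissertation's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-modality feature vectors for one student at one moment:
# physiological (e.g., EMG), behavioral (e.g., eye tracking), self-report.
features = {
    "physiological": rng.normal(size=8),
    "behavioral":    rng.normal(size=12),
    "self_report":   rng.normal(size=4),
}

D = 16  # shared embedding dimension (illustrative choice)

# Random projections stand in for learned per-modality encoders.
proj = {name: rng.normal(size=(D, vec.size)) / np.sqrt(vec.size)
        for name, vec in features.items()}

# Project each modality into the shared space.
embedded = np.stack([proj[n] @ features[n] for n in features])  # (3, D)

# Attention-style fusion: score each modality against a query vector,
# softmax the scores, and take the weighted sum of embeddings.
query = rng.normal(size=D)
scores = embedded @ query / np.sqrt(D)
weights = np.exp(scores - scores.max())
weights /= weights.sum()
fused = weights @ embedded  # (D,) fused representation for a classifier head

print(weights.round(3), fused.shape)
```

In a trained transformer-based system the projections and query would be learned end-to-end, and the attention weights would indicate how much each modality contributes to the predicted emotion at each moment.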

This publication uses EMG and eye tracking, which are fully integrated into iMotions Lab.
