University of Missouri
Understanding Drivers’ Physiological Responses in Different Road Conditions
Although drivers’ emotions have been studied in different driving environments (such as city and highway), previous work has not established which eye metrics and facial expressions correspond to specific emotions and behaviors when grounded in subjective and biosensor data. Using an eye-tracking-integrated human-in-the-loop (HITL) simulation experiment, we studied how drivers’ facial expressions and ocular measurements relate to their emotions. We found that the driving environment can significantly affect drivers’ emotions, and that this effect is evident in their facial-expression and eye-metrics data. These outcomes give human-computer interaction (HCI) practitioners knowledge for designing in-car emotion recognition systems that interpret drivers’ emotions robustly, and they help advance multimodal emotion recognition.
This study employs glasses-based eye tracking, which is fully integrated into the iMotions software suite.