Understanding Drivers’ Physiological Responses in Different Road Conditions

Sara Mostowfi

Jung Hyup Kim

Although drivers’ emotions have been studied in different driving environments (such as city and highway), previous research has not established which eye metrics and facial expressions correspond to specific emotions and behaviors when examined in depth alongside subjective and biosensor data. Using an eye-tracking-integrated human-in-the-loop (HITL) simulation experiment, we studied how drivers’ facial expressions and ocular measurements relate to their emotions. We found that the driving environment can significantly affect drivers’ emotions, and this effect is evident in their facial expression and eye metrics data. In addition, these outcomes give human-computer interaction (HCI) practitioners knowledge for designing in-vehicle emotion recognition systems that build a robust understanding of drivers’ emotions, and they help advance multimodal emotion recognition.
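To make the idea of multimodal emotion recognition concrete, the minimal sketch below combines eye-metric and facial-expression features through simple early fusion (feature concatenation) and trains a classifier. All feature names, labels, and the model choice are illustrative assumptions for this sketch; they are not the features or methods used in the study.

```python
# Hypothetical sketch: early fusion of eye-metric and facial-expression
# features for emotion classification. Data here is simulated.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n = 200  # simulated driving-segment samples

# Eye metrics per segment (e.g., mean pupil diameter, mean fixation duration).
eye_features = rng.normal(size=(n, 2))
# Facial-expression features per segment (e.g., valence and arousal estimates).
face_features = rng.normal(size=(n, 2))

# Early fusion: concatenate both modalities into one feature vector.
X = np.hstack([eye_features, face_features])
# Illustrative labels, e.g., 0 = calm segment, 1 = stressed segment.
y = (X.sum(axis=1) + rng.normal(scale=0.5, size=n) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```

In practice, the fused feature vector would be built from real eye-tracking and facial-expression exports rather than random data, and label definitions would follow the study design (e.g., road condition or self-reported emotion).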

This publication uses Eye Tracking and Facial Expression Analysis, which are fully integrated into iMotions Lab.