Although driver emotion has been studied in different driving environments (such as city streets and highways), previous studies have not examined in depth which eye metrics and facial expressions correspond to specific emotions and behaviors when grounded in subjective and biosensor data. Using an eye-tracking-integrated human-in-the-loop (HITL) simulation experiment, we studied how drivers’ facial expressions and ocular measurements relate to their emotions. We found that the driving environment can significantly affect drivers’ emotions, and that this effect is evident in both facial expression and eye metric data. These outcomes give human-computer interaction (HCI) practitioners knowledge for designing in-car emotion recognition systems that interpret drivers’ emotions robustly, and they help advance multimodal emotion recognition.
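Since the piece closes on multimodal emotion recognition, here is a minimal sketch of what feature-level (early) fusion of the two channels discussed above, facial expressions and eye metrics, might look like in practice. None of this code comes from the study; the feature names, array shapes, and labels are all hypothetical placeholders, and the random data stands in for real per-window sensor features.

```python
# Minimal sketch of multimodal emotion recognition via early (feature-level)
# fusion. All shapes, feature names, and labels are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_windows = 200  # hypothetical number of analysis windows per drive

# Facial-expression channel: e.g., per-window mean action-unit intensities.
facial_features = rng.random((n_windows, 7))

# Ocular channel: e.g., fixation duration, saccade amplitude, pupil diameter.
eye_features = rng.random((n_windows, 3))

# Hypothetical per-window emotion labels (0 = neutral, 1 = stressed).
labels = rng.integers(0, 2, size=n_windows)

# Early fusion: concatenate both channels into one feature vector per window,
# then train a single classifier on the fused representation.
fused = np.hstack([facial_features, eye_features])
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, fused, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

With real data, the random arrays would be replaced by synchronized, windowed features from the facial-expression and eye-tracking streams; a late-fusion alternative would instead train one classifier per channel and combine their predictions.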