Although drivers’ emotions have been studied across different driving environments (such as city and highway), prior work has not established which eye metrics and facial expressions correspond to specific emotions and behaviors when examined alongside subjective and biosensor data. Using an eye-tracking-integrated human-in-the-loop (HITL) simulation experiment, we studied how drivers’ facial expressions and ocular measurements relate to their emotions. We found that the driving environment can significantly affect drivers’ emotions, and that this effect is evident in their facial expressions and eye-metric data. These outcomes also give human-computer interaction (HCI) practitioners knowledge for designing in-vehicle emotion recognition systems that build a robust understanding of drivers’ emotions, and they help advance multimodal emotion recognition.