Although drivers’ emotions have been studied in different driving environments (such as city and highway), prior work has not established which eye metrics and facial expressions correspond to specific emotions and behaviors when validated against subjective and biosensor data. Using an eye-tracking-integrated human-in-the-loop (HITL) simulation experiment, we studied how drivers’ facial expressions and ocular measurements relate to their emotions. We found that the driving environment can significantly affect drivers’ emotions, and that this effect is evident in both facial expression and eye metric data. These outcomes give human-computer interaction (HCI) practitioners guidance for designing in-vehicle emotion recognition systems that robustly understand drivers’ emotions, and they help advance multimodal emotion recognition. A minimal sketch of this fusion idea follows.
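To make the multimodal idea concrete, here is a minimal sketch (not the study's actual pipeline) of feature-level fusion: per-segment facial expression scores are concatenated with eye metrics before classification. All feature names, labels, and data below are hypothetical placeholders, and the synthetic data means accuracy will sit near chance.

```python
# A minimal, hypothetical sketch of multimodal emotion recognition:
# fuse facial-expression scores with eye metrics, then classify.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200  # hypothetical number of simulator driving segments

# Facial-expression channel: e.g., per-segment mean intensities of
# smile, brow furrow, and lip press (values in [0, 1]).
face_features = rng.random((n, 3))

# Ocular channel: e.g., mean pupil diameter (mm), mean fixation
# duration (ms), and blink rate (blinks/min).
eye_features = np.column_stack([
    rng.normal(3.5, 0.5, n),
    rng.normal(250, 60, n),
    rng.normal(15, 5, n),
])

# Feature-level fusion: concatenate both channels per segment.
X = np.hstack([face_features, eye_features])

# Hypothetical emotion labels: 0 = neutral, 1 = stressed, 2 = happy.
y = rng.integers(0, 3, n)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")  # ~chance on random data
```

In a real system, decision-level fusion (training one classifier per modality and combining their outputs) is a common alternative when the channels are sampled at different rates.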