Human-Computer Interaction (HCI) has evolved significantly to incorporate emotion recognition capabilities, creating unprecedented opportunities for adaptive and personalized user experiences. This paper explores the integration of emotion detection into calendar applications, enabling user interfaces to respond dynamically to users' emotional states and stress levels, thereby enhancing both productivity and engagement. We present and evaluate two complementary approaches to emotion detection: a biometric method that uses heart rate (HR) data extracted from electrocardiogram (ECG) signals, processed through Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) neural networks, to predict the emotional dimensions of Valence, Arousal, and Dominance; and a behavioral method that analyzes computer activity with multiple machine learning models to classify emotions from fine-grained user interactions such as mouse movements, clicks, and keystroke patterns. Our comparative analysis, conducted on real-world datasets, reveals that while both approaches are effective, the computer activity-based method delivers superior consistency and accuracy, particularly for mouse-related interactions, which achieved approximately 90\% accuracy. Furthermore, GRU networks outperformed LSTM models in the biometric approach, with Valence prediction reaching 84.38\% accuracy.
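The GRU networks mentioned above process the HR sequence one sample at a time, mixing the previous hidden state with a candidate state via update and reset gates. As a rough illustration of that mechanism (not the authors' model: the architecture, parameters, and HR values below are hypothetical, reduced to a single scalar hidden unit), the standard GRU update can be sketched as:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(x, h, p):
    """One GRU step for a single scalar hidden unit."""
    z = sigmoid(p["wz"] * x + p["uz"] * h + p["bz"])        # update gate
    r = sigmoid(p["wr"] * x + p["ur"] * h + p["br"])        # reset gate
    h_tilde = math.tanh(p["wh"] * x + p["uh"] * (r * h) + p["bh"])  # candidate
    return (1.0 - z) * h + z * h_tilde                      # interpolated state

# Toy parameters, chosen only for illustration
params = {"wz": 0.5, "uz": 0.5, "bz": 0.0,
          "wr": 0.5, "ur": 0.5, "br": 0.0,
          "wh": 1.0, "uh": 1.0, "bh": 0.0}

# Run the unit over a short sequence of normalized HR samples;
# the final hidden state would feed a regression head for Valence.
h = 0.0
for hr in [0.62, 0.65, 0.71, 0.68]:
    h = gru_step(hr, h, params)
```

Because the new state is a convex combination of the old state and a `tanh` candidate, the hidden value stays bounded in (-1, 1), which is what makes GRUs (like LSTMs) stable over long physiological sequences.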