Abstract
This project examines how people approach collaborative interactions with humans and virtual humans, particularly when encountering ambiguous or unexpected situations. The aim is to create natural and accurate models of users' behaviour, incorporating social signals and indicators of psychological and physiological states (such as eye movements, galvanic skin response, facial expressions and subjective perceptions of an interlocutor) under different conditions with varying patterns of feedback. The findings from this study will allow artificial agents to be trained to recognise characteristic human behaviour exhibited during communication, and to respond to specific non-verbal cues and biometric feedback with appropriately human-like behaviour. Continuous monitoring of "success" throughout the communication, rather than only at the end, allows for a more fluid and agile interaction, ultimately reducing the likelihood of critical failure.