Abstract
This project investigates how people approach collaborative interactions with humans and virtual humans, particularly when encountering ambiguous or unexpected situations. The aim is to create natural and accurate models of users’ behaviour, incorporating social signals and indicators of psychological and physiological states (such as eye movements, galvanic skin response, facial expressions and subjective perceptions of an interlocutor) under different conditions with varying patterns of feedback. The findings from this study will allow artificial agents to be trained to recognise characteristic human behaviour exhibited during communication and to respond to specific non-verbal cues and biometric feedback with appropriately human-like behaviour. Monitoring “success” continuously throughout communication, rather than only at its end, allows for a more fluid and agile interaction, ultimately reducing the likelihood of critical failure.