Understanding human behavior takes time. Formal, scientific attempts to measure how humans operate have only been made for the last couple of hundred years or so [1]. This is a process that could well continue indefinitely: humans, in all their complexity, push back against reduction and simplification, and scientists must keep developing new ways to understand and measure our behaviors.

But progress does take place. Our understanding of human behavior is clearer now than it was a century ago. The path hasn't always been one of linear increases in knowledge, but it has led, overall, to better ideas about who we are.

One of the central ways in which we measure the development of this understanding is through the scientific publications that spread new knowledge. We at iMotions have been particularly proud to see the number of publications created using our software grow year-on-year, with almost 200 publications in 2018.

To celebrate this, and highlight some of the developments taking place within human behavior research, we have outlined some of our favorite scientific articles to be produced with iMotions below. The articles aren't listed in any particular order (and are not necessarily those published in the highest-impact-factor journals), but are a handful that reflect some of the exciting developments taking place across various fields. To get a more complete understanding of the research that is taking place with iMotions, we recommend browsing through our publications page.

Automatic Recognition of Posed Facial Expression of Emotion in Individuals with Autism Spectrum Disorder


Continuing their success in developing an understanding of autism (as seen in at least two prior studies using iMotions [2, 3]), the Janssen research group partnered with researchers from Northeastern University, Duke University School of Medicine, the University of California San Francisco, and the University of Washington to examine facial expression production in children with autism [4].

The researchers asked both autistic and non-autistic children to produce facial expressions following a prompt. By using automatic facial expression analysis, the expressions could be objectively (and easily) quantified.

The results showed a reduction in the ability of autistic children to produce certain facial expressions (specifically, for action units associated with the emotions happy, scared, surprised, and disgusted), as compared to non-autistic control children. As autism has been shown to impair facial expression recognition, these findings go one step further and suggest that expression production is also impaired.
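For readers curious about what this kind of group comparison looks like in practice, below is a minimal, hypothetical sketch (not the authors' actual pipeline) of comparing exported action-unit evidence scores between two groups with a Welch's t-test. All names and values are placeholders invented for the example.

```python
# Hypothetical sketch: comparing action-unit (AU) evidence scores
# exported from facial expression analysis between two groups.
import numpy as np
from scipy import stats

# Placeholder data: per-child mean AU evidence while posing "happy"
# (a real study would export these scores from the analysis software).
asd_scores = np.array([0.42, 0.35, 0.51, 0.28, 0.39, 0.44])
control_scores = np.array([0.61, 0.72, 0.55, 0.68, 0.59, 0.74])

# Welch's t-test: does not assume equal variances between groups.
t_stat, p_value = stats.ttest_ind(asd_scores, control_scores, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```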

The research could help improve the early diagnosis of autism, and provide further information to improve the treatment and care of autistic children.

Personality Traits Affect the Influences of Intensity Perception and Emotional Responses on Hedonic Rating and Preference Rank Toward Basic Taste Solutions


Researchers from the University of Arkansas sought to understand how emotional responses can impact a core physiological facet of our everyday experience – our sensory perception of taste [5].

Using automatic facial expression analysis, measures of autonomic nervous system (ANS) activity (skin conductance response, heart rate, and skin temperature), and surveys, the researchers were able to deduce both emotional expressions and their intensity while participants tasted different solutions. The participants also completed the Big Five Inventory [6], a scale that measures different personality traits.

The participants tasted both strong and mild solutions of sweet, sour, salty, and bitter flavors. The results showed that the preferences of participants who scored high on measures of neuroticism and extraversion could be predicted with increasing accuracy by combining surveys, facial expression analysis, and ANS measures (although only minimally for extraverts).
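As a rough illustration of this kind of multi-measure prediction (not the authors' actual analysis), the sketch below combines placeholder survey, facial expression, and ANS features in a cross-validated linear regression. Every feature, weight, and value is an assumption made for the example.

```python
# Hypothetical sketch: predicting hedonic ratings from combined
# survey, facial expression, and ANS features with linear regression.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 60  # placeholder number of tasting trials

# Placeholder feature matrix: e.g., [neuroticism score, AU intensity,
# skin conductance response, heart rate] per trial.
X = rng.normal(size=(n, 4))
y = X @ np.array([0.5, 0.8, 0.3, 0.1]) + rng.normal(scale=0.5, size=n)

# Cross-validated R^2 indicates how well the combined measures
# predict hedonic ratings on held-out trials.
scores = cross_val_score(LinearRegression(), X, y, cv=5, scoring="r2")
print(f"mean cross-validated R^2: {scores.mean():.2f}")
```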

As well as serving as a proof-of-concept for such methodologies in future studies, the results provide a better understanding of individual taste preferences that can be instrumental in industrial food production (and no doubt useful to an inventive chef or two). They can also help deepen our understanding of disorders that affect the sense of taste.

Frontal Brain Asymmetry and Willingness to Pay


The holy grail of neuromarketing is to find a simple, easily identifiable signal that reliably predicts how likely someone is to buy something. Researchers from Neurons Inc., Singularity University, Copenhagen Business School, and the Technical University of Denmark came one step closer to making this a reality [10].

Participants were presented with images of products on a screen while their brain activity was recorded using EEG. They were also asked how much money they would spend on each product, but with a twist compared to most studies: the cash was real. Each participant could bid from a wallet of $230, and if the combined bids of participants for an item passed the $230 mark, the participant could receive the item.

The analysis of the EEG results showed that an asymmetry of activation (specifically, an increase in gamma band activity in the left prefrontal cortex relative to the right) was a significant predictor of participants' willingness to pay.

This builds upon previous research showing that prefrontal asymmetry of alpha activity is associated with approach/avoidance behaviors (with relatively greater left prefrontal activation, indexed by lower alpha power, being associated with approach, and vice versa [11]).
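To make the asymmetry measure concrete, here is a minimal sketch of one common way to quantify it: band power is estimated over homologous left and right prefrontal electrodes and expressed as a log ratio. The signals, sampling rate, and band edges below are placeholder assumptions, not the study's parameters.

```python
# Hypothetical sketch: a frontal asymmetry index from EEG band power,
# here applied to the gamma band in line with the study's logic.
import numpy as np
from scipy.signal import welch

fs = 256  # assumed sampling rate (Hz)
rng = np.random.default_rng(1)
left_f3 = rng.normal(size=fs * 10)   # placeholder left prefrontal signal
right_f4 = rng.normal(size=fs * 10)  # placeholder right prefrontal signal

def band_power(signal, fs, lo, hi):
    """Average power spectral density within a frequency band."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

gamma_left = band_power(left_f3, fs, 30, 45)   # assumed gamma band edges
gamma_right = band_power(right_f4, fs, 30, 45)

# Positive values indicate relatively greater left-hemisphere gamma power.
asymmetry = np.log(gamma_left) - np.log(gamma_right)
print(f"frontal gamma asymmetry: {asymmetry:.3f}")
```

The log ratio is a common design choice because it normalizes away individual differences in overall signal power, leaving only the relative left/right balance.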

The results have clear implications for consumer neuroscience, showing that a readily applied measurement of brain activity can reliably predict an individual’s willingness to buy. Neuromarketers and consumer scientists can use these findings to reliably test the potential success of future products, and make better decisions about product launches.

The Effect of Whole-Body Haptic Feedback on Driver’s Perception in Negotiating a Curve


While driverless cars may appear to be only a couple of years away from becoming an everyday reality, there are still several legal (and psychological) barriers to their widespread roll-out. Most of these are, rightly, concerned with the safety of autonomous vehicles.

Both the companies producing autonomous cars and researchers at universities are exploring how to maximize safety for those involved. Researchers from the University of Virginia, in particular, examined how safety features could be introduced to the autonomous cars of the future [7].

This was done with a preliminary investigation (expanded on in later research [8], and explored by other researchers using iMotions [9]) of how drivers can be alerted to take back control of autonomous cars during sudden emergency situations. EEG and eye tracking data were recorded in iMotions while participants drove in a high-fidelity car simulator.

The simulator was capable of delivering haptic feedback (bodily sensations in the form of a vibrating seat) while participants negotiated a curve in the road. The preliminary results showed both longer fixation durations and increased pupil diameters when haptic feedback was delivered, compared to when it was not. This was suggested to reflect increased cognitive engagement, and therefore increased awareness of the situation.
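As an illustration of how such a within-subject comparison might be run (not the authors' actual analysis), the sketch below applies a paired t-test to placeholder per-participant fixation durations from the two conditions.

```python
# Hypothetical sketch: within-subject comparison of an eye-tracking
# metric between haptic and no-haptic trials (paired t-test).
import numpy as np
from scipy import stats

# Placeholder per-participant mean fixation durations (ms).
fixation_haptic = np.array([310, 295, 340, 325, 360, 300])
fixation_none = np.array([270, 260, 305, 290, 330, 275])

# Paired test: each participant experienced both conditions.
t_stat, p_value = stats.ttest_rel(fixation_haptic, fixation_none)
print(f"fixation duration: t = {t_stat:.2f}, p = {p_value:.4f}")
```

The same comparison could be repeated for pupil diameter, the other metric the study reported.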

As autonomous cars are developed, features such as haptic feedback could be a helpful tool in alerting drivers to take control and avoid danger.

The researchers also concluded that recording cognitive states (such as through EEG) could be useful in determining whether or not to deliver haptic feedback. By identifying how alert a driver is at baseline, calculations can be made about whether to deliver haptic feedback as an "early warning system" to increase cognitive awareness in drivers who are in danger.

Comparing the Affectiva iMotions Facial Expression Analysis Software with EMG


As with any tool that offers new ways of scientifically measuring phenomena, the apparatus itself must be tested. This was the aim of a study by researchers from the University of Göttingen and the Leibniz ScienceCampus, who set out to compare Affectiva's facial expression analysis system, running in iMotions, with facial electromyography (fEMG) recordings [12].

This work is also complemented by previous validations of iMotions, such as work completed by researchers working at the University of Rochester, who compared recordings of facial expressions, skin conductance, and heart rate [13].

Participants in the current study were asked to generate prompted facial expressions, which were recorded either by Affectiva's automatic facial expression analysis in iMotions or with fEMG recordings of muscle activation. The researchers found that "EMG and software values correlate highly. In conclusion, Affectiva Affdex software can reliably identify emotions and its results are comparable to EMG findings."
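The underlying validation logic is a trial-level correlation between the two measurement channels. Below is a minimal, hypothetical sketch of that computation; the muscle site, scores, and values are placeholders rather than data from the study.

```python
# Hypothetical sketch: correlating fEMG amplitude with software-derived
# expression scores across trials, per the validation logic described.
import numpy as np
from scipy import stats

# Placeholder trial-level values: e.g., zygomaticus fEMG amplitude (a.u.)
# and the software's smile/joy evidence for the same trials.
femg_amplitude = np.array([0.8, 1.2, 0.5, 1.6, 0.9, 1.4, 0.7, 1.1])
software_score = np.array([0.55, 0.78, 0.30, 0.92, 0.60, 0.85, 0.42, 0.70])

# A high Pearson r indicates the two channels track each other closely.
r, p_value = stats.pearsonr(femg_amplitude, software_score)
print(f"Pearson r = {r:.2f}, p = {p_value:.4f}")
```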

As fEMG recordings are known to be highly accurate measurements of facial muscle activity [14], this study provides a thorough validation of Affectiva's approach within iMotions. This further validation helps pave the way for increased confidence in automatic methods that can make research both faster and easier to carry out.

Conclusion

Many more articles were completed this year with the help of the iMotions platform, and those above are just a handful that present encouraging results for the year ahead. We hope you've enjoyed reading about this research, and we look forward to seeing what the next twelve months offer.



References

[1] Pfister, R., & Schwarz, K. A. (2018). Should we pre-date the beginning of scientific psychology to 1787? Frontiers in Psychology.

[2] Ness, S. L., Manyakov, N. V., Bangerter, A., Lewin, D., Jagannatha, S., Boice, M., et al. (2017). JAKE® multimodal data capture system: Insights from an observational study of autism spectrum disorder. Frontiers in Neuroscience, 11, 517.

[3] Manyakov, N. V., Bangerter, A., Chatterjee, M., Mason, L., Ness, S., Lewin, D., Skalkin, A., Boice, M., Goodwin, M. S., Dawson, G., Hendren, R., Leventhal, B., Shic, F., Pandina, G. (2018). Visual Exploration in Autism Spectrum Disorder: Exploring Age Differences and Dynamic Features Using Recurrence Quantification Analysis. Autism Research. 11(11):1554-1566.

[4] Manfredonia, J., Bangerter, A., Manyakov, N. V., Ness, S., Lewin, D., Skalkin, A., Boice, M., Goodwin, M. S., Dawson, G., Hendren, R., Leventhal, B., Shic, F., & Pandina, G. (2018). Automatic recognition of posed facial expression of emotion in individuals with autism spectrum disorder. J Autism Dev Disord, pp. 1–15. doi: 10.1007/s10803-018-3757-9.

[5] Samant, S. S., & Seo, H. S. (2018). Personality traits affect the influences of intensity perception and emotional responses on hedonic rating and preference rank toward basic taste solutions. J Neurosci Res, 18, 1.

[6] Lang, F. R., John, D., Lüdtke, O., Schupp, J., & Wagner, G. G. (2011). Short assessment of the Big Five: Robust across survey methods except telephone interviewing. Behavior Research Methods, 43, 548–567.

[7] Pakdamanian, E., Feng, L., Kim, I. (2018). The Effect of Whole-Body Haptic Feedback on Driver’s Perception in Negotiating a Curve. Proceedings of the Human Factors and Ergonomics Society Annual Meeting. 62, 1, 19-23.

[8] Brown, B., Park, D., Sheehan, B., Shikoff, S., Solomon, J., Yang, J., & Kim, I. (2018). Assessment of human driver safety at dilemma zones with automated vehicles through a virtual reality environment. Proceedings of the Systems and Information Engineering Design Symposium (SIEDS), 27 April 2018, University of Virginia, Charlottesville, Virginia, USA: IEEE.

[9] Izquierdo-Reyes, J., Ramírez-Mendoza, R. A., Bustamante-Bello, R., Pons-Rovira, J. L., and Gonzalez-Vargas, J. (2018). Emotion Recognition for Semi-Autonomous Vehicles Framework. International Journal for Interactive Design and Manufacturing (IJIDeM). DOI: 10.1007/s12008-018-0473-9.

[10] Ramsøy, T. Z., Skov, M., Christensen, M. K., & Stahlhut, C. (2018). Frontal brain asymmetry and willingness to pay. Frontiers in Neuroscience, 12, 138. doi: 10.3389/fnins.2018.00138.

[11] Pizzagalli, D. A., Sherwood, R. J., Henriques, J. B., and Davidson, R. J. (2005). Frontal brain asymmetry and reward responsiveness: a source-localization study. Psychol. Sci. 16, 805–813. doi: 10.1111/j.1467-9280.2005.01618.x

[12] Kulke, L., Feyerabend, D., & Schacht, A. (2018). Comparing the Affectiva iMotions facial expression analysis software with EMG. doi: 10.31234/osf.io/6c58y.

[13] Lei, J., Sala, J., & Jasra, S. (2018). Identifying correlation between facial expression and heart rate and skin conductance with iMotions biometric platform. Journal of Emerging Forensic Sciences Research.