Experimental Setup and Protocol for Creating an EEG-signal Database for Emotion Analysis Using Virtual Reality Scenarios
Automatic emotion recognition systems aim to identify human emotions from physiological signals, voice, facial expression or even physical activity. Among these types of signals, the usefulness of signals from electroencephalography (EEG) should be highlighted. However, there are few publicly accessible EEG databases in which the induction of emotions is performed through virtual reality (VR) scenarios. […]
Analyzing motivating functions of consumer behavior: Evidence from attention and neural responses to choices and consumption
Academia and business have shown an increased interest in using neurophysiological methods, such as eye-tracking and electroencephalography (EEG), to assess consumer motivation. The current research contributes to this literature by verifying whether these methods can predict the effects of antecedent events as motivating functions of attention, neural responses, choice, and consumption. Antecedent motivational factors are […]
Emerging biometric methodologies for human behaviour measurement in applied sensory and consumer science
This chapter covers some of the most popular emerging technologies used for measuring human behaviour in applied sensory and consumer science. Here, we focus on eye-tracking (ET) technology, electrodermal activity (EDA) or skin conductance, facial expression analysis (FEA) and electroencephalography (EEG), all of which can be employed to explore the underlying and at times unconscious […]
The effects of expressions of fear induced by background music on reading comprehension
Research has suggested that background music can have a positive or negative effect that can influence the affective state of individuals. Although research has demonstrated that fear negatively influences our cognitive performance, there is a research gap in understanding the combined effects of different background music tempo and fear in influencing reading comprehension performance. Data […]
Children’s physiological and behavioural response evoked by the observation, olfaction, manipulation, and consumption of food textures. Part 1: liquid products
Children are thought to prefer homogeneous and simple textures that are easy to manipulate in the mouth. Although scientific research has been done on children’s acceptance of food textures, there is a lack of knowledge regarding the emotional response elicited by textures in this population group. Physiological and behavioural methods could be an appropriate […]
Mobile Eye-Tracking as a Research Method to Explore the D/Deaf Experience at Arts and Cultural Venues
D/deaf activists have consistently lamented their exclusion from the decision-making process by service providers. Accessibility is only effective when designed with contributions from those affected by the perceived or known barrier. This paper redresses the historic absence of the D/deaf paradigm, and recenters the focus to the individual’s perspective of accessibility requirements by developing a […]
Offline Calibration for Infant Gaze and Head Tracking across a Wide Horizontal Visual Field
Most well-established eye-tracking research paradigms adopt remote systems, which typically feature regular flat screens of limited width. Limitations of current eye-tracking methods over a wide area include calibration, the significant loss of data due to head movements, and the reduction of data quality over the course of an experimental session. Here, we introduced a novel […]
Bridging social marketing and technology in the disability field: an empirical study on the role of cybernetic avatar and social inclusion
Purpose This study aims to determine the perception and attitude of consumers toward the presence of cybernetic avatars (CAs) as part of a social inclusion initiative. Design/methodology/approach A mixed method was used to conduct the study, combining facial expression recognition and surveys. Three studies were conducted. Study 1 examines consumers’ attitudes and perceptions of a […]
Differentiating Use of Facial Expression between Individuals with and without Traumatic Brain Injury Using Affectiva Software: A Pilot Study
This study investigated the feasibility of using an automated facial coding engine, Affectiva (integrated in iMotions, version 8.2), for evaluating facial expression after traumatic brain injury (TBI). An observational cross-sectional study was conducted based on facial expression data from videos of participants with TBI and control participants. The aims were to compare TBI and control […]
Granting a Better Verdict of the Mini-Mental State Examination (MMSE) With New Technologies
Millions of people worldwide have been diagnosed with dementia, a condition that affects not only the patient but also their family members and caregivers. A reliable assessment of whether a person suffers from dementia is therefore sought. The way […]
Research Report 2024
In-depth look at the scientific landscape as powered by iMotions software, showcasing groundbreaking research and the impact of our tools in various scientific and industrial fields.

iMotions Science Resources
Looking for white papers, validation reports, or research showcasing iMotions’ multimodal capabilities?
Share Your Research

850+ universities worldwide with an iMotions human behavior lab
73 of the top 100 highest ranked universities
710+ published research papers using iMotions
iMotions is used for some of the most interesting human behavior research studies carried out by top researchers around the world. Contact us to have your publication featured here.
The authors of these publications have used iMotions as a software tool within their research.
“Software should be cited on the same basis as any other research product such as a paper or a book; that is, authors should cite the appropriate set of software products just as they cite the appropriate set of papers” (Katz et al., 2020).
We therefore encourage you to cite the use of iMotions where appropriate.
How to cite iMotions
APA
iMotions (10), iMotions A/S, Copenhagen, Denmark (2024).
Note: adjust the version and year where relevant.
5 Most Popular Blogs
- The International Affective Picture System (IAPS) [Explained and Good Alternatives]
- Understanding Human Behavior – A Physiological Approach
- The Stroop Effect – How It Works and Why It Has a Profound Impact
- Your Menu Is Your Most Powerful Marketing Asset
- ’Tis The Season: Taste-Testing Research and The Legend of the Pumpkin Spice Latte
Learn How to Conduct Human Behavior Research with iMotions
Publications
Read publications made possible with iMotions
Blog
Get inspired and learn more from our expert content writers
Newsletter
A monthly close-up of the latest product and research news