iMotions 6.1 released – Improved workflows and signal processing, VR environment testing & much more
How it works – iMotions in under 2 minutes
See how the iMotions Biometric Research Platform works
Powering the world’s leading research labs
iMotions is used by the best and brightest, including:
Dr. Erin MacDonald, Assistant Professor of Mechanical Engineering, Stanford University
“With other software we used before, we always designed the study to fit the software. Now we just design our study and know that iMotions can handle it. I can do things so much faster with iMotions, and they even understand the budget of an assistant professor.”
Prof. Roger Azevedo, Professor in the Department of Psychology, North Carolina State University
“iMotions has enabled us to do research we couldn’t even get close to before. Even though we had all the technology available, it was not possible to create reliable, synchronized studies quickly, so that many students and researchers could work with the tools at the same time.”
“After several years of using a number of different technologies, we have transitioned to the iMotions software, which now seamlessly integrates our main technologies into one unified platform. Its scalability, together with its ability to build on the input/output API, makes it an ideal, flexible solution for our multi-site research.”
World class integrations
Unparalleled integrations with leading hardware and software innovators
iMotions Biometric Research Platform Technologies
Eye Tracking
Tracks the activity & reactions of the eye.
2 types: Remote (desktop) & Mobile (head-mounted glasses).
- Quantifies and analyzes visual attention
- Reveals how, when & what people see, gaze position & pupil dilation
- Sensor: Remote & mobile eye trackers from Tobii, ASL, EyeTech, The Eye Tribe, etc.
Facial Expression Recognition
Analyzes facial expressions and extracts human emotional reactions. Based on Paul Ekman’s Facial Action Coding System (FACS).
- Reveals manifestations of underlying emotional states
- Recognizes universal basic emotions and valence
- Sensor: Any webcam
EEG
Get insights into brain reactions and arousal by measuring electrical activity along the scalp.
- Reveals perceptual, cognitive and emotional processing
- Measures: Motivation, Engagement and Workload
- Sensor: ABM B-Alert & Emotiv EPOC headsets
GSR / EDA
Get insights into human arousal and stress by measuring skin conductance.
- Reveals emotional responses of a person via sweat gland activity
- Measures the psycho-physiological arousal of a person
- Sensor: Shimmer3 GSR+ Unit & Affectiva Q Sensor
ECG / EMG
Records the spatio-temporal characteristics of electrical impulses associated with muscle contractions.
- Measures the heart’s response to physical exertion
- Analyzes biometrics of human movement
- Sensor: Shimmer3 EXG Unit
API – Any 3rd Party Metric or Sensor
Input & Output API for full flexibility and feedback loop interaction.
- Forward data from iMotions to third party applications
- Forward data from third party applications to iMotions
- Sensor: Any 3rd party sensor, software or algorithm
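As a rough illustration of the forwarding idea above, the sketch below packs a third-party sensor sample into a delimited text message and pushes it to a listening endpoint over TCP. The host, port, function names, and message layout here are all illustrative assumptions, not the documented iMotions API protocol.

```python
# Hypothetical sketch: forwarding one third-party sensor sample to a
# listening API endpoint as semicolon-delimited text over TCP.
# Message layout, host, and port are assumptions for illustration only.
import socket

def format_sample(sensor: str, timestamp_ms: int, values: list) -> str:
    """Pack one sample as 'sensor;timestamp;v1;v2;...\\n' (assumed layout)."""
    fields = [sensor, str(timestamp_ms)] + ["%.3f" % v for v in values]
    return ";".join(fields) + "\n"

def send_sample(host: str, port: int, message: str) -> None:
    """Open a TCP connection and push one message (hypothetical endpoint)."""
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(message.encode("utf-8"))

msg = format_sample("MyGSR", 1024, [0.8124, 0.8131])
# send_sample("127.0.0.1", 8089, msg)  # assumed local API listener
```

The same pattern runs in reverse for the output direction: a third-party application listens on a socket and parses messages that the platform streams out.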
All sensors integrated & synchronized in one software platform
Plug-and-play any sensor & synchronize it with any type of stimuli (images, videos, websites, screen recordings, real objects, surveys)
iMotions UI with a recording of the respondent’s face, eye tracking on a survey slide, and facial expression & GSR channels synchronized.
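To make the synchronization idea concrete, here is a minimal sketch (not iMotions internals) of the basic technique: aligning two sensor streams recorded at different sampling rates onto one shared timeline by nearest-timestamp matching. All stream names and numbers are made up for illustration.

```python
# Illustrative sketch: align a faster "slave" stream (e.g. GSR) to a
# slower "master" timeline (e.g. facial-expression frames) by picking,
# for each master timestamp, the slave sample closest in time.
from bisect import bisect_left

def nearest(timestamps, t):
    """Index of the sample whose timestamp is closest to t (sorted input)."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    # Prefer the earlier sample on ties.
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1

def align(master_ts, slave_ts, slave_vals):
    """For each master timestamp, pick the closest slave sample value."""
    return [slave_vals[nearest(slave_ts, t)] for t in master_ts]

# ~30 Hz "video frame" timeline vs. ~128 Hz "GSR" samples (made-up data)
master = [0, 33, 66, 100]
gsr_ts = [0, 8, 16, 24, 32, 40, 48, 56, 64, 72, 80, 88, 96, 104]
gsr_v  = [0.1, 0.1, 0.2, 0.2, 0.3, 0.3, 0.4,
          0.4, 0.5, 0.5, 0.6, 0.6, 0.7, 0.7]
aligned = align(master, gsr_ts, gsr_v)  # one GSR value per video frame
```

A shared clock plus this kind of timestamp matching is what lets heterogeneous sensors and stimuli appear on one common timeline.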
Human Behavior Research Mega Lab
iMotions provides a unique toolbox to the multi-disciplinary human behavior research lab at the University of Nebraska at Omaha (UNO) to answer some of the most advanced scientific hypotheses in the world.
Find out why the world’s leading researchers use iMotions