
Computer Science Research with iMotions: the world’s leading human behavior software
Make technology human with human data
Machine learning and emotion AI have been decades in the making and have now come to prominence. Algorithmic approaches benefit from rich, detailed human behavior data, which makes technology more accurate and more human.
We’re dedicated to helping you understand human behavior, so you can build technology that reflects and acts in response to actual humans.

Leverage physiological data
Create richer computational models
To build human-centric technology that can predict and anticipate human behavior, computer scientists need detailed human behavior data. Gathering this data has typically required complex, rigid, and expensive experimental setups, taking time and resources away from understanding and utilizing the data that is collected.
iMotions is a complete human behavior lab in a single software platform. Because it readily integrates a range of biosensors, you can collect detailed human behavior data in almost any scenario, whether it’s cognitive responses in VR, attention in a driving simulator, or emotional expressions when interacting with an AI system. The iMotions software is designed to make data collection straightforward, so that you can understand human behavior data in all its complexity.
iMotions is used at over 800 universities worldwide

Machine Learning & Data Science
Quality in, quality out
The performance of a machine learning model is largely dependent on the data it’s trained with. Inaccurate or limited data won’t guide an algorithm in the right direction, no matter how sophisticated it might be. Detailed, direct quantifications of the cognitive and physiological processes that humans perform constantly provide a clear basis from which to model and predict human action. Use iMotions to:
- Train machine learning algorithms on multimodal data, including facial expression analysis (see the training sketch after this list)
- Investigate the relationships between different forms of communication with multimodal language analysis
- Validate model performance by testing predictions against quantifiable human actions
- Collect data about human responses in virtual worlds to predict reactions in real environments
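As a minimal sketch of what the first point can look like in practice, the Python snippet below trains a classifier on a table of multimodal features. All file names, column names, and labels are hypothetical placeholders for what an exported feature table might contain; this is not an iMotions API.

```python
# A minimal sketch of training a classifier on multimodal human behavior
# features. File and column names below are hypothetical stand-ins for
# an exported feature table.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical export: one row per time window, with facial expression
# metrics (e.g., action-unit intensities), physiological features, and
# a ground-truth label.
df = pd.read_csv("multimodal_features.csv")

feature_cols = ["au4_intensity", "au12_intensity",    # facial expressions
                "mean_heart_rate", "gsr_peak_count"]  # physiology
X, y = df[feature_cols], df["engagement_label"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

# Fuse modalities by simple feature concatenation; richer fusion
# strategies (late fusion, attention) are also common in the literature.
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Validate model performance against held-out human responses.
print(classification_report(y_test, model.predict(X_test)))
```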
Tech Innovation with Emotion AI
Helping the future be human
Human-centric AI is the next frontier for technological innovation. By making vehicles, devices, and software responsive to human behavior and emotions, their use becomes not only more practical, but also smoother and safer. iMotions is a proven software platform for guiding accurate assessments of human action, giving you access to data that truly reflects how humans respond in any environment. With iMotions, you can build models that are not just intelligent, but emotionally intelligent. Use iMotions to explore:
- How behavioral cues for fatigue or engagement guide human action when operating a vehicle (a fatigue-detection sketch follows this list)
- How design impacts action efficiency
- Dynamic changes in emotional responses when attending to instructions
- Prediction of human performance with EEG and eye tracking
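To make the fatigue example concrete, here is a minimal, hypothetical sketch of flagging drowsiness from eye tracking data using PERCLOS (the proportion of time the eyes are mostly closed over a rolling window). The file name, column names, and thresholds are illustrative assumptions, not iMotions defaults.

```python
# A hypothetical PERCLOS-based fatigue flag computed from timestamped
# eye-openness samples in [0, 1].
import pandas as pd

def perclos(eye_openness: pd.Series, window: str = "60s") -> pd.Series:
    """Fraction of samples in each rolling window where the eyes are
    considered closed (openness below 20%)."""
    closed = (eye_openness < 0.20).astype(float)
    return closed.rolling(window).mean()

# Hypothetical export with a "timestamp" and an "eye_openness" column.
samples = pd.read_csv("eye_tracking.csv", parse_dates=["timestamp"])
samples = samples.set_index("timestamp").sort_index()

samples["perclos"] = perclos(samples["eye_openness"])

# A sustained PERCLOS above roughly 0.15 is often treated as a warning
# sign in the driving-safety literature; the exact cutoff is
# study-specific.
samples["fatigue_flag"] = samples["perclos"] > 0.15
print(samples[["perclos", "fatigue_flag"]].tail())
```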


Data transparency
Tools for validating your data
Generating valid, scalable data requires transparency at every stage, from collection to processing and analysis. iMotions provides you with the tools to build predictive models, train algorithms, and perform signal processing, with options that are fully customizable to your research design, including:
- Facial Expression Analysis technology through our partnership with Affectiva, built on data sets of 7.5 million faces from 87 countries
- R Notebooks with fully visible code for flexible signal processing (a comparable filtering sketch appears after this list)
- Eye tracking analysis features backed by computer vision algorithms
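As an illustration of the kind of transparent signal processing described above, here is a minimal sketch of low-pass filtering a raw skin conductance (GSR) signal. It is written in Python rather than R for consistency with the other sketches on this page; the sampling rate, cutoff frequency, and file name are assumptions.

```python
# A minimal sketch of smoothing a raw GSR trace with a Butterworth
# low-pass filter. Parameters are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 128.0    # assumed sampling rate of the raw GSR signal (Hz)
cutoff = 1.0  # assumed low-pass cutoff (Hz); GSR is a slow signal

# Design a 4th-order Butterworth low-pass filter.
b, a = butter(N=4, Wn=cutoff / (fs / 2), btype="low")

# Zero-phase filtering avoids shifting the signal in time, which
# matters when aligning physiology against stimulus onsets.
raw_gsr = np.loadtxt("gsr_raw.txt")  # hypothetical one-column export
smoothed = filtfilt(b, a, raw_gsr)
```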
Key features to simplify and amplify your work
Connect. Record. Process.
Explore Integrated Hardware
Seamlessly combine hardware from our vendor partners
Behavioral modeling can draw on any combination of the supported biosensors, for example eye tracking, facial expression analysis, EEG, ECG, EDA (GSR), and EMG, with the streams aligned on a shared timeline, as in the sketch below.
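Below is a minimal sketch of one common fusion step, assuming two hypothetical CSV exports sampled at different rates: each eye tracking sample is joined with the nearest preceding heart rate sample, so that both modalities share one timeline before modeling. File and column names are illustrative.

```python
# A hypothetical alignment of two biosensor streams with different
# sampling rates onto one shared timeline.
import pandas as pd

eye = pd.read_csv("eye_tracking.csv",
                  parse_dates=["timestamp"]).sort_values("timestamp")
ecg = pd.read_csv("heart_rate.csv",
                  parse_dates=["timestamp"]).sort_values("timestamp")

# merge_asof performs a nearest-earlier timestamp join, a common way
# to place multimodal signals on one timeline before modeling.
fused = pd.merge_asof(eye, ecg, on="timestamp",
                      tolerance=pd.Timedelta("100ms"))
print(fused.head())
```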

Research made possible with iMotions
Carnegie Mellon University – Amir Zadeh, Chengfeng Mao, Kelly Shi, Yiwei Zhang, Paul Liang, Soujanya Poria & Louis-Philippe Morency
Researchers at Carnegie Mellon University used iMotions to assess multimodal machine learning performance in relation to physiological signals.
Want to know more?
Read our collection of articles on human behavior research
iMotions is used in over 70 countries
Our offices are situated worldwide in close contact with our customers
