
Computer Science Research

with iMotions: the world’s leading human behavior software

Make technology human with human data

Sensitivity and Specificity

Machine learning and emotion AI have been decades in the making and have now come to prominence. Algorithmic approaches benefit from rich, detailed human behavior data that makes technology more accurate and more human.

We’re dedicated to helping you understand human behavior, so you can build technology that reflects and acts in response to actual humans.

Screenshot of an advertisement experiment in iMotions

Leverage physiological data

Create richer computational models

To build human-centric technology that can predict and anticipate human behavior, computer scientists need detailed human behavior data. Gathering this data has typically required complex, rigid, and expensive experimental setups, taking time and resources away from understanding and utilizing the data that is collected.

iMotions is a human behavior lab in one software. By readily integrating a range of biosensors, it’s possible to collect detailed human behavior data in almost any scenario, whether it’s cognitive responses in VR, attention in a driving simulator, or emotional expressions when interacting with an AI system. The iMotions software is designed to make data collection straightforward, so that you can understand human behavior data in its complexity.

iMotions is used at over 800 universities worldwide

Machine Learning & Data Science

Quality in, quality out

The performance of machine learning models depends largely on the data they are trained with. Inaccurate or limited data cannot guide algorithms in the right direction, no matter how sophisticated they are. Detailed, direct quantification of the cognitive and physiological processes that humans perform constantly provides a clear basis from which to model and predict human action. Use iMotions to:

  • Train machine learning algorithms on multimodal data, including facial expression analysis (a training sketch follows this list)
  • Investigate the relationships between different forms of communication with multimodal language analysis
  • Validate model performance by testing predictions against quantifiable human actions
  • Collect data about human responses in virtual worlds, to predict reactions in real environments
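
As a rough illustration of the first and third bullets, the sketch below trains a simple classifier on facial expression metrics exported from iMotions and checks its predictions against held-out data. The file name, column names, and model choice are hypothetical placeholders for illustration, not an iMotions API.

```python
# Minimal sketch: train a classifier on exported facial expression metrics and
# validate it on held-out trials. Column names ("Joy", "Anger", "Surprise",
# "Engagement", "label") are hypothetical placeholders for your own export.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("facial_expression_export.csv")        # hypothetical export file
X = df[["Joy", "Anger", "Surprise", "Engagement"]]       # example metric columns
y = df["label"]                                          # e.g. stimulus condition

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print("Held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```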

Tech Innovation with Emotion AI

Helping the future be human

Human-centric AI is the next frontier for technological innovation. By making vehicles, devices, and software responsive to human behavior and emotions, their use becomes not only more practical, but also smoother and safer. iMotions is a proven software platform for accurate assessment of human action, giving you access to data that truly reflects how humans respond in any environment. With iMotions, you can build models that are not just intelligent, but emotionally intelligent. Use iMotions to explore:

  • How behavioral cues for fatigue or engagement guide human action when operating a vehicle (a fatigue-proxy sketch follows this list)
  • How design impacts action efficiency
  • Dynamic changes in emotional responses when attending to instructions
  • Predicting human performance with EEG and eye tracking
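
To make the first bullet concrete, here is a minimal sketch of one possible fatigue proxy, blink rate, computed from exported eye-tracking samples. The file name, column names, and sampling rate are assumptions for illustration and do not reflect a specific iMotions export format.

```python
# Minimal sketch: blink rate as a simple fatigue proxy, derived from eye-tracking
# samples. The CSV layout ("pupil_detected" flag) and 60 Hz sampling rate are
# hypothetical; real exports vary by eye tracker and study design.
import pandas as pd

samples = pd.read_csv("eye_tracking_export.csv")   # hypothetical export file
fs = 60.0                                          # assumed sampling rate (Hz)

# Treat runs of samples with no detected pupil as blinks and count their onsets.
closed = samples["pupil_detected"] == 0
blink_onsets = closed & ~closed.shift(1, fill_value=False)
minutes_recorded = len(samples) / fs / 60.0
print(f"Blink rate: {blink_onsets.sum() / minutes_recorded:.1f} blinks/min")
```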
Headshot photos of people portraying different emotions
Gaze mapping of a Euroman magazine page

Data transparency

Tools for validating your data

Generating valid, scalable data requires transparency at every stage, from collection to processing and analysis. iMotions provides the tools for predictive modeling, algorithm training, and signal processing, with options that are fully customizable to your research design, including:

  • Facial Expression Analysis technology through our partnership with Affectiva, with data sets from 7.5 million faces from 87 countries
  • R Notebooks with fully visible code for flexible signal processing (a brief filtering sketch follows this list)
  • Eye tracking analysis features backed by computer vision algorithms
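
As an illustration of the kind of fully visible processing such notebooks expose, the sketch below applies a zero-phase low-pass filter to a synthetic skin-conductance-like trace. It is written in Python rather than R purely for illustration, and the sampling rate, filter order, and cutoff are assumptions, not iMotions defaults.

```python
# Minimal sketch: zero-phase low-pass filtering of a synthetic biosensor trace.
# The 128 Hz sampling rate and 4 Hz cutoff are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 128.0                                          # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
raw = np.sin(2 * np.pi * 0.3 * t) + 0.2 * np.random.randn(t.size)  # synthetic trace

b, a = butter(N=4, Wn=4.0 / (fs / 2), btype="low")  # 4th-order Butterworth, 4 Hz cutoff
filtered = filtfilt(b, a, raw)                      # zero-phase filtering keeps event timing intact
```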

Key features to simplify and amplify your work

Connect. Record. Process.

 
Synchronize a range of sensors

In a unified software platform

 
Design, present, and record

The entire experimental process together

 
Fully adaptable

Modify as you need with the API and/or LSL connections (see the streaming sketch after these features)

 
Automatic analysis

Get results quickly from calculated metrics
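
For the adaptability point above, the sketch below shows how a client might receive a live biosensor stream over Lab Streaming Layer (LSL) with the open-source pylsl library. The stream type and sample count are illustrative assumptions; the streams available depend on your setup and your iMotions API/LSL configuration.

```python
# Minimal sketch: read samples from an LSL stream published by a biosensor setup.
# The stream type "EEG" and the fixed sample count are assumptions for illustration.
from pylsl import StreamInlet, resolve_stream

streams = resolve_stream("type", "EEG")   # blocks until a matching stream is found
inlet = StreamInlet(streams[0])

for _ in range(100):                      # pull and print 100 samples
    sample, timestamp = inlet.pull_sample()
    print(timestamp, sample)
```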

Explore Integrated Hardware

Seamlessly combine hardware from our vendor partners

Behavioral modeling with biosensors can draw on any combination of integrated sensors.

Our hardware-agnostic approach enables integration from these partners and more:
Affectiva
Neuroelectrics
Varjo
Shimmer
Smart Eye
Tobii
GazePoint
Biopac
Pupil Labs
Qualtrics

Objective Data for Understanding Emotional Expressions

Academia · Customer case

iMotions helps Oxford University address entirely new areas of research into emotions. By using an objective, automated system for measuring facial expressions, Dr. Danielle Shore is able to rapidly explore emotional reactions in game theory contexts.

Award winning Game Design Research

Academia · Customer case

Dr. Alessandro Canossa at Northeastern University conducts sophisticated player behavior studies with iMotions in a fraction of the time they previously required, work that has won a best paper award.

How do biosensors help create an understanding of human behavior?

Present your stimuli, synchronize biosensors, manage respondent data, and analyze human responses – all in the iMotions Platform.

Research made possible with iMotions

Carnegie Mellon University – Amir Zadeh, Chengfeng Mao, Kelly Shi, Yiwei Zhang, Paul Liang, Soujanya Poria & Louis-Philippe Morency

Researchers at Carnegie Mellon University used iMotions to assess multimodal machine learning performance in relation to physiological signals.

Want to know more?

Read our collection of articles on human behavior research

Publications

Read publications made possible with iMotions

Blog

Follow blog contributions from our expert PhDs

Newsletter

A monthly close-up of the latest product and research news

iMotions is used in over 70 countries

Our offices are situated worldwide in close contact with our customers

World map with iMotions office locations