
Tech & Engineering Research

Human-centered design with the world’s leading human behavior software platform

Let human behavior drive product innovation

Human behavior research technology quantifies the cognitive and emotional processes triggered when people use products. iMotions helps you detect these human emotions and responses so your product development, testing and usability can align with people’s needs.

Put human factors front and center

From concept development to real-time observation

Using biosensors as part of your R&D strategy delivers faster, more effective results across the product lifecycle. Methods like eye tracking and facial expression analysis can be incorporated at the early concept stage of a new product for initial insights on usability, or during the prototyping, product testing, and innovation phases. Human behavior research can even be applied to machine interfaces, software testing, or human-robot interaction to understand and minimize human error.

Engineering and R&D projects integrating biosensors with iMotions are currently researching:

  • Haptic interfaces and haptic sensing in virtual and robotic interactions
  • Sensing of trust in Human-Machine Interaction
  • Behavioral psychology and integrating human judgments into the design process
  • Automating data outputs from the iMotions API to third-party software

iMotions is used at leading tech companies and universities worldwide

Test and iterate by eliciting underlying human responses

Leverage the power of multiple data streams

iMotions integrates numerous low-intrusiveness methods for measuring emotion through physiological data, such as Facial Expression Analysis, Eye Tracking, Electrodermal Activity / Galvanic Skin Response, and EEG. Our API and Lab Streaming Layer (LSL) support are also key components for feeding inputs and outputs to and from almost any third-party system.

iMotions customers can take advantage of the API for capabilities like creating triggers, using external events to control iMotions remotely, and changing the experience in simulations based on behavior.
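As an illustration of that kind of integration, the sketch below uses the open-source Lab Streaming Layer Python bindings (pylsl) to push event markers from an external application. The stream name and marker labels are hypothetical, and it assumes the receiving side (such as a study configured for LSL input) is listening for a marker stream.

    # Minimal sketch: push hypothetical simulation events as LSL markers so a
    # receiver configured for LSL input can record them in sync with biosensor
    # data. Requires: pip install pylsl
    from pylsl import StreamInfo, StreamOutlet

    # Describe an irregular-rate, single-channel string marker stream
    # (all names below are illustrative assumptions).
    info = StreamInfo(
        name="SimEvents",
        type="Markers",
        channel_count=1,
        nominal_srate=0,          # irregular rate, markers sent as events occur
        channel_format="string",
        source_id="sim-rig-001",
    )
    outlet = StreamOutlet(info)

    # Emit a marker whenever something noteworthy happens in the simulation.
    outlet.push_sample(["scenario_start"])
    outlet.push_sample(["pedestrian_crossing"])
    outlet.push_sample(["scenario_end"])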

A study participant wearing a virtual reality headset, with an emotional measurement graph displayed on a second screen

Benefits of neurotechnology software for research

Integrate innovative methodologies into product ideation and testing.

Save time and money

Better synchronization & instant analysis solutions allow a faster, cheaper turnaround in product testing and design validation.

Launch with success

Testing based on nonconscious human responses surfaces users’ expectations and requirements, so you have a greater probability of first-time success at product launch.

Conduct better studies

Collecting data from more sources with the same synchronization method, as well as combining multiple studies, ensures scalability, validity & reliability.

Future-proof

iMotions integrates with the newest tools as they become available, can import and export many data sources through our API & LSL, and even supports compatibility with certain legacy hardware.

Engineering applications

Use iMotions to detect human emotions and their causes in driving environments, shifting the design focus toward a better understanding of driver needs, desires, and reactions.

Record eye tracking, facial expressions, EEG, and other metrics from the driver in sync with input events from the driving session. Assess where drivers are looking and what they fail to see, and study how they perceive and respond to new devices and solutions.

Driver studies · Human-computer interaction · Machine Learning & AI · Software R&D

A car simulator lab for a MythBusters experiment, powered by the iMotions API

Academia · Customer case

The MythBusters team wanted to determine whether driving while talking hands-free is really less dangerous than talking while holding a cell phone. iMotions powered the experiment setup, integrating a car simulator with biosensors. The iMotions API was used to receive trigger events and to send the driver’s biosignal data in sync with their driving actions for further analysis.
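The exact integration used in that study is not reproduced here; purely as a generic illustration, the sketch below shows how an analysis script could consume event markers from a Lab Streaming Layer stream (via pylsl), so driving events can later be aligned with separately recorded biosignal data. The stream type, marker labels, and timeout values are assumptions.

    # Minimal sketch (generic, not the MythBusters setup): read an LSL marker
    # stream on the analysis side and keep timestamped labels for later
    # alignment with biosignal recordings. Requires: pip install pylsl
    from pylsl import StreamInlet, resolve_byprop

    # Find a marker stream on the local network (type name is illustrative).
    streams = resolve_byprop("type", "Markers", timeout=10)
    if not streams:
        raise RuntimeError("No LSL marker stream found")

    inlet = StreamInlet(streams[0])
    events = []
    while True:
        marker, timestamp = inlet.pull_sample(timeout=1.0)
        if marker is None:                  # no new event within the timeout
            continue
        events.append((timestamp, marker[0]))
        if marker[0] == "scenario_end":     # hypothetical end-of-session marker
            break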

VR prototyping

R&D and Iterative Testing

During or after the design phase, measuring physiological and emotional responses to products or prototypes helps you unearth design elements or usability features that have gone unnoticed, need improvement, or attract attention. Biosensor data helps you quantify and validate results in real time, saving time in designing products that achieve what you intend.

From A/B and benchmark testing to experience measurement and think-alouds, we help you assess your product design and the effectiveness of any interface before going to market.
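Purely as an illustrative example of this kind of benchmark analysis, and assuming an exported per-participant metric with hypothetical file and column names (not an actual iMotions export format), a quick A/B comparison of two design variants might look like this:

    # Minimal sketch: compare a hypothetical exported engagement metric
    # between two design variants with Welch's t-test.
    import pandas as pd
    from scipy import stats

    df = pd.read_csv("engagement_export.csv")   # hypothetical export file
    variant_a = df.loc[df["variant"] == "A", "engagement_score"]
    variant_b = df.loc[df["variant"] == "B", "engagement_score"]

    t_stat, p_value = stats.ttest_ind(variant_a, variant_b, equal_var=False)
    print(f"Variant A mean: {variant_a.mean():.2f}")
    print(f"Variant B mean: {variant_b.mean():.2f}")
    print(f"Welch t = {t_stat:.2f}, p = {p_value:.3f}")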

Prototypes · Product Design & Testing · Product Optimization · Human Factor Studies

Combine biosensors and uncover real human responses

The iMotions software integrates eye tracking, facial expression analysis, electrocardiogram, and more to generate a more complete snapshot of human behavior.

iMotions provided the opportunity for us to stay agile. Whenever some nice, new hardware was available, we were able to follow that and still use iMotions.

Dr. Jesper Ejdorf Brøsted
Psychologist, MA, Ph.D - Force Technology

Want to know more?

Read our collection of articles on human behavior research

Publications

Read publications made possible with iMotions

Blog

Follow blog contributions from our expert PhDs

Newsletter

A monthly close-up of the latest product and research news