attuned to humans
with the world’s leading biosensor platform
Create successful technology with data from humans
Through machine learning and cutting-edge signal processing, computational models can now be trained to react and interact with humans in ways that are increasingly attuned to our physiological signals.
Collect and process human data
The ubiquity of computers in everyday life has brought about the need for computers to better react to human behavior. Using human data in computational models allows for the creation of machines that are more closely adapted to human responses.
This data can also be used to validate existing models by providing a human benchmark against which computers can be tested. Biosensor data supports both of these scenarios and can be readily collected in iMotions.
Examples of what clients are currently using iMotions for include:
- Incorporating emotion detection into machine learning models of languages
- Modelling human task error rates with biosensor data
- Integrating EEG data with computer use to create adaptive systems
- Training recommender systems with emotion data
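As an illustration of the last use case, a recommender's training signal can be reweighted by emotion data, so that items viewed with positive expressions count for more than their raw ratings alone. The sketch below is purely hypothetical and not part of iMotions; the item names, valence scores, and weighting scheme are all illustrative assumptions.

```python
# Hypothetical sketch: weighting explicit ratings by facial-expression
# valence (e.g. from automated facial coding) before scoring items.
# All data and the weighting scheme are illustrative assumptions.

def emotion_weighted_scores(ratings, valence, alpha=0.5):
    """Combine 1-5 ratings with valence scores in [-1, 1].

    ratings: {item: [rating, ...]}
    valence: {item: [valence, ...]} aligned with ratings
    alpha:   how strongly valence modulates each rating's weight
    """
    scores = {}
    for item, rs in ratings.items():
        # A positive expression up-weights the rating, a negative one
        # down-weights it; with alpha <= 1 the weights stay positive.
        weights = [1.0 + alpha * v for v in valence[item]]
        scores[item] = sum(r * w for r, w in zip(rs, weights)) / sum(weights)
    return scores

ratings = {"clip_a": [4, 5], "clip_b": [4, 4]}
valence = {"clip_a": [0.8, 0.6], "clip_b": [-0.5, -0.2]}
scores = emotion_weighted_scores(ratings, valence)
print(scores)
```

Here `clip_a` ends up scored above `clip_b` even though the raw ratings are similar, because viewers' expressions while watching it were more positive.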
iMotions is used at over 50 of the 100 top ranking universities in the world
Create richer computational models
Leverage physiological data in real-time
Technologies such as eye trackers, EEG headsets, facial expression analysis, and EDA devices allow accurate quantification of physiological signals that can be easily communicated to computational models.
Combining these modalities yields an even richer and more detailed dataset from which models can draw conclusions.
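One common way to combine modalities is early (feature-level) fusion: time-align each stream, normalize it, and concatenate the per-window values into a single feature vector. A minimal sketch, assuming synchronized per-window samples; the stream names and values are illustrative assumptions, not iMotions output.

```python
# Minimal sketch of early (feature-level) fusion: z-score each
# synchronized biosensor stream, then concatenate per time window.
# Stream names and sample values are illustrative assumptions.
from statistics import mean, stdev

def zscore(xs):
    m, s = mean(xs), stdev(xs)
    return [(x - m) / s for x in xs]

def fuse(streams):
    """streams: {name: [sample, ...]}, all of equal length.

    Returns one concatenated feature vector per time window,
    with features in a fixed (sorted-by-name) order.
    """
    normalized = {name: zscore(xs) for name, xs in streams.items()}
    names = sorted(normalized)
    length = len(next(iter(streams.values())))
    return [[normalized[n][t] for n in names] for t in range(length)]

streams = {
    "eda_microsiemens": [0.41, 0.45, 0.52, 0.61],
    "pupil_mm": [3.1, 3.0, 3.4, 3.6],
    "eeg_alpha_power": [12.0, 11.5, 10.8, 10.1],
}
vectors = fuse(streams)
print(len(vectors), len(vectors[0]))  # one 3-feature vector per window
```

Z-scoring puts streams with very different units (microsiemens, millimeters, power) on a comparable scale before a downstream model consumes them.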
The iMotions software is a complete experimental platform that allows the entire process to be run, from study design and stimulus presentation, to data collection and export or analysis.
Key features to simplify and amplify your work
Connect. Record. Process.
Award winning Game Design Research
Dr. Alessandro Canossa at Northeastern University conducts sophisticated player behavior studies using iMotions in a fraction of the time previously required. He has won a best paper award for this work.
Research made possible with iMotions
Carnegie Mellon University – Hai Pham, Paul Pu Liang, Thomas Manzini, Louis-Philippe Morency, Barnabás Póczos
Abstract: “Multimodal sentiment analysis is a core research area that studies speaker sentiment expressed from the language, visual, and acoustic modalities…”
Want to know more?
Read our collection of articles within human behavior research
iMotions is used in over 70 countries
Our offices are situated worldwide in close contact with our customers