A huge thanks to our beta testers
Ready to upgrade? Get the latest version of iMotions here.
Other features recently added to iMotions Lab
With iMotions 9.4 we are excited to introduce a brand new module – voice analysis. Powered by audEERING, the voice analysis module allows for analysis at every level of vocal production. Go deep into emotion analysis and collect data related to emotion detection (angry, happy, sad, neutral) and emotion dimensions (arousal, dominance, and valence). It’s also possible to explore fundamental voice features with metrics relating to prosody – including pitch, loudness, speaking rate, and intonation – as well as data on perceived gender and age. Any audio data – whether recorded within an experiment or simply imported into iMotions – can be processed with audEERING’s voice analysis algorithms. All data processing takes place offline on your own hardware, ensuring full control of the data.
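To make two of the prosody metrics concrete, here is a minimal sketch of how pitch (fundamental frequency) and loudness could be estimated from a raw audio signal. This is an illustration using basic autocorrelation and RMS level, not audEERING’s actual algorithms; all function names and parameters are our own.

```python
import numpy as np

def estimate_pitch(signal, sample_rate, fmin=50.0, fmax=500.0):
    """Estimate fundamental frequency (Hz) via autocorrelation."""
    sig = signal - signal.mean()
    corr = np.correlate(sig, sig, mode="full")[len(sig) - 1:]
    lag_min = int(sample_rate / fmax)   # shortest period considered
    lag_max = int(sample_rate / fmin)   # longest period considered
    best_lag = lag_min + np.argmax(corr[lag_min:lag_max])
    return sample_rate / best_lag

def rms_loudness_db(signal):
    """Root-mean-square level in decibels relative to full scale."""
    rms = np.sqrt(np.mean(signal ** 2))
    return 20 * np.log10(rms + 1e-12)

# Synthetic test tone: 220 Hz sine, half a second at 8 kHz
sr = 8000
t = np.arange(4000) / sr
tone = 0.5 * np.sin(2 * np.pi * 220 * t)
pitch = estimate_pitch(tone, sr)   # close to 220 Hz
level = rms_loudness_db(tone)
```

Real voice analysis works frame by frame over short windows and handles unvoiced segments, but the same underlying quantities – periodicity for pitch, signal energy for loudness – are what prosody metrics build on.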
This module can also be easily combined with the new speech-to-text feature to assess the semantic value of words alongside the valence related to their production.
The integration of AssemblyAI’s speech recognition API into iMotions Lab enables users to import, transcribe, and analyze audio and video files. This feature supports multiple languages and offers capabilities like speaker detection, sentiment analysis, and speech summarization. Useful across fields such as academic research, market research, and customer experience, it enables comprehensive analysis of verbal data, supporting a deeper understanding of human communication and better-informed decision-making.
Read more about the feature here.
We are excited to release emotional heatmaps – gaze plots that color-code the facial expressions your respondents made while looking at different areas of an image or other static visuals. Facial expression analysis is calculated and aggregated for each static stimulus, and any of those metrics can be selected to form the basis of the emotional heatmap. This provides a quick, at-a-glance overview of how participants’ facial expressions change while they view an image. You can find more information about implementing emotional heatmaps in the Help Center article here.
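Conceptually, an emotional heatmap averages an expression metric over the screen locations where gaze landed. Here is a minimal sketch of that aggregation – our own illustration, not the iMotions implementation; the grid size and metric are assumptions.

```python
import numpy as np

def emotion_heatmap(gaze_xy, metric, width, height, bins=(40, 30)):
    """Average an expression metric (e.g. joy intensity) per screen cell.

    gaze_xy: (N, 2) gaze points in pixels; metric: (N,) metric values
    sampled at the same timestamps. Returns a bins[1] x bins[0] grid of
    mean metric values (NaN where no gaze landed).
    """
    gaze_xy = np.asarray(gaze_xy, dtype=float)
    metric = np.asarray(metric, dtype=float)
    rng = [[0, width], [0, height]]
    sums, _, _ = np.histogram2d(gaze_xy[:, 0], gaze_xy[:, 1],
                                bins=bins, range=rng, weights=metric)
    counts, _, _ = np.histogram2d(gaze_xy[:, 0], gaze_xy[:, 1],
                                  bins=bins, range=rng)
    with np.errstate(invalid="ignore"):
        return (sums / counts).T  # rows = vertical cells

# Two gaze samples in one corner cell, one near screen center
grid = emotion_heatmap([[10, 10], [12, 12], [900, 500]],
                       [0.2, 0.4, 0.9], width=1920, height=1080)
```

Cells with no fixations stay NaN, so only viewed regions are colored – which is exactly what makes the heatmap readable as a gaze-weighted emotion map.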
EEG intersubject correlation calculation
You can now calculate and export EEG intersubject correlation scores. This is available as a new R notebook, and provides insights into how well synchronized the EEG signals from multiple participants are. This metric is well established in research and is seeing increasing use across a range of new disciplines and industries. It is particularly valuable when assessing the shared level of engagement between participants while they watch or listen to stimuli. Examples of neural synchrony being applied in real-world research are available in this report here, and in the Help Center article here (customers only).
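The core idea behind intersubject correlation is simple: correlate each pair of participants’ signals and average over all pairs. The sketch below shows that computation for one channel using Pearson correlation; it is a conceptual illustration in Python, not the R notebook shipped with iMotions.

```python
import numpy as np

def intersubject_correlation(eeg):
    """Mean pairwise Pearson correlation across participants.

    eeg: (n_participants, n_samples) array holding one channel's
    time series per participant. Returns the average correlation
    over all distinct participant pairs.
    """
    r = np.corrcoef(eeg)              # participant x participant matrix
    n = r.shape[0]
    iu = np.triu_indices(n, k=1)      # each pair counted once
    return r[iu].mean()

# Three participants: two share a 5 Hz signal, one is pure noise
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)
shared = np.sin(2 * np.pi * 5 * t)
eeg = np.vstack([shared, shared, rng.standard_normal(500)])
isc = intersubject_correlation(eeg)   # roughly 1/3: one perfect pair
```

A high score indicates that stimulus-driven activity dominates over idiosyncratic activity, which is why the metric works as a proxy for shared engagement.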
A variety of improvements and additions have been made to several eye tracking glasses systems. This includes updated import options for Pupil Labs studies, helping to streamline the research workflow. Gyroscope, accelerometer, pitch, and roll data are now available for Neon by Pupil Labs. You can now also import both Raw Sensor Data and Timeseries Data downloaded from Pupil Cloud into iMotions, and it’s also possible to re-import data from Pupil Cloud with fisheye correction. Additionally, the Viewpointsystem VPS19 glasses now support exporting .mp4 videos, and UI updates provide improved feedback when importing eye tracking glasses data from the wrong folder, smoothing out the study workflow.
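For readers working with the new accelerometer streams, pitch and roll can be derived from a static accelerometer reading with standard trigonometry. This sketch uses one common axis convention (x forward, y right, z down, gravity dominant at rest); the convention used by a specific headset may differ, so treat the axis mapping as an assumption.

```python
import math

def pitch_roll_deg(ax, ay, az):
    """Pitch and roll (degrees) from a static accelerometer reading.

    Assumes gravity dominates (device roughly still) and an
    aerospace-style axis convention: x forward, y right, z down.
    """
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Device held level: gravity lies entirely on the z axis
p, r = pitch_roll_deg(0.0, 0.0, 9.81)   # both angles are 0
```

Gyroscope data complements this: integrating angular velocity tracks fast head rotations, while the accelerometer anchors the absolute tilt.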
You can download our Eye Tracking Glasses Pocket Guide here.
WebET 3.0 brings big improvements to accuracy
With the update to WebET 3.0, our webcam eye tracking algorithm is now four times more accurate than the previous version, and considerably more robust across different environments.
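Eye tracking accuracy is conventionally reported in degrees of visual angle rather than pixels, since the same on-screen error looks smaller from farther away. Here is a small sketch of that conversion; the screen density and viewing distance are illustrative assumptions, not WebET parameters.

```python
import math

def gaze_error_deg(error_px, px_per_cm, viewing_distance_cm):
    """Convert an on-screen gaze error (pixels) to degrees of visual angle."""
    error_cm = error_px / px_per_cm
    return math.degrees(math.atan2(error_cm, viewing_distance_cm))

# 50 px error on a ~38 px/cm display viewed from 60 cm: about 1.26 degrees
deg = gaze_error_deg(50, 38, 60)
```

Expressed this way, a "4x more accurate" algorithm means the angular error shrinks by that factor for the same viewing setup.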
Upcoming Features and Improvements
Even though iMotions 10 marks a milestone in the development of iMotions Lab, there is still much more to come. General updates and improvements are released once or twice a month, and you can keep track of them in the release notes. Make sure to sign up for our newsletter to be alerted as soon as major new features are released.