Online Data Collection
Powering quick, cost-effective research across the globe
Online Data Collection Overview
The iMotions Online Data Collection Module
Bring your research online
iMotions Online Data Collection uses a browser interface and a participant’s own webcam to collect Facial Expression Analysis and Eye Tracking data, which can be combined with online surveys. Data can then be processed and analyzed in the iMotions Desktop Solution, maintaining data integrity without compromising full, flexible functionality.
Research goals and design
An example of how it works
To compare responses to two different sets of marketing materials for a car company, we sent out a link through which subjects could view and respond to images and videos. To measure the change created by ad exposure, a visual perception board is presented between the stimuli, along with a shopping task that shows how engagement with the ads translates into shopping behavior, assessed via self-report, eye tracking, and facial expression coding.
Web Marketplace attention sample
E-commerce example use case
- Strong attention to the first object in the list, which then trails off right to left and top to bottom
- No facial engagement in a non-social task
Our collaboration with iMotions made it possible for us to transition to online data collection while keeping our research study timeline on track during COVID-19 lockdowns. Built upon the iMotions Lab solution, the Online Data Collection module gives us the flexibility to go as narrow or as wide as the study warrants in terms of geography, ethnicity, or gender. We have found it very useful as a supplement to our on-premises lab assets, with quick turnaround time, as data is collected remotely in parallel via webcam.
Now compatible with SONA Systems participant management platform
Recruit panelists, disseminate online study links, and collect responses with the easy-to-use experiment builder, which is fully integrated with SONA Systems.
Limitations and considerations
- Oversample by 2x to account for attrition
- A computer with a webcam, ideally placed on a stable surface such as a desk rather than on the lap
- Good indoor lighting on the face, ideally with the light source in front of the participant; backlighting or side lighting casts shadows that can affect the facial coding algorithm
- No mobile phones or tablets
- Google Chrome is recommended; Firefox and Edge are also supported, but Safari is not
- Not all stimulus types are supported: images, videos, some websites, and Qualtrics surveys are
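The 2x oversampling guideline above is simple arithmetic; here is a hypothetical sketch (the `recruit_count` helper and the 50% retention assumption are illustrative, not part of the iMotions tooling):

```python
import math

def recruit_count(target_n: int, expected_retention: float = 0.5) -> int:
    """Number of participants to recruit so that, after attrition,
    roughly target_n usable recordings remain.

    expected_retention=0.5 matches the '2x oversample' rule of thumb.
    """
    if not 0 < expected_retention <= 1:
        raise ValueError("expected_retention must be in (0, 1]")
    return math.ceil(target_n / expected_retention)

print(recruit_count(100))        # 2x rule: recruit 200 for 100 usable
print(recruit_count(100, 0.8))   # milder attrition: recruit 125
```

Actual attrition varies by study length, stimulus type, and panel quality, so the retention rate should be treated as a planning estimate rather than a fixed constant.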
Frequently Asked Questions about Online Data Collection
- What types of stimuli are supported? Currently, we support images, video, audio, and websites through a browser interface. Some browser restrictions apply. You also have the option to insert a survey.
- What kinds of stimuli are most or least likely to elicit visible facial expressions? Facial expressions evolved as a social signal, so certain stimuli are better at eliciting them: videos and emotive stimuli are more likely to provoke reactions than text, for example. For text, website browsing, and reading, you might see instances of negative affect such as brow furrow, which may represent concentration rather than displeasure, since people rarely laugh while reading. For these types of stimuli, responses will be more attenuated and may even skew negative. Take care to understand which facial action units you are analyzing and their probability thresholds for the specific study you are conducting.
- How accurate is webcam eye tracking? Webcam eye tracking inherently has inferior accuracy compared to infrared eye tracking hardware, so lighting conditions, camera positioning, and other factors contribute greatly to the quality of your eye tracking data. It is crucial to oversample your participant pool to deliver the yield you are looking for.
- How sensitive are facial expression analysis & eye tracking to lighting conditions? In addition to a camera and microphone check (if you choose to record audio), Online Data Collection features an eye and face detection check to register the presence of the respondent in the webcam frame. Webcam eye tracking is highly reliant on adequate lighting, so please make sure to communicate to your participants that they should be well-lit and well-positioned in front of the screen.
- Is it possible to get pupil dilation out of the webcam eye tracking data? We do not encourage conducting pupillometry studies with webcam eye tracking because of the difficulty of controlling for light, which affects even tightly monitored lab settings. Any change in light affects pupil dilation in addition to the arousal you are looking to measure, so getting good data is hard compared to the cleaner data you can glean with infrared trackers in highly controlled labs.
- What about other sensors like EEG or Heart Rate? Can I integrate those? For ODC, we have focused on FEA & eye tracking because of their ease of setup for participants. We are exploring wearable sensors such as Bluetooth devices to monitor things like heart rate, also in conjunction with our Mobile Research Platform.
- Is it possible to manually move between stimuli? Yes! You may designate either timed transitions or manual advance (i.e., “click to move to the next section”) logic in your Online Data Collection study.
- Are there requirements for Windows vs Mac computers? For participants, all that is needed is a webcam and a browser on a desktop or laptop setup (currently, we do not support data collection from tablets or smartphones), so both PC and Mac are compatible. However, the experimenter conducting and analyzing the study will need to run the iMotions software on a PC.
- Are there specific requirements for participants with regards to wifi connection, laptop specifications, or webcam quality? Since ODC is browser-based, a browser and webcam are the most important elements, so there is no need for high-end laptop specs or HD/4K cameras – a simple webcam built into a PC or laptop is fine, with a minimum resolution of 640 x 480 at 30 fps. The lighting conditions matter most. However, participants should have a stable internet connection to prevent data loss during the study.
- What are my analysis options? Can I get and export raw data for eye tracking or facial action units? You can feed your facial expression analysis recordings into iMotions for post-processing with Affectiva or RealEyes, which gives you access to full, transparent R-based FEA metrics. A range of eye tracking metrics are also available, including heatmaps, dwell time, and more. You can also export summary metrics and timestamped sensor data as a text file that is easy to import into Excel or statistical programs.
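A timestamped text export like the one described above can be read with standard tooling. This is a minimal sketch using Python's standard library; the column names, values, and tab delimiter here are illustrative assumptions, as the actual header depends on the sensors and metrics in your study:

```python
import csv
import io

# Hypothetical excerpt of a timestamped sensor export; real column
# names depend on the sensors and metrics included in the study.
raw = """Timestamp\tRespondent\tGazeX\tGazeY\tJoy
0\tP01\t512\t384\t0.02
33\tP01\t515\t380\t0.05
66\tP01\t520\t379\t0.11
"""

rows = list(csv.DictReader(io.StringIO(raw), delimiter="\t"))

# Average the 'Joy' metric across samples for one respondent.
joy = [float(r["Joy"]) for r in rows if r["Respondent"] == "P01"]
mean_joy = sum(joy) / len(joy)
print(round(mean_joy, 3))  # → 0.06
```

In practice you would read the exported file from disk (e.g. with `open(path)` in place of `io.StringIO`) and aggregate per stimulus or per respondent before statistical analysis.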
- Are the algorithms used for analysis transparent? How much of a “black box” will I run into? Affectiva, one of our FEA engines, has published accuracy measures in its literature and conference proceedings, but some aspects of its algorithms are not transparent because of intellectual property protections. For eye tracking, we use our own algorithms, and we have made our R code available with citations and explanations of methodology. We are also working on whitepapers and theoretical documentation for publishers in relation to webcam eye tracking.
Online Data Collection module starts at 2,900 USD*
Sound like something for your research?
*Pricing is customized based on your industry and depends on whether you already have the iMotions software. The ODC Module itself starts from 2,900 USD, including 1,000 respondents.
Want more information?
If you are not yet an iMotions customer, contact us using the form below to learn how to get started with behavioral research, including online studies. General iMotions pricing models can be found on our Pricing Page.
Or download the one-pager for the module description & system requirements.