Dr. Robert Atkinson

Associate Professor, Arizona State University
Director of the Advancing Next Generation Learning Environments (ANGLE) Lab
Director of the Innovative Learner & User Experience (iLUX) Lab


  • More studies can be published because the software shifts effort from programming during study preparation to actually conducting research
  • Training an undergraduate student to set up and run a study independently usually takes only a day because the software is so easy to use
  • Allows the personnel budget to shift from programmers to experimenters, enabling more research and, as a result, more dissemination opportunities


Arizona State University (ASU) is currently conducting affective computing research, exploring new ways of detecting changes in people’s affective states so that learning environments can be adjusted (e.g., the difficulty level) to improve learning outcomes. The goal is to create a closed-loop dynamic system that can adjust itself based on changes in the user’s affective state. For this kind of research, ASU uses a multimodal sensor suite.

Before using iMotions, researchers at ASU wrestled with the integration and synchronization of sensor data. They began using eye-tracking technology 13 years ago but soon discovered that gaze-tracking and visual-attention data are even more powerful when combined with sensors that detect affect. The first sensor they decided to integrate was an EEG headset, which posed fairly complicated data synchronization and integration programming challenges before it could return useful metrics for analysis. Graduate-level Computer Science students were then brought into the research effort and were able to synchronize the data, but only for a fixed set of equipment. Their efforts and innovation solved the synchronization issues but provided very limited data visualization and annotation functionality. Shortly after incorporating the low-end EEG headset, other sensors and higher-quality EEG headsets were brought into the lab, which would have required even more resources and skills to integrate and synchronize the multimodal data sources had they kept building their own solution.


Since adopting iMotions, ASU has published more than ten conference proceedings and manuscripts supported by the software, with an additional ten in progress. Ninety percent of all the work done in both ASU labs involves iMotions, which minimizes programming effort. “iMotions helps to increase the number of publications. We are able to put more resources into paying students to conduct the research that is important to the lab and the larger research community. Since the iMotions design and interface are so easy and intuitive, undergraduates without a technical background can run it. With iMotions I am able to have a very high volume of students working with it,” says Dr. Robert Atkinson, Associate Professor at Arizona State University. Researchers from the ASU lab are also able to spend less time training new research assistants because the software is so user-friendly.


ASU’s initial research setup consisted of a computer and an eye tracker, to which they later added an EEG headset and a GSR sensor. With the iMotions software, they could suddenly plug and play with various sensors and switch between devices as often as they wanted without facing calibration and synchronization issues. Pleased with the outcome of combining sensors, they wanted to take it to the next level and increase the sophistication of their lab. Today ASU runs two state-of-the-art labs, both of which rely on the iMotions software to combine sensor suites consisting of eye tracking (fixed and mobile), EEG, mobile stands, and GSR equipment.

With the help of iMotions, ASU researchers can now easily integrate different sensors and receive fully synchronized, high-quality data. Because they no longer have to invest a large share of their funding in the Computer Science students previously dedicated to this task, more funding is available to address important research questions rather than the technical issues of data synchronization.


“iMotions enables us to do more research in a more efficient manner and with a relatively small budget. Most importantly, I can focus on addressing my research questions instead of fighting with the integration and synchronization issues involved with using multiple high-tech sensors. It really is a powerful tool.”