Powered by the iMotions platform, Stanford University has created one of the world’s most advanced simulators, integrating biosensors and other human behavior technologies.

The innovative use of physiological measurement in a simulated environment offers an even deeper look at how participants respond to the events presented. We offer a comprehensive platform that synchronizes information from multiple sensors and combines it, when needed, with simulator events and the participant’s actions. By using the live stream from eye tracking glasses together with a video feed of the person in the driver’s seat, it is now possible to collect gaze data from the dynamic situations posed by the simulation, as well as the emotions derived from the driver’s facial expressions.

Companies such as Realtime Technologies (RTI) offer a multitude of simulators built to handle an almost limitless number of scenarios, ranging from full-size car simulators to portable systems. These simulators provide the flexibility to accommodate complex research questions in driving technology and behavior, and they are used not only for car research but also for training personnel in trains, planes, and other modes of transport.


In real-car environments, data from the vehicle’s CAN bus can also be included, and data from EEG, GSR, ECG, EMG, eye tracking, and facial expression analysis can be fully synchronized to give the best possible read on human behavior while driving.
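To illustrate the idea behind this kind of synchronization, here is a minimal sketch of aligning two sensor streams by nearest timestamp. The stream names, sample values, and the `nearest_sample` helper are all hypothetical, for illustration only; real recordings would come from the acquisition hardware and share a common time base.

```python
import bisect

# Hypothetical sample streams: lists of (timestamp_ms, value) pairs,
# assumed to already share a common clock.
can_speed = [(0, 42.0), (100, 43.5), (200, 44.1)]   # vehicle speed from the CAN bus
gsr       = [(10, 0.81), (110, 0.84), (205, 0.90)]  # skin conductance samples

def nearest_sample(stream, t):
    """Return the sample in `stream` whose timestamp is closest to t."""
    times = [ts for ts, _ in stream]
    i = bisect.bisect_left(times, t)
    # The closest sample is either just before or just after position i.
    candidates = stream[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda s: abs(s[0] - t))

# Align each CAN sample with the nearest GSR sample on the shared timeline:
aligned = [(t, speed, nearest_sample(gsr, t)[1]) for t, speed in can_speed]
```

In practice a platform handling many high-rate streams would use more sophisticated interpolation and clock-drift correction, but nearest-timestamp matching captures the core of combining sensor data on one timeline.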

Car Simulator
In a recent study, Attention Tool was used to collect data from ASL eye tracking glasses, facial expression analysis, and the car’s CAN bus. The following example shows a sample taken from the study:

Attention Tool Graphs

The image above shows the synchronized feed combining data from multiple sources: the emotions decoded through facial expression analysis alongside the simulator events, changes in pupil size in graphical form, and video of the respondent and of what they are looking at in the scenario.

You can see some additional examples, as well as a video of the integration, from the Stanford Center for Design Research on driving simulator research below:

If you’re interested in adding physiological data collection to your car simulator lab, contact us today so we can work out the best solution for your simulations.
