Hardware Specifications

Designed for the real world, the third-generation Tobii Pro Glasses wearable eye tracker lets you conduct behavioral research in a wide range of settings. Tobii Pro Glasses 3 delivers robust eye tracking and accurate gaze data while giving participants the freedom to move and interact naturally.

The Tobii Pro Eye Tracking Glasses 3 are compatible with the iMotions Eye Tracking Glasses Module.

Model: Tobii Pro Glasses 3
Sample rate: 50 Hz or 100 Hz
Calibration procedure: 1 point
Binocular eye tracking: Yes
Parallax compensation: Automatic
Slippage compensation: Yes, 3D eye tracking mode
Pupil measurement: Yes, absolute measure
Material: Grilamid plastic, stainless steel, optical-grade plastic lenses
Nose pads: Grilamid plastic, with clip-on attachments
Scene camera format and resolution: 1920 × 1080 @ 25 fps
Scene camera field of view (diagonal): 106 deg., 16:9 format
Scene camera field of view (horizontal and vertical): 95 deg. horizontal / 63 deg. vertical
Weight: 76.5 g, including cable
Frame dimensions: 153 × 168 × 51 mm
Cable length: 1200 mm
Audio: 16-bit mono, integrated microphone
Design characteristics: Lightweight and discreet
Number of eye tracking sensors: 4 sensors (2 per eye)
Fixed geometry: Yes
Sensors: Gyroscope, accelerometer, magnetometer: ST™ LSM9DS1 (sampled at 100 Hz)
Battery recording time: 105 min
Battery type: Rechargeable 18650 Li-ion, capacity 3,400 mAh
Storage media: SD card (SDXC, SDHC)
Connectors: Micro USB, RJ45 (Ethernet), 3.5 mm jack (sync port), head unit connector
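
For orientation, the scene camera's field of view from the table above can be used to convert a gaze point into an approximate visual angle. The sketch below is illustrative only and is not part of the Tobii Pro or iMotions software; it assumes normalized (0–1) gaze coordinates with the origin at the top-left of the scene image, an ideal pinhole camera, and no lens-distortion correction.

```python
import math

# Scene camera field of view from the specification table (degrees).
FOV_H_DEG = 95.0  # horizontal
FOV_V_DEG = 63.0  # vertical

def normalized_gaze_to_visual_angle(x_norm: float, y_norm: float) -> tuple[float, float]:
    """Convert a normalized gaze point (0..1, origin top-left) in the scene
    camera image to horizontal/vertical visual angles (degrees) relative to
    the camera's optical axis. Assumes an ideal pinhole camera and ignores
    lens distortion, so the result is only an approximation."""
    half_h = math.radians(FOV_H_DEG / 2.0)
    half_v = math.radians(FOV_V_DEG / 2.0)
    # Offset from image center, projected onto the camera's tangent plane.
    angle_h = math.degrees(math.atan((x_norm - 0.5) * 2.0 * math.tan(half_h)))
    angle_v = math.degrees(math.atan((0.5 - y_norm) * 2.0 * math.tan(half_v)))
    return angle_h, angle_v

# Example: a gaze point slightly right of and above the image center.
print(normalized_gaze_to_visual_angle(0.6, 0.4))
```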

Powerful software for powerful research

To analyze the data you’ve gathered with eye tracking glasses like the Tobii Pro Glasses 3, you’ll need software that delivers the precision and accuracy your study requires. With the iMotions Eye Tracking Glasses Module, you can analyze your real-world eye tracking data with metrics such as:

  • Heatmaps
  • Gaze replays
  • Areas of interest (AOI)
  • Time to first fixation (illustrated in the sketch after this list)
  • Automated gaze-mapping: converts gaze from dynamic environments into static scenes for simpler aggregation and analysis
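
As a rough illustration of how a metric like time to first fixation can be computed from exported fixation data, here is a minimal sketch. The data format (fixations with an onset time and scene-camera coordinates) and the rectangular AOI definition are assumptions made for illustration; they do not represent the iMotions export format or API.

```python
from dataclasses import dataclass
from typing import Optional, Sequence

@dataclass
class Fixation:
    start_ms: float  # fixation onset, milliseconds from recording start (assumed format)
    x: float         # normalized scene-camera x coordinate (0..1)
    y: float         # normalized scene-camera y coordinate (0..1)

@dataclass
class RectAOI:
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def time_to_first_fixation(fixations: Sequence[Fixation], aoi: RectAOI) -> Optional[float]:
    """Return the onset time (ms) of the first fixation landing inside the AOI,
    or None if the AOI was never fixated."""
    for fix in sorted(fixations, key=lambda f: f.start_ms):
        if aoi.contains(fix.x, fix.y):
            return fix.start_ms
    return None

# Example usage with made-up fixation data and a hypothetical AOI.
fixations = [Fixation(120, 0.2, 0.3), Fixation(450, 0.7, 0.6), Fixation(900, 0.72, 0.58)]
shelf_aoi = RectAOI(0.65, 0.5, 0.85, 0.7)
print(time_to_first_fixation(fixations, shelf_aoi))  # -> 450
```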