Eye tracking is about where we look, what we look at, how much time we spend looking at it, how our pupils react to different kinds of visual stimulation and when we blink.

Put most simply, eye tracking refers to the measurement of eye activity. More specifically, it denotes the recording of eye position (the point of gaze) and eye movement on a 2D screen or in a 3D environment, based on optical tracking of corneal reflections, in order to assess visual attention. While the idea of eye tracking is quite straightforward, the technology behind it might strike you as rather complex and inscrutable.

No need to hit the panic button. The following pages are packed with all the need-to-knows and useful tools to help you get a solid grasp of eye tracking technology and best practices.

N.B. this post is an excerpt from our Eye Tracking Pocket Guide. You can download your free copy below and get even more insights into the world of Eye Tracking.

The technology behind eye tracking

How exactly does eye tracking work?

Eye tracking is on the rise. While early devices were highly obtrusive and involved unduly cumbersome procedures, eye trackers have undergone quite a technological revolution in recent years. Long gone are the rigid experimental setups and seating arrangements you might think of.

Modern eye trackers are hardly any larger than smartphones and provide an extremely natural experience for respondents. Remote, non-intrusive methods now render eye tracking an easy-to-use, accessible tool in human behavior research that allows researchers to objectively measure eye movements in real time.

Modern day eye trackers

Most modern eye trackers utilize near-infrared illumination along with a high-resolution camera (or other optical sensor) to track gaze direction. The underlying concept, commonly referred to as Pupil Center Corneal Reflection (PCCR), is actually rather simple. The math behind it is… well, a bit more complex. We won't bore you with the nature of the algorithms at this point.

Here's the bottom line of how it works: near-infrared light is directed toward the center of the eye (the pupil), causing visible reflections on the cornea (the outermost optical element of the eye), which are tracked by a camera.
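To make the PCCR idea a little more concrete, here is a minimal, illustrative sketch in Python (not any vendor's actual algorithm): the vector between the detected pupil center and the corneal reflection (the "glint") is mapped to on-screen gaze coordinates via a calibration fit. The function names and the simple affine mapping are our own assumptions; real systems use more elaborate polynomial or 3D model-based mappings.

```python
import numpy as np

def fit_calibration(pupil_glint_vectors, screen_points):
    """Fit a linear mapping from pupil-glint vectors to screen coordinates.

    pupil_glint_vectors: (N, 2) array of (pupil_center - glint) vectors in
    camera pixels, collected while the respondent fixates N known targets.
    screen_points: (N, 2) array of the corresponding target positions.
    Returns a (3, 2) coefficient matrix for an affine map.
    """
    v = np.asarray(pupil_glint_vectors, dtype=float)
    # Design matrix with a bias column: [vx, vy, 1]
    X = np.column_stack([v, np.ones(len(v))])
    coeffs, *_ = np.linalg.lstsq(X, np.asarray(screen_points, float), rcond=None)
    return coeffs

def estimate_gaze(coeffs, pupil_glint_vector):
    """Map one pupil-glint vector to an estimated on-screen gaze point."""
    vx, vy = pupil_glint_vector
    return np.array([vx, vy, 1.0]) @ coeffs
```

In practice this fit would be performed during the calibration routine, with the respondent looking at 5-9 known targets before the experiment starts.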


Why infrared spectrum imaging?

Fair enough to ask. Let's see why imaging in the visible spectrum poses difficulties and should only be a second choice in eye tracking technology.

The accuracy of eye movement measurement relies heavily on a clear demarcation of the pupil and detection of the corneal reflection. While the visible spectrum is likely to generate uncontrolled specular reflections, illuminating the eye with infrared light, which is not perceivable by the human eye, makes demarcating the pupil and the iris an easy task: while the light directly enters the pupil, it simply "bounces off" the iris.

Remote and mobile eye tracking

There are two types of eye trackers: remote (also called screen-based or desktop) and head-mounted (also called mobile).

Remote eye trackers


  • Record eye movements at a distance (no attachments to respondent)
  • Mounted below or placed close to a computer or screen
  • Respondent is seated in front of eye tracker
  • Recommended for studies of screen-based stimuli in lab settings (pictures, videos, websites), offline stimuli (magazines, books, etc.), and other small-scale setups (small shelf studies, etc.)


Head-mounted eye trackers


  • Record eye activity from a close range
  • Mounted onto lightweight eyeglass frames
  • Respondent is able to walk around freely
  • Recommended for observations of objects and task performance in any real-life or virtual environments (usability studies, product testing etc.)

Who uses eye tracking?

Not to spoil your anticipation, but you may be surprised to discover that eye tracking is not exactly a novelty. In fact, it has been around in psychological research for many years.

Given the tight relationship between eye movements and human cognition, it makes intuitive sense to utilize eye tracking as an experimental method to gain insight into eye movement-related brain activity.

See the publications of research conducted with iMotions eye tracking technology for examples.

Now fast forward. Who exactly uses eye tracking and why? Have a look at the most common applications in academic and commercial research.

Eye tracking delivers unmatched value to market research

Why is it that some products make an impression on customers while others just don't get it right? Eye tracking has become a popular, increasingly vital tool in market research. Many leading brands actively utilize eye tracking to assess customer attention to key messages and advertising (TV and print) as well as to evaluate product performance, product and package design, and the overall customer experience. When applied to in-store testing, eye tracking reveals the ease or difficulty of store navigation, search behavior, and the path to purchase.

Eye tracking can help gain deep insights into Human Computer Interaction (HCI)

So what is Human Computer Interaction? Basically, HCI refers to any kind of interaction between a human and a computer of some sort. Think laptops, tablets, smartphones, simulators. Think websites, mobile apps, virtual reality. You get the idea.

Website testing

One emerging field utilizing eye tracking as a methodology is usability and user experience testing. Eye tracking for website testing is a classic. How do people attend to screen real estate, communication, and calls to action (CTAs)? If you're losing out on revenue, eye tracking data can deliver valuable insights into the gaze patterns of your website visitors: how long does it take them to find a specific product on your site, and which visual information do they ignore (but are supposed to see)? Cut to the chase and see what exactly goes wrong. Similar approaches can be applied to mobile apps on tablets and smartphones.




Driving and automotive research

Driving behavior research utilizes head-mounted eye tracking technology combined with a number of other biometric sensors in virtual reality (driving simulators) to gain a better understanding of human behavior in hazardous situations while driving. Where do drivers look when they face obstacles on the street? How does talking on the phone affect driving behavior? How exactly does speeding compromise visual attention? Insights of this kind can help improve hazard awareness and, for example, be applied to safe-driving training and the development of further electronic driving aids.

Automotive research has long embraced head-mounted eye tracking to gauge drivers' visual attention, both with respect to navigation and dashboard layout. In the near future, automobiles might be able to respond to drivers' eye gaze, eye movements, or pupil dilation.

Neuroscience & psychology thrive on eye tracking

How do expectations shape the way we see the world? If you see a picture of a living room, you most likely know where to expect the TV (probably somewhere opposite the couch). If it's in another spot, you might be baffled and gaze around the scene because your "scene semantics" (your rules for how a living room should look) are violated.

Neuroscience and psychology utilize eye tracking to analyze the sequence of gaze patterns to gain deeper insights into cognitive processes underlying attention, learning, and memory. Another research strand addresses how we encode and recall faces – where do we look to extract the emotional state of others? Eyes and mouth are the most important cues, but there’s definitely a lot more to it. Finally, eye tracking provides insights into word processing, particularly how eye movements during reading are affected by the emotional content of the texts.

Learning & education can benefit from eye tracking

Seriously, what if learning were an equally satisfying experience for all of us? What exactly does it take to turn learning into a successful adventure? In recent years, eye tracking technology has impressively made its way into educational science, helping to provide insights into learning behavior in diverse settings spanning from traditional "chalk and talk" teaching approaches to digital learning.

Analyzing the visual attention of students during classroom instruction, for example, delivers valuable information about which elements catch and hold interest, and which distract or go unseen. Do students read slides or merely scan them? Do they focus on the teacher or concentrate on their notes? Does their gaze move around the classroom? Eye tracking findings like these can be effectively leveraged to enhance instructional design and materials for an improved learning experience in the classroom and beyond.

Eye tracking is used in medical research to study a wide variety of neurological and psychiatric conditions

Eye tracking, in combination with conventional research methods or other biometric sensors, can help assess and diagnose neurological and psychiatric conditions such as Attention Deficit Hyperactivity Disorder (ADHD), Autism Spectrum Disorder (ASD), Obsessive Compulsive Disorder (OCD), schizophrenia, Parkinson's disease, and Alzheimer's disease. In addition, eye tracking technology can be leveraged to detect states of drowsiness or to support many other fields in medical use, quality assurance, or monitoring.

Gaming and UX – how come eye tracking is the big hit among a growing number of web designers and developers?

Eye tracking has recently been introduced into the gaming industry and has since become an increasingly prominent tool: designers can now assess and quantify measures such as visual attention and reactions to key moments during gameplay in order to improve the overall gaming experience.

When combined with other biometric sensors, designers can utilize the data to measure emotional and cognitive responses to gaming. New trends and developments will presumably render it possible soon to take an active part and control the game based on pupil dilation and eye movements.

Read success stories from customers using iMotions and find out what it can be used for!

What to look for in eye tracking data?

Like no other experimental method, eye tracking makes it possible to quantify visual attention, as it objectively monitors where, when, and what people look at. So much for the hard facts. Now let's get practical and have a look at the most common metrics used in eye tracking research and what you can make of them.

Fixation and gaze points

Without doubt, fixations and gaze points are the most prominent metrics in the eye tracking literature. Gaze points constitute the basic unit of measure: one gaze point equals one raw sample captured by the eye tracker. The math is easy: if the eye tracker samples 60 times a second, then each gaze point represents about 16.7 milliseconds. If a series of gaze points is close in time and space, the resulting cluster denotes a fixation, a period in which the eyes are locked onto a specific object. A typical fixation lasts 100-300 milliseconds.
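As a rough illustration of how raw gaze points become fixations, here is a simplified dispersion-threshold (I-DT style) detector in Python. The pixel and duration thresholds are illustrative assumptions, not values prescribed by the text.

```python
def detect_fixations(samples, max_dispersion=25.0, min_duration_ms=100.0, rate_hz=60.0):
    """Group raw gaze points into fixations with a dispersion-threshold scheme.

    samples: list of (x, y) gaze points recorded at a fixed sampling rate.
    A window of points whose spread stays within `max_dispersion` pixels
    and that lasts at least `min_duration_ms` counts as one fixation.
    Returns (centroid_x, centroid_y, duration_ms) tuples.
    """
    ms_per_sample = 1000.0 / rate_hz  # at 60 Hz each gaze point covers ~16.7 ms
    min_samples = round(min_duration_ms / ms_per_sample)

    def dispersion(window):
        xs, ys = zip(*window)
        return (max(xs) - min(xs)) + (max(ys) - min(ys))

    fixations, start = [], 0
    while start + min_samples <= len(samples):
        end = start + min_samples
        if dispersion(samples[start:end]) <= max_dispersion:
            # Grow the window while the points stay tightly clustered
            while end < len(samples) and dispersion(samples[start:end + 1]) <= max_dispersion:
                end += 1
            xs, ys = zip(*samples[start:end])
            fixations.append((sum(xs) / len(xs), sum(ys) / len(ys),
                              (end - start) * ms_per_sample))
            start = end
        else:
            start += 1  # slide forward past saccade samples
    return fixations
```

Commercial analysis software ships its own (usually more sophisticated) detectors; this sketch just shows the principle of clustering gaze points that are close in time and space.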

The eye movements between fixations are known as saccades. What are they exactly? Take reading a book, for example. While reading, your eyes don't move smoothly across the line. Instead, they jump and pause, generating a large number of saccades. The visual span refers to how much text we can cover between fixations; on average, saccadic movements span 7 to 9 characters along the line of text. Trained readers have a larger visual span than early readers. Saccades are typically measured in angular velocity (degrees per second).

Now imagine watching clouds in the sky as you pass your time waiting at the bus stop. Here, expect your eye movements to be quite the opposite as your eyes steadily follow the moving clouds. Unlike reading, locking your eyes toward a moving object won’t generate any obvious saccades, but a smooth pursuit trajectory.

As fixations and saccades are excellent measures of visual attention and interest, research in this field is growing significantly.

Heat maps

Heat maps are static or dynamic aggregations of gaze points and fixations revealing the distribution of visual attention. Following an easy-to-read color-coded scheme, heat maps are an excellent way to visualize which elements of the stimulus drew attention: red areas indicate a high number of gaze points and therefore an increased level of interest, while yellow and green areas point toward diminishing visual attention.
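At its core, a heat map is just an aggregation of gaze data over screen space. The sketch below (our own simplification; production tools add Gaussian smoothing and an actual color ramp) accumulates fixation durations into a coarse grid and normalizes the result.

```python
import numpy as np

def heat_map(fixations, width, height, cell=40):
    """Aggregate fixations into a coarse attention grid (a minimal heat map).

    fixations: (x, y, duration_ms) tuples, e.g. from a fixation detector.
    Each cell accumulates total fixation duration; values are normalized
    to 0..1 so they can be mapped onto a green-yellow-red color scale.
    """
    grid = np.zeros((height // cell + 1, width // cell + 1))
    for x, y, dur in fixations:
        grid[int(y) // cell, int(x) // cell] += dur
    return grid / grid.max() if grid.max() > 0 else grid
```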



Here you can see the aggregated eye tracking replay on a packaging study:

Fixation sequences

Based on fixation position (where?) and timing information (when?), you can generate a fixation sequence. Depending on where respondents look and how much time they spend, you can build an order of attention that tells you where respondents looked first, second, third, and so on. This is a commonly used marker in eye tracking research, since it reflects salient elements (elements that stand out in terms of brightness, hue, saturation, etc.) in the display or environment that are likely to catch attention. AOIs that respondents look at first are typically more salient, and therefore of greater interest.
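The order-of-attention idea can be sketched in a few lines: walk through the time-ordered fixations and record the first time each AOI is hit. The AOI names and rectangles are hypothetical examples.

```python
def fixation_sequence(fixations, aois):
    """Derive the order in which AOIs were first fixated.

    fixations: time-ordered (x, y) fixation centroids.
    aois: dict mapping AOI name -> (left, top, right, bottom) rectangle.
    Returns AOI names in order of first fixation, i.e. which element
    caught attention first, second, third.
    """
    order = []
    for x, y in fixations:
        for name, (l, t, r, b) in aois.items():
            if l <= x <= r and t <= y <= b and name not in order:
                order.append(name)
    return order
```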


Respondent count

The respondent count tells you how many of your respondents actually directed their gaze toward a specific AOI. A high count suggests that fixations and gaze points are driven by properties of the stimulus material itself rather than by individual preferences.
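As a sketch, the respondent count for an AOI can be computed by checking, per respondent, whether any fixation falls inside the AOI rectangle (the data layout here is our own assumption):

```python
def respondent_count(recordings, aoi):
    """Count how many respondents fixated a given AOI at least once.

    recordings: dict mapping respondent id -> list of (x, y) fixations.
    aoi: (left, top, right, bottom) rectangle.
    """
    l, t, r, b = aoi
    return sum(
        any(l <= x <= r and t <= y <= b for x, y in fixations)
        for fixations in recordings.values()
    )
```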

Areas of Interest (AOI)

Areas of Interest, also referred to as AOIs, are user-defined subregions of a displayed stimulus. Extracting metrics for separate AOIs comes in handy when evaluating the performance of two or more specific areas in the same video, picture, website, or program interface.


Time to First Fixation (TTFF)

The time to first fixation indicates how long it takes a respondent to look at a specific AOI from stimulus onset. TTFF can reflect both bottom-up, stimulus-driven search (a flashy company label catching immediate attention, for example) and top-down, attention-driven search (respondents actively deciding to focus on certain elements or areas of a website, for example). TTFF is a basic yet very valuable metric in eye tracking.
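Computed from fixation data, TTFF is simply the onset time of the first fixation that lands inside the AOI. A minimal sketch, assuming fixations carry onsets measured from stimulus onset:

```python
def time_to_first_fixation(fixations, aoi):
    """Time to First Fixation (TTFF) for one AOI, from stimulus onset.

    fixations: (onset_ms, x, y) tuples ordered by time, with onset_ms
    measured from stimulus onset. Returns the onset of the first
    fixation inside the AOI, or None if the AOI was never fixated.
    """
    l, t, r, b = aoi
    for onset_ms, x, y in fixations:
        if l <= x <= r and t <= y <= b:
            return onset_ms
    return None
```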


Time spent

Time spent quantifies the amount of time that respondents spend looking at an AOI. Since respondents have to tune out other stimuli in the visual periphery that could be equally interesting, time spent is often an index of motivation and conscious attention: a long dwell time on a certain region clearly points to a high level of interest, while shorter dwell times suggest that other areas on screen or in the environment are more eye-catching.
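Time spent (dwell time) on an AOI can be sketched as the sum of the durations of all fixations that fall inside it:

```python
def time_spent(fixations, aoi):
    """Total dwell time on an AOI: the sum of fixation durations inside it.

    fixations: (x, y, duration_ms) tuples.
    aoi: (left, top, right, bottom) rectangle.
    """
    l, t, r, b = aoi
    return sum(d for x, y, d in fixations if l <= x <= r and t <= y <= b)
```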

Advanced eye tracking metrics

With the core tools at hand, you're perfectly equipped to track the basics. Where, when, and what do people look at? What do they fail to see? So far, so good. Now how about pushing your insights a bit further and stepping beyond the essentials of eye tracking? Want to peek beneath the surface? Sweet! Advanced metrics can help reveal emotional arousal and valence.

These four eye tracking metrics should definitely make it into your toolkit so you can draw the bigger picture.

Pupil size / dilation

Pupil size primarily responds to changes in light (ambient light) or in the stimulus material (e.g., a video stimulus). However, if the experiment controls for light, other attributes can be derived from changes in pupil size. Two common ones are emotional arousal and cognitive workload. An increase in pupil size is referred to as pupil dilation; a decrease is called pupil constriction. In most cases, pupillary responses are used as a measure of emotional arousal. However, be careful with rash conclusions, as pupillary responses alone don't indicate whether arousal arises from a positive stimulus ("yay!") or a negative one ("nay!").
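In practice, pupillary responses are usually baseline-corrected: the mean pupil diameter during a pre-stimulus window is subtracted from the diameter during the trial. A minimal sketch (assuming lighting is held constant, as the text stresses):

```python
import statistics

def pupil_dilation(baseline_mm, trial_mm):
    """Baseline-corrected pupil response for one trial.

    baseline_mm: pupil diameters (mm) sampled just before stimulus onset.
    trial_mm: pupil diameters sampled during stimulus exposure.
    Returns the mean change from baseline: positive values indicate
    dilation (possible arousal or workload), negative values constriction.
    """
    return statistics.mean(trial_mm) - statistics.mean(baseline_mm)
```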



Distance to the screen

Along with pupil size, eye trackers also measure the distance to the screen and the relative position of the respondent. Leaning forward or backward in front of a remote device is tracked directly and reflects approach-avoidance behavior. Keep in mind, however, that interpreting this data is always very specific to the application.

Ocular Vergence

Most eye trackers measure the positions of the left and right eyes independently. This allows the extraction of vergence, i.e., whether the left and right eyes move together or apart. This phenomenon is simply a natural consequence of focusing near and far. Divergence often happens when the mind drifts away, or when losing focus or concentration, and it can be picked up instantly by measuring the inter-pupil distance.
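A crude way to operationalize this is to track the inter-pupil distance and watch its relative change. The thresholds that would separate "mind wandering" from normal refocusing are study-specific and not given here; this is only an illustrative sketch.

```python
import math

def inter_pupil_distance(left_pupil, right_pupil):
    """Euclidean distance between left and right pupil centers (camera pixels)."""
    return math.dist(left_pupil, right_pupil)

def vergence_change(baseline_ipd, current_ipd):
    """Relative change in inter-pupil distance.

    Positive values (IPD growing) suggest divergence, as when focus
    drifts toward 'infinity'; negative values suggest convergence
    on a near target.
    """
    return (current_ipd - baseline_ipd) / baseline_ipd
```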


Blinks

Eye tracking can also provide essential information on cognitive load by monitoring blinks. Cognitively demanding tasks can be associated with delayed blinks, the so-called cognitive blink. Many other inferences can be drawn from blinks as well: a very low blink frequency, for example, is usually associated with higher levels of concentration, while a rather high frequency suggests drowsiness and lower levels of focus and concentration.
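Blink metrics can be sketched from the eye tracker's pupil-detection flag: blinks show up as runs of samples where the pupil is lost. The minimum-gap threshold below is an illustrative assumption.

```python
def blink_rate(pupil_detected, rate_hz=60.0, min_gap_samples=3):
    """Estimate blink count and rate from a pupil-detection flag stream.

    pupil_detected: booleans, one per sample, False while the pupil is
    lost (eyelid closed). Runs of at least `min_gap_samples` missing
    samples count as one blink; shorter dropouts are treated as noise.
    Returns (blink_count, blinks_per_minute).
    """
    blinks, gap = 0, 0
    for detected in pupil_detected:
        if not detected:
            gap += 1
        else:
            if gap >= min_gap_samples:
                blinks += 1
            gap = 0
    if gap >= min_gap_samples:  # blink still in progress at end of stream
        blinks += 1
    minutes = len(pupil_detected) / rate_hz / 60.0
    return blinks, blinks / minutes if minutes > 0 else 0.0
```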

Get the most from eye tracking

Why combine eye tracking with other biometric sensors?

Picture this situation: you open the fridge looking for the milk bottle and, although you look straight at it, you don't realize it's standing right in front of you. Based on eye tracking alone, we would have argued that, since your gaze was directed toward the item, you must have seen it.

The most intuitive way to validate our assumption might be to simply ask you: "Did you see the milk bottle?" Primarily due to their relatively low cost, surveys are in fact a very common research tool for consolidating gaze data with self-reports of personal feelings, thoughts, and attitudes. Of course, the power of self-reports is somewhat limited when it comes to the disclosure of sensitive personal information (alcohol, drugs, sexual behavior, etc.). Also, when working with self-reports, keep in mind that any delay between action and recollection introduces artifacts: asking immediately after you close the fridge might yield a different response ("Nope, I didn't see it!") than asking one week later ("…um, I don't remember!").

Take your study from ordinary to extraordinary

We could have used camera-based facial expression analysis to monitor your emotional valence while you stared into the fridge, for example. A confused expression while scanning all the items, followed by a sad expression when closing the door, would have told us immediately that you obviously didn't realize the milk bottle was within reach. On the other hand, a slight smirk would have indicated that your search had been successful. Further, we could have evaluated your emotional arousal and stress levels based on changes in skin conductance (GSR, EDA) or your heart rate (ECG).

On top of that, we could have used EEG to capture your cognitive and motivational state, as it is the ideal tool for identifying fluctuations in workload ("have I looked everywhere?"), engagement ("jeez, I have to find this bottle!"), and drowsiness ("are you kidding me? I need coffee asap!").

Admittedly, our example presents a simplified picture of the interpretation of physiological data. In most research scenarios, you will have to consider and control for many more factors that can significantly affect the parameters you aim to measure.

Each sensor reveals a specific aspect of human cognition, emotion, and behavior. Depending on your research question, consider combining eye tracking with two or more additional biosensors to gain meaningful insights into the spatio-temporal dynamics of attention, emotion, and motivation.

What‘s the gain?

Sensor combinations with eye tracking

The true power of eye tracking unfolds as it is combined with other sources of data to measure complex dependent variables.

These 5 biometric sensors are a perfect complement to eye tracking. Which metrics can be extracted from the different systems?

Have a look.

ECG & PPG: Electrocardiography (ECG) and Photoplethysmography (PPG) allow for recording of heart rate (HR), or pulse. Get insights into respondents’ physical state, anxiety and stress levels (arousal), and how changes in physiological state relate to their actions and decisions.

Facial Expression Analysis: Facial expression analysis is a non-intrusive method to assess both emotions (subconscious reactions preceding feelings, typically small movements of the facial muscles) and feelings (conscious reactions occurring after emotions, typically more visible muscle movements). While facial expressions can indicate the presence and valence of an emotion or feeling, they can't measure its intensity (arousal).

EEG: Electroencephalography is a neuroimaging technique measuring electrical activity on the scalp. EEG tells you which parts of the brain are active during task performance or stimulus exposure. Analyze brain dynamics of engagement (arousal), motivation, frustration, cognitive workload, and other metrics associated with stimulus processing, action preparation, and execution. EEG provides the quickest response of all biometric sensors.

GSR (EDA): Galvanic skin response (or electrodermal activity) monitors "emotional" sweat secretion on the hands or feet. Skin conductance offers insights into respondents' subconscious arousal when they are confronted with emotionally loaded stimulus material.

EMG: Electromyographic sensors monitor the electric energy generated by bodily movements (e.g., of the face, hands or fingers). Use EMG to monitor muscular responses to any type of stimulus material to extract even subtle activation patterns associated with emotional expressions (facial EMG) or consciously controlled hand/finger movements.


Best practices in eye tracking

Frankly, failures or complications in studies most often stem from small mistakes that could easily have been avoided. Often they happen because researchers and staff simply didn't know the basic tricks for steering clear of issues.

Conducting an eye tracking study involves juggling a lot of moving parts: a complex experimental design, new respondents, different technologies, different pieces of hardware, different operators. Let that sink in for a few moments. On top of that, you're under pressure to achieve a great study outcome. Unless you're an old hand in research, it can admittedly get quite challenging at times. We've all been there.

Don't worry, we've got your back. The following are our six safe bets for a smooth lab experience in eye tracking research.


1. Environment and lighting conditions
Have a dedicated space for running your study. Find an isolated room that is not used by others so you can keep your experimental setup as constant as possible. Make sure to place all system components on a table that doesn't wobble or shift.
For eye tracking, lighting conditions are essential. Avoid direct sunlight coming through the windows (close the blinds!), as sunlight contains infrared light that will degrade the quality of the eye tracking measurements.

Avoid brightly lit rooms (no overhead light). Ideally, use ambient light, and keep the physical environment constant.
If your study is designed to run for several hours, split the experiment into two equally long sessions and calibrate the system separately for each session. Also, be aware that long experiments can cause dry eyes, resulting in measurement drift. Keep noise from the surrounding environment (rooms, corridors, streets) to a minimum, as it will most likely distract the respondent and affect measurement accuracy.

2. Work with dual screen configuration
Work with a dual screen configuration: one "respondent screen" for stimulus presentation (which ideally remains black until the stimulus material appears) and one "operator screen" (which the respondent should not be able to see) to control the experiment and monitor data acquisition. A dual screen setup allows you to detect any issues with the equipment during the experiment.

3. Clean your computer before getting started

  • Remove anything you don't need from your computer
  • Disable anything securing your computer; for example, turn off your antivirus software so it doesn't pop up during the experiment and use CPU resources
  • If it isn't needed, disconnect the computer from the internet during data collection
  • Disable your screen saver
  • Disable any pop-ups that could disturb the experiment

4. Ensure all people involved are properly trained
It is essential that people who start at the lab are trained on the systems in use, so that they have the level of knowledge needed to run a study smoothly. Generally, training is important for any position in the lab. Having to train people during the testing process is a disadvantage and usually takes more time and effort than solid training beforehand would.

5. Always use protocols
Always have protocols! Documentation for anything you need to instruct somebody on, and for anything associated with setting up or running a study at the lab, is the most important thing to have. Try to have templates for every step of the research process. Literally. Don't underestimate the importance of documentation in institutions such as universities. It's common practice for research assistants to switch labs after a certain amount of time; protocols are true lifesavers, as they keep track of everything from management to study execution and ensure that new lab members can jump right in and perform on the fly in line with lab routines.

6. Simplify your technology setup
Chances are you need a couple of different biometric sensors to run your study. To make sure they interact well and are compatible with each other, use as few vendors as possible for both hardware and software. In the ideal setup, everything is integrated into one single piece of software. Having to switch between different operating systems or different computers can cause difficulties. Keep in mind that it's easier to train lab members on a single piece of software than on several. The equation is simple: a single software platform decreases the amount of training needed, simplifies the setup, and reduces the risk of human error. Also, in case of problems and support issues, it is more convenient to deal with one vendor and a direct contact person than to be pushed around between vendors because nobody feels responsible.

N.B. This is an excerpt of our free white paper "Eye Tracking – The Pocket Guide". To get the full guide in an easier-to-read PDF format, click here.

Equipment 101: Hardware

At this point you might figure that all eye tracking systems are pretty much the same. Their only job is to track where people are looking, so what's left to vary? Actually, quite a lot. Eye tracking is on the rise, and to keep pace with demand, new systems are springing up like mushrooms. Amid all the manufacturer specifications, it can be quite hard to maintain an overview and evaluate which eye tracker is right for you.


Which eye tracker is right for you?

  1. Will your respondents be seated in front of a computer during the session? Go for a screen-based eye tracker. Do your respondents need to move freely in a natural setting or virtual reality? Choose a head-mounted system that allows for head and body mobility.
  2. Make sure the eye tracker you purchase meets the specifications required to answer your research objectives. Have a look at the key questions below that can help you find a suitable eye tracker.
  3. We have covered this already, but here it goes again: even though they are less expensive, you should stay away from eye trackers that use ordinary webcams. Yes, we know the temptation of a good bargain; however, when it comes to eye trackers, it's absolutely worth spending a bit of extra money if you're aiming for high measurement accuracy.

Here are a few more questions to ask to get the ideal performance picture:

Measurement precision: Measured in degrees of visual angle. The standard is about 0.5 degrees. Low-end hardware starts around 1.0 degree, mid-range systems offer about 0.5 degrees, and high-end systems go down to 0.1 degrees or less (with a bite bar).

Sampling rate: How many times per second is the eye position measured? The typical range is 30-60 Hz; specialized research equipment samples at around 120 to 1000+ Hz.

Trackability: How much of the population can be tracked? The best systems track around 95% of the population; low-end systems less.

Headbox size: To what extent is the respondent allowed to move relative to the eye tracker? A good system will typically allow movement of around 11 inches in each direction.

Recapture rate: How fast does the eye tracker detect the eye position after the eyes have been out of sight for a moment (e.g. during a blink)?

Integrated or standalone: Is the eye tracking hardware integrated into the monitor frame? Standalone eye trackers are more flexible but typically a bit more complex to set up.

Does your provider offer support? Thanks to plug and play, you can basically run your eye tracker out of the box (that applies to most eye trackers, at least). To get started, however, live training is helpful for learning the ropes, and even further along, a little expert advice often comes in handy. Does your provider offer that kind of support? What about online support? And how long do they take to reply when you need it most? Seriously, good support is worth its weight in gold.

Eye tracking software

Finding the right software solution

Of course, hardware is only half the battle. Before you can kick off your eye tracking research, you definitely need to think about which recording and data analysis software to use. Usually, separate software is required for data acquisition and data processing. Although some manufacturers offer integrated solutions, you will most likely have to export the raw data to dedicated analysis software for inspection and further processing. So which eye tracking software solution is the one you need?

What are the usual struggles with eye tracking software?

Struggle 1: Eye tracking software either records or analyzes. Usually, separate software is necessary for data recording and data processing. Despite automated procedures, proper data handling requires careful manual checks along the way, and this checking procedure is time-consuming and prone to error.

Struggle 2: Eye tracking software is bound to specific eye trackers. Typically, eye tracking software and hardware are paired: one piece of software is compatible with only one specific eye tracker, so if you want to mix and match devices or software, even within one brand, you will soon hit a brick wall. Also, be aware that you need separate software for remote and mobile eye trackers.

Not only does operating several software solutions require expert training beforehand, it might even prevent you from switching from one system to another. The worst-case scenario? Your lab sticks to outdated trackers and programs even though the latest generation of devices and software offers improved usability and extended functionality.

Struggle 3: Eye tracking software is limited to certain stimulus categories. Usually, eye tracking systems don't allow the recording of eye movements across varied experimental conditions: you will have to use one piece of software for static images and videos, different software for websites and screen captures, and yet another for real-world scenes or mobile tracking. What if there were one unified software solution for your biometric research? Go to iMotions eye tracking.

Struggle 4: Eye tracking software can be complex to use. You have to be familiar with all the relevant software-controlled settings for the eye tracker's sampling rate, calibration, and gaze or saccade/fixation detection. In the analysis stage, you have to know how to generate heat maps, select Areas of Interest (AOIs), and place markers, and statistical knowledge is recommended for analyzing and interpreting the final results.

Struggle 5: Eye tracking software rarely supports different biometric sensors. Eye tracking software often is just that: it tracks eyes but rarely connects to other biometric sensors to monitor emotional arousal and valence.

What does that imply exactly? You will need to use different recording software for your multimodal research. As you will most likely have to set up each system individually, a considerable amount of technical skill is required. If you happen to be a tech whizz, you're probably on the safe side; if not, you might run into serious issues even before getting started. Also, be aware that you have to make sure the different data streams are synchronized. Only then can you analyze how eye tracking, EEG, and GSR relate to each other.


What should a picture-perfect eye tracking software hold in store for you?

Ideally, your eye tracking software

  • connects to different eye tracking devices (mix and match, remember?)
  • scales with your research needs: it allows you to conveniently add other biometric sensors that capture cognitive, emotional, or physiological processes
  • accommodates both data recording and data analysis
  • tracks various stimulus categories: screen-based stimuli (videos, images, websites, screen captures, mobile devices), real-life environments (mobile eye tracking), as well as survey stimuli for self-reports
  • grows with you: it can be used equally by novices just getting started with eye tracking and by expert users who know the ropes




Want to learn more? Get the full and free definitive guide to eye tracking here.

Contact us and see what we can do for you and your eye tracking needs.


Want the quick eye tracking overview?

We’ve created an eye tracking infographic for a fun and easily digestible overview of eye tracking.


