5 Basics of EEG 101: Data Collection, Processing & Analysis
When it comes to analyzing EEG data, it is easy to feel overwhelmed by the huge variety of pre-processing steps, all of which require informed decisions about their expected effects on the data.
In this blog post, we would like to shed some light on 5 key aspects that are crucial for EEG data processing.
1) Start small and run pilot sessions
EEG experiments require careful preparation. You need to prepare the participants, spend some time setting up the equipment, and run initial tests. You certainly do not want your EEG experiment to fail mid-test, so before carrying out a full study with 100 participants, start small and run some pilot sessions to check that everything is working properly.
- Are the stimuli presented in the right order?
- Are mouse and keyboard up and running?
- Do participants understand the instructions?
- Do you receive signals?
Once you have crossed those questions off your list, you are all set to start with the actual data collection and analysis.
2) “There is no substitute for clean data”
Wise words of Prof. Steven Luck (UC Davis) that you should keep in mind whenever you record and pre-process EEG data in order to extract metrics of interest.
To this day, there is no algorithm that can decontaminate poorly recorded data; you simply cannot clean up or process data in a way that magically restores a signal that was never captured. Therefore, always start with properly recorded data.
EEG systems generally offer software- or hardware-based quality indicators such as impedance panels, where the impedance of each electrode is visualized graphically.
Green colors and low impedance values imply high recording quality (low impedances indicate that the recorded signal reflects the processes inside of the head rather than artifactual processes from the surroundings).
Clean data allows clean responses to your research questions!
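A pre-recording impedance check like the one described above can be sketched in a few lines of Python. This is an illustrative example only: the electrode names, the readings, and the 20 kΩ threshold are made up (acceptable thresholds differ between EEG systems), and real systems report impedances through their own APIs.

```python
# Hypothetical impedance check: flag electrodes whose impedance is too high.
# The 20 kOhm threshold is a common rule of thumb, not a universal standard.
IMPEDANCE_THRESHOLD_KOHM = 20


def check_impedances(impedances_kohm, threshold=IMPEDANCE_THRESHOLD_KOHM):
    """Return the electrodes whose impedance exceeds the threshold."""
    return {name: z for name, z in impedances_kohm.items() if z > threshold}


# Simulated readings in kilo-ohms (made-up values for illustration)
readings = {"Fz": 8.2, "Cz": 5.1, "Pz": 31.0, "Oz": 12.4}

bad_electrodes = check_impedances(readings)
print(bad_electrodes)  # {'Pz': 31.0} -> re-gel or reseat before recording
```

In practice you would loop such a check until all electrodes come back green, then start the recording.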
3) Make informed decisions
EEG data can be recorded and analyzed in a near-infinite number of ways, and not only the processing steps themselves but also their sequence matters. All signal processing techniques alter the data to some extent, and being aware of their impact helps you pick the right ones.
The phrase “making informed decisions” is key: if you are hesitant about which methods to choose, consult the existing literature. You will most likely find valuable advice in scientific research papers or even in the “lab traditions” of your team.
By making sure that your chosen methods return the desired outcomes, you help uphold scientific research standards such as objectivity, reliability, and validity.
4) Attenuate or reject artifacts
EEG data contains relevant and irrelevant aspects. What is a signal to one EEG expert might be noise to another (and vice versa).
For example, one might be interested in event-related potentials time-locked to the onset of a specific visual stimulus. If the participant blinks at that very moment, the EEG might not reflect the cortical processes of seeing the stimulus on screen.
As an EEG expert, you might be tempted to exclude this trial from the analysis since the EEG data does not contain the relevant information. However, if blinking occurs systematically at stimulus onset throughout the experiment, this might tell an interesting story: maybe the participant is avoiding a potentially threatening picture. Rejecting all trials where blinks occur also drastically reduces the amount of data (you could easily end up with only 10 of 100 trials left).
Therefore, attenuation procedures based on statistical methods such as regression or interpolation (e.g., the method proposed by Gratton, Coles & Donchin, 1983) or Independent Component Analysis might be more appropriate. Here, contaminated data portions are replaced with corrected data estimated from surrounding channels or time points.
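The regression idea can be illustrated with simulated data: estimate how strongly the recorded EOG (blink) channel propagates into an EEG channel, then subtract that contribution. This is a deliberately simplified, single-channel sketch of the approach Gratton and colleagues describe, with all signals simulated rather than recorded.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 1000

# Simulated signals: an ocular (EOG) channel and genuine brain activity
eog = rng.normal(size=n_samples)
brain = rng.normal(scale=0.5, size=n_samples)

# The recorded EEG is brain activity plus a fraction of the EOG signal
true_propagation = 0.4
eeg = brain + true_propagation * eog

# Estimate the propagation factor by least squares ...
b = np.dot(eog, eeg) / np.dot(eog, eog)

# ... and subtract the estimated ocular contribution from the EEG
eeg_corrected = eeg - b * eog
```

After correction, the EEG channel is essentially uncorrelated with the EOG channel, while the underlying brain activity is retained.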
The discussion on whether artifacts should be attenuated or rejected is still ongoing in the scientific community, so you may have to evaluate for yourself which procedure returns the output signal of interest.
However, combining scalp EEG with other sensors such as eye trackers, EMG, or ECG electrodes lets you capture physiological processes such as blinks, limb muscle activity, or heartbeats through other modalities, making it easier to identify their intrusion into the EEG data.
5) Go for the right statistics
When designing and analyzing an EEG experiment, it is always advisable to base your procedures on known material. You will find it easier to explain the observed effects if you can link your results to existing publications where a comparable statistical procedure has been used.
As mentioned above, making informed decisions also applies to selecting the right statistical procedures. If you intend to investigate event-related potentials (ERPs), you might want to take a closer look at the latencies and amplitudes of the peaks in the ERP waveforms at certain electrode locations. By contrast, if you are interested in frequency-based measures such as theta, alpha, or beta band power, you would rather focus on the peak frequency within the band of interest.
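Both kinds of measures are straightforward to extract once the data are clean. The sketch below uses simulated signals (the sampling rate, the 200–400 ms search window, and the 8–13 Hz alpha band edges are arbitrary illustrative choices): a peak amplitude and latency from an ERP-like waveform, and a peak frequency from an oscillation via a simple FFT spectrum.

```python
import numpy as np

fs = 250  # sampling rate in Hz (assumed)
t = np.arange(0, 1, 1 / fs)  # 1-second epoch after stimulus onset

# Simulated average ERP: a positive deflection peaking around 300 ms
erp = np.exp(-((t - 0.3) ** 2) / (2 * 0.03**2))

# Time-domain measure: peak amplitude and latency in a 200-400 ms window
win = (t >= 0.2) & (t <= 0.4)
peak_idx = np.argmax(erp[win])
peak_amp = erp[win][peak_idx]
peak_latency = t[win][peak_idx]  # seconds after stimulus onset

# Frequency-domain measure: simulated 10 Hz "alpha" oscillation
signal = np.sin(2 * np.pi * 10 * t)
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)

alpha = (freqs >= 8) & (freqs <= 13)
alpha_power = psd[alpha].sum()               # band power (relative units)
peak_freq = freqs[alpha][np.argmax(psd[alpha])]  # peak frequency in the band
```

With real data you would first average over trials (for ERPs) or estimate the spectrum per epoch (for band power), but the extraction step looks much the same.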
EEG metrics such as “cognitive load” (Advanced Brain Monitoring B-Alert) or “frustration” (Emotiv EPOC) are either premised on time- or frequency-domain features of the EEG data, and can also be analyzed in view of peak amplitudes or latencies with respect to the onset of a certain event.
Analysis techniques range from simple t-tests and more complex ANOVAs (Analysis of Variance) to non-parametric procedures such as bootstrapping or randomization techniques. The latter are particularly useful when you want to examine the data in an exploratory way without specifying the expected effect in advance with respect to electrode site, latency, or amplitude.
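A randomization (permutation) test for a two-condition difference can be sketched as follows. The "amplitudes" here are simulated, and the group sizes, effect size, and iteration count are arbitrary choices: the point is only the logic of shuffling condition labels to build a null distribution.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated per-participant ERP amplitudes for two conditions
cond_a = rng.normal(loc=2.0, scale=1.0, size=20)  # e.g., target stimuli
cond_b = rng.normal(loc=0.0, scale=1.0, size=20)  # e.g., standard stimuli

observed = cond_a.mean() - cond_b.mean()
pooled = np.concatenate([cond_a, cond_b])

# Null distribution: repeatedly shuffle the condition labels
n_perm = 2000
count = 0
for _ in range(n_perm):
    perm = rng.permutation(pooled)
    diff = perm[:20].mean() - perm[20:].mean()
    if abs(diff) >= abs(observed):
        count += 1

# Two-sided p-value (with the conventional +1 correction)
p_value = (count + 1) / (n_perm + 1)
print(p_value)
```

Because no distributional assumptions are needed, the same scheme extends naturally to exploratory tests across many electrodes and time points (typically combined with a correction for multiple comparisons).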
Fortunately, running and analyzing EEG experiments becomes far more manageable if you pilot your study, collect clean data, and make informed decisions throughout pre-processing and statistical analysis.
Do not hesitate to talk to us here at iMotions if you would like to enrich your research endeavor with EEG and other physiological sensors. We will be happy to provide you with the necessary tools and information to get you started with the collection of high-quality data in no time! Contact us here