The future will be smarter than us. New technologies coupled with new techniques are bringing about increasingly responsive devices. All of this is changing the way we live to make it safer, easier, and more personal.

Biometric research resides at the heart of this. Human data is essential to not only informing how the world should be changed, but often directly informing the devices that we interact with about our emotional state and behavior. The machines of the future will be smart, but they will also be more human.

Below are five examples of the intersection between biometrics and the future of technology. All of them promise to shape the world we live in by pairing human data with new or old devices. These aren’t merely speculative science fiction, but real-world examples that are already happening – to some extent – today.

1. Driving and Drowsiness

[Image: tired driver research]

In the US alone, drowsy driving causes approximately 72,000 car crashes – and an estimated 6,000 fatalities – every year. While autonomous vehicles will surely reduce these numbers, we are still many years away from a widespread rollout of driverless cars.

In addition to this, there are many other environments in which sleepiness is antithetical to safe functioning (think pilots, machine operators, air traffic controllers, etc.). By providing a system that intervenes when the user becomes too drowsy, accidents can be avoided and lives can be saved. Certainly a goal worth striving for, but how do we achieve it?

One of the principal approaches uses eye tracking. Anti-Sleep is a solution offered by Smart Eye, a high-end eye tracking company (and one of our partners). The device monitors the driver’s level of drowsiness, measured from the movement and behavior of the eyes, and can then produce signals that trigger other events, such as an alarm to warn the driver.
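Smart Eye’s actual algorithm is proprietary, but a common eye-tracking drowsiness metric is PERCLOS: the proportion of time within a window that the eyes are mostly closed. As a minimal sketch (the threshold values here are illustrative, not from any real product):

```python
def perclos(eye_openness, closed_threshold=0.2):
    """Fraction of samples in which the eyes are nearly closed.

    eye_openness: sequence of values in [0, 1], where 1.0 = fully open.
    A sample counts as "closed" when openness falls at or below the threshold.
    """
    closed = [o <= closed_threshold for o in eye_openness]
    return sum(closed) / len(closed)

def drowsiness_alarm(eye_openness, perclos_limit=0.15):
    """Trigger an alarm if PERCLOS over the window exceeds the limit."""
    return perclos(eye_openness) > perclos_limit

# An alert driver: brief blinks only
alert = [1.0, 0.9, 1.0, 0.1, 1.0, 0.95, 1.0, 1.0, 0.9, 1.0]
# A drowsy driver: long, slow eyelid closures
drowsy = [0.1, 0.0, 0.1, 0.9, 0.0, 0.1, 0.8, 0.0, 0.1, 0.0]

print(drowsiness_alarm(alert))   # False
print(drowsiness_alarm(drowsy))  # True
```

In a real system the openness signal would stream from the eye tracker and the window would slide over the last minute or so of data; the alarm output would then feed whatever intervention the vehicle supports.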

“The accuracy of eye trackers means that the signals – the eye movement – can be reliably recorded, meaning that the data can be trusted. This is essential for not only new technologies that use these signals, but for human behavior research in general”, says Dr. Elvira Fischer, Lead Product Specialist at iMotions.

Another method uses ECG measurements from a device embedded inside the car seat, making it entirely non-invasive and unobtrusive. The device can determine levels of drowsiness, which can then trigger other events, just as with eye tracking.

2. Virtual Therapy

[Image: virtual reality therapy]

The World Health Organization (WHO) states that 1 in 4 people worldwide will be affected by mental health issues at some point in their lives. This widespread problem demands attention and effective treatments, and biometric tools are well suited to informing such treatments.

New treatments have also been made possible by several new technologies, perhaps chief among them virtual reality (VR). VR offers a new route to tackling the mental health crisis, and is increasingly employed to treat a range of disorders and conditions.

Coupled with biometrics, VR can not only be employed as a more flexible method of presenting environmental stimuli (such as a stressful situation that can be relived in a safe setting), but also integrated into the treatment.

Biofeedback with EEG devices offers one way to do this, helping patients hone their responses to stimuli with their own bodily signals in mind. Such training has been shown to be useful in helping patients overcome generalized anxiety disorder (GAD), and attention deficit hyperactivity disorder (ADHD).

Other biometrics also provide ways of incorporating bodily signals into treatment. Data from GSR and ECG can provide (and have been used to provide) information about how someone is feeling throughout treatment. The data can then indicate how successful the treatment has been at returning these signals to a normal, functional level.

3. Emotional Intelligence

[Image: face emotion recognition]

Adapting the environment to how we feel is a task that can readily be accomplished with biometric tools. Information about an individual’s emotional state is inherently captured through biometric measurements; it simply remains to convert these signals into an appropriate action.

This has been done with music choice depending on mental state, with a music player deciding which songs to play based on the listener’s facial expressions. Similar approaches have been applied while respondents drive a car, and with EEG in place of facial expressions. The same idea can also be extended to other media formats, such as TV shows.
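The core of such a player is a simple mapping from a detected expression to a media choice. A minimal sketch, in which the expression labels, confidence scores, and playlist names are all hypothetical placeholders for whatever a facial expression analysis engine actually outputs:

```python
# Hypothetical mapping from a dominant facial expression to a playlist.
PLAYLISTS = {
    "joy": "upbeat",
    "sadness": "soothing",
    "anger": "calming",
    "neutral": "ambient",
}

def pick_playlist(expression_scores):
    """Choose a playlist from expression confidences.

    expression_scores: dict mapping an expression label to a confidence
    in [0, 1], of the kind a facial coding engine might produce.
    Unknown expressions fall back to a neutral default.
    """
    dominant = max(expression_scores, key=expression_scores.get)
    return PLAYLISTS.get(dominant, "ambient")

print(pick_playlist({"joy": 0.8, "sadness": 0.1, "neutral": 0.1}))  # upbeat
```

A production system would smooth the scores over time before switching tracks, since facial expressions fluctuate from frame to frame.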

Emotions can also play a role within the media itself. The game Nevermind uses facial expression analysis (from Affectiva) to track the player’s expressions, making the game more stressful in response to the player’s fear levels.

Another study has shown how children with autism (a developmental condition that often presents with a reduced ability to correctly identify emotions) can be aided by Google Glass combined with Affectiva’s software. The Google Glass device displays the emotion of whoever is being looked at, using the analysis provided by Affectiva. With this information at hand, the wearer receives help in determining the emotions of those around them.

4. More Human Learning

[Image: teaching research]

Humans are largely visual creatures. The way we learn is often visual, so it follows that knowing more about our visual attention during the learning process can help us determine better learning strategies.

This is increasingly prevalent within the medical profession, where efficient and productive training is particularly critical to treatment outcomes. Eye tracking has been used to show how experienced practitioners see the world, giving beginners information about where they should focus their attention.

By highlighting the most critical components of the process (which parts of a chart to pay most attention to, for example), the student can focus on what really matters in specific contexts.

As more and more education takes place within a virtual context, understanding this novel environment in order to present the best learning possibilities also becomes increasingly important. Emotions have been measured through facial expression analysis in such a virtual learning environment, and this information has guided how virtual teachers can best match their students.

The researchers found that virtual tutors that displayed congruent facial expressions were associated with better learning performance – a lesson in itself for designing online learning paradigms. Ultimately, the interaction between the way we learn, and the information from biometric devices, will continue to provide more efficient paths to knowledge.

5. Brains Interacting

[Image: brain-to-brain communication]

Work on brain-computer interfaces (BCIs) may have begun in the 1970s, but recent research and applications have greatly expanded the scope of what is possible. While previous research relied on animal experimentation or invasive methods, researchers have recently shown how non-invasive measurements of the human brain can accurately instruct computers.

This has huge implications for individuals who have movement difficulties, by expanding the ways in which they can interact with the world. By sending EEG signals to devices such as robotic arms, those devices can be programmed to react in predefined ways – enabling individuals to use machines as their assistants.
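Conceptually, the “predefined ways” amount to a dispatch table from a decoded mental command to a device action. A sketch under invented labels (real BCIs decode motor imagery with a trained classifier; the command and action names below are purely illustrative):

```python
# Hypothetical dispatch table from a decoded EEG command to a device action.
ACTIONS = {
    "imagine_left": "rotate_arm_left",
    "imagine_right": "rotate_arm_right",
    "imagine_grasp": "close_gripper",
}

def dispatch(decoded_command):
    """Translate a classifier's output label into a predefined device action.

    Unrecognised commands map to a safe no-op: a BCI controlling a
    physical device should fail safely rather than guess.
    """
    return ACTIONS.get(decoded_command, "hold_position")

print(dispatch("imagine_grasp"))   # close_gripper
print(dispatch("noisy_segment"))   # hold_position
```

The hard part in practice is the decoding stage that produces the command label from raw EEG; once that exists, mapping commands to actions is exactly this kind of lookup.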

Dr. Tue Hvass, Customer Success Manager at iMotions, states how “EEG is a relatively accessible solution to recording brain activity – the data is invaluable for those looking to learn more about the way we think and act. BCIs have repeatedly shown this in action”.

Similar techniques have also been applied to perform the world’s first human brain-to-brain communication, in which an EEG system from Neuroelectrics recorded the sender’s messages, which were then delivered to the receiver by transcranial magnetic stimulation (TMS), a technique that sends magnetic pulses to stimulate parts of the brain. This paves the way for increasingly advanced brain-to-brain communication to take place.

These are of course just five examples of how biometrics are shaping future technology – several more are already in progress, and even more will inevitably arise. All of this will result in a world that is better suited to humans, not only because technology will be better at doing things for us, but because it will also be better at knowing us.

I hope you’ve enjoyed reading about the intertwined future of technology and biometrics. If you’d like to learn more about how this technology can help your research or business understand humans better, then get in touch, or download our free and comprehensive guide to human behavior below.

