With the latest iOS accessibility update, Apple quietly introduced a new capability: camera-based eye tracking. The feature lets users navigate and interact with their iPhone using only their gaze, a development that has sparked curiosity among researchers and technologists about its potential as a future avenue for data collection.
It’s an exciting moment for mainstream adoption of human-behavior technology. Eye tracking, once limited to laboratories and research settings, is now appearing on devices millions of people already own.
But for researchers, UX teams, and neuroscientists, an important question follows:
Does this mean consumer smartphones are now ready to replace research-grade eye tracking systems, or even research-grade webcam eye tracking?
The short answer: Not yet…and possibly not for a long while.
What Eye Tracking Does on iPhone Today
Apple’s implementation is designed primarily for interaction, not data export.
The feature uses the iPhone’s front-facing camera to estimate gaze direction and translate it into interface control. Users can look at an element, pause, and trigger a tap — improving accessibility and enabling hands-free navigation.
So:
- How does eye tracking work on iPhone? Through the camera and on-device machine learning models that estimate where users are looking and convert that into UI input.
- What does eye tracking do on iPhone? It enables control and navigation – not measurement, analysis, or research-grade capture.
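Apple's accessibility implementation is not exposed to developers, but ARKit's public face-tracking API (available on TrueDepth-equipped iPhones) gives a rough sense of how on-device gaze estimation can be turned into UI input. The sketch below is a conceptual approximation only, not Apple's accessibility pipeline; the dwell duration, the naive screen-mapping math, and the `GazeDwellController` name are illustrative assumptions.

```swift
import ARKit
import UIKit

// Conceptual sketch only: Apple's accessibility eye tracking is not exposed to
// developers. This uses ARKit's public face-tracking API (TrueDepth devices)
// to approximate the same idea: estimate gaze, map it to the screen, and
// trigger a "tap" after a dwell period. The screen mapping is deliberately
// simplified and would need per-user calibration in practice.
final class GazeDwellController: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private var dwellStart: Date?
    private let dwellThreshold: TimeInterval = 1.0   // seconds of steady gaze before "tap"
    var onDwellTap: ((CGPoint) -> Void)?

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // lookAtPoint is ARKit's estimated gaze target in the face's coordinate space.
        // A real system would project it through the camera transform and a calibration
        // model; here it is simply scaled onto the screen as a stand-in.
        let gaze = face.lookAtPoint
        let screen = UIScreen.main.bounds
        let point = CGPoint(
            x: screen.midX + CGFloat(gaze.x) * screen.width,
            y: screen.midY - CGFloat(gaze.y) * screen.height
        )

        // Dwell logic: if the gaze estimate stays on screen long enough, fire a tap.
        if screen.contains(point) {
            if dwellStart == nil { dwellStart = Date() }
            if let start = dwellStart, Date().timeIntervalSince(start) >= dwellThreshold {
                onDwellTap?(point)
                dwellStart = nil
            }
        } else {
            dwellStart = nil
        }
    }
}
```

Even in this toy version, the hard parts are visible: turning a gaze vector into reliable screen coordinates requires calibration and validation, which is precisely what separates an interaction feature from a measurement instrument.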
What It Doesn’t Do: Export or Research-Ready Data
While the technology is impressive and user-friendly, there are clear limitations for research use:
| Capability | iPhone System Eye Tracking | Professional Research Eye Tracking (e.g., the iMotions ecosystem) |
| --- | --- | --- |
| Raw gaze data export | ❌ No | ✔ Available |
| Fixations, saccades, AOI metrics | ❌ No | ✔ Yes |
| Consistent sample rate reporting | ❌ Not provided | ✔ Standardized |
| Calibration control | ❌ Limited | ✔ User-controlled and validated |
| Multimodal data sync (EEG, GSR, ECG, facial expression, behavior) | ❌ Not supported | ✔ Fully supported |
Apple also states that gaze data from the accessibility feature remains on the device, meaning developers and researchers cannot access it for study, analysis, or scientific workflows.
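To make concrete what the "fixations, saccades, AOI metrics" row implies: these metrics are derived from raw, timestamped gaze samples, which is exactly the data the iPhone feature does not export. Below is a minimal sketch of a dispersion-threshold (I-DT style) fixation detector plus a simple AOI dwell-time calculation. The `GazeSample` structure and the threshold values are illustrative assumptions, not any vendor's format.

```swift
import CoreGraphics
import Foundation

// Illustrative only: a raw gaze sample as a research pipeline might record it.
// The iPhone accessibility feature does not export data like this.
struct GazeSample {
    let timestamp: TimeInterval   // seconds
    let point: CGPoint            // stimulus/screen coordinates (e.g., pixels)
}

struct Fixation {
    let centroid: CGPoint
    let duration: TimeInterval
}

/// Minimal dispersion-threshold (I-DT style) fixation detection.
/// Consecutive samples whose spread stays within `maxDispersion`
/// for at least `minDuration` are grouped into one fixation.
/// (The trailing window is ignored here for brevity.)
func detectFixations(samples: [GazeSample],
                     maxDispersion: CGFloat = 35,      // e.g., pixels
                     minDuration: TimeInterval = 0.1) -> [Fixation] {
    var fixations: [Fixation] = []
    var window: [GazeSample] = []

    func dispersion(of window: [GazeSample]) -> CGFloat {
        let xs = window.map { $0.point.x }, ys = window.map { $0.point.y }
        return (xs.max()! - xs.min()!) + (ys.max()! - ys.min()!)
    }

    for sample in samples {
        window.append(sample)
        if dispersion(of: window) > maxDispersion {
            // Window has spread too far: close out any fixation it contained.
            let candidate = Array(window.dropLast())
            if let first = candidate.first, let last = candidate.last,
               last.timestamp - first.timestamp >= minDuration {
                let cx = candidate.map { $0.point.x }.reduce(0, +) / CGFloat(candidate.count)
                let cy = candidate.map { $0.point.y }.reduce(0, +) / CGFloat(candidate.count)
                fixations.append(Fixation(centroid: CGPoint(x: cx, y: cy),
                                          duration: last.timestamp - first.timestamp))
            }
            window = [sample]
        }
    }
    return fixations
}

// AOI metric example: total fixation time inside an area of interest.
func dwellTime(in aoi: CGRect, fixations: [Fixation]) -> TimeInterval {
    fixations.filter { aoi.contains($0.centroid) }.map { $0.duration }.reduce(0, +)
}
```

Every step in this kind of pipeline assumes access to raw samples, a known sample rate, and a validated calibration, none of which the current iPhone feature provides.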

Why This Development Still Matters
Even without export capabilities, the introduction of eye tracking on iPhone signals something important:
Eye tracking is moving from niche technology to everyday interaction.
History tells us what happens next. When cameras became standard in phones, we didn’t stop needing DSLRs, but we did start thinking differently about photography. Accessibility features like this often become stepping stones to broader adoption and future developer access.
For research fields such as psychology, neuromarketing, UX, and human-computer interaction, this shift expands awareness and possibility.
Where iMotions Fits Into the Future
At iMotions, we see this as the beginning of a convergence:
- Consumer devices will drive familiarity.
- Professional tools will continue to deliver accuracy, validation, and multimodal integration.
- And eventually, hybrid workflows will emerge, from remote early-stage screening to controlled lab-grade studies.
If and when Apple or third-party frameworks open access to raw gaze data, platforms like iMotions, built to compare, synchronize, and analyze multimodal signals, will be ready.
Until then, iPhone eye tracking is a milestone worth watching: not because it replaces research tools, but because it normalizes the idea that our technology should understand how we look, interact, and respond.
In Summary
The iPhone now supports eye tracking, and that’s a significant step for mainstream adoption. But today, its functionality is designed for accessibility and interaction, not scientific measurement or data export.
As the field evolves, the future of behavioral research will depend not just on capturing signals, but on analyzing them rigorously, combining them across modalities, and transforming them into insight.
That’s where expertise, methodology, and purpose-built research ecosystems still matter.
